00:00:00.001 Started by upstream project "autotest-per-patch" build number 126210 00:00:00.001 originally caused by: 00:00:00.001 Started by user sys_sgci 00:00:00.015 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.016 The recommended git tool is: git 00:00:00.016 using credential 00000000-0000-0000-0000-000000000002 00:00:00.018 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.032 Fetching changes from the remote Git repository 00:00:00.035 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.055 Using shallow fetch with depth 1 00:00:00.055 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.055 > git --version # timeout=10 00:00:00.081 > git --version # 'git version 2.39.2' 00:00:00.081 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.111 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.111 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:02.225 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:02.236 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:02.246 Checking out Revision 7caca6989ac753a10259529aadac5754060382af (FETCH_HEAD) 00:00:02.246 > git config core.sparsecheckout # timeout=10 00:00:02.257 > git read-tree -mu HEAD # timeout=10 00:00:02.272 > git checkout -f 7caca6989ac753a10259529aadac5754060382af # timeout=5 00:00:02.291 Commit message: "jenkins/jjb-config: Purge centos leftovers" 00:00:02.291 > git rev-list --no-walk 7caca6989ac753a10259529aadac5754060382af # timeout=10 00:00:02.516 [Pipeline] Start of Pipeline 00:00:02.531 [Pipeline] library 00:00:02.533 Loading library shm_lib@master 00:00:02.534 Library shm_lib@master is cached. Copying from home. 00:00:02.549 [Pipeline] node 00:00:02.556 Running on CYP6 in /var/jenkins/workspace/crypto-phy-autotest 00:00:02.558 [Pipeline] { 00:00:02.567 [Pipeline] catchError 00:00:02.568 [Pipeline] { 00:00:02.583 [Pipeline] wrap 00:00:02.592 [Pipeline] { 00:00:02.601 [Pipeline] stage 00:00:02.602 [Pipeline] { (Prologue) 00:00:02.790 [Pipeline] sh 00:00:03.071 + logger -p user.info -t JENKINS-CI 00:00:03.090 [Pipeline] echo 00:00:03.092 Node: CYP6 00:00:03.099 [Pipeline] sh 00:00:03.399 [Pipeline] setCustomBuildProperty 00:00:03.408 [Pipeline] echo 00:00:03.409 Cleanup processes 00:00:03.415 [Pipeline] sh 00:00:03.699 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:03.699 2561716 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:03.716 [Pipeline] sh 00:00:04.007 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:04.007 ++ grep -v 'sudo pgrep' 00:00:04.007 ++ awk '{print $1}' 00:00:04.007 + sudo kill -9 00:00:04.007 + true 00:00:04.018 [Pipeline] cleanWs 00:00:04.026 [WS-CLEANUP] Deleting project workspace... 00:00:04.026 [WS-CLEANUP] Deferred wipeout is used... 
00:00:04.033 [WS-CLEANUP] done 00:00:04.036 [Pipeline] setCustomBuildProperty 00:00:04.047 [Pipeline] sh 00:00:04.325 + sudo git config --global --replace-all safe.directory '*' 00:00:04.391 [Pipeline] httpRequest 00:00:04.412 [Pipeline] echo 00:00:04.414 Sorcerer 10.211.164.101 is alive 00:00:04.425 [Pipeline] httpRequest 00:00:04.430 HttpMethod: GET 00:00:04.431 URL: http://10.211.164.101/packages/jbp_7caca6989ac753a10259529aadac5754060382af.tar.gz 00:00:04.431 Sending request to url: http://10.211.164.101/packages/jbp_7caca6989ac753a10259529aadac5754060382af.tar.gz 00:00:04.434 Response Code: HTTP/1.1 200 OK 00:00:04.434 Success: Status code 200 is in the accepted range: 200,404 00:00:04.435 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_7caca6989ac753a10259529aadac5754060382af.tar.gz 00:00:05.158 [Pipeline] sh 00:00:05.440 + tar --no-same-owner -xf jbp_7caca6989ac753a10259529aadac5754060382af.tar.gz 00:00:05.457 [Pipeline] httpRequest 00:00:05.480 [Pipeline] echo 00:00:05.481 Sorcerer 10.211.164.101 is alive 00:00:05.487 [Pipeline] httpRequest 00:00:05.492 HttpMethod: GET 00:00:05.493 URL: http://10.211.164.101/packages/spdk_248c547d03bd63d26c50240ccfd7f3cfc99bc650.tar.gz 00:00:05.493 Sending request to url: http://10.211.164.101/packages/spdk_248c547d03bd63d26c50240ccfd7f3cfc99bc650.tar.gz 00:00:05.505 Response Code: HTTP/1.1 200 OK 00:00:05.506 Success: Status code 200 is in the accepted range: 200,404 00:00:05.506 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_248c547d03bd63d26c50240ccfd7f3cfc99bc650.tar.gz 00:00:45.776 [Pipeline] sh 00:00:46.061 + tar --no-same-owner -xf spdk_248c547d03bd63d26c50240ccfd7f3cfc99bc650.tar.gz 00:00:49.403 [Pipeline] sh 00:00:49.692 + git -C spdk log --oneline -n5 00:00:49.692 248c547d0 nvmf/tcp: add option for selecting a sock impl 00:00:49.692 2d30d9f83 accel: introduce tasks in sequence limit 00:00:49.692 2728651ee accel: adjust task per ch define name 00:00:49.692 e7cce062d Examples/Perf: correct the calculation of total bandwidth 00:00:49.692 3b4b1d00c libvfio-user: bump MAX_DMA_REGIONS 00:00:49.706 [Pipeline] } 00:00:49.725 [Pipeline] // stage 00:00:49.736 [Pipeline] stage 00:00:49.739 [Pipeline] { (Prepare) 00:00:49.763 [Pipeline] writeFile 00:00:49.781 [Pipeline] sh 00:00:50.068 + logger -p user.info -t JENKINS-CI 00:00:50.086 [Pipeline] sh 00:00:50.374 + logger -p user.info -t JENKINS-CI 00:00:50.388 [Pipeline] sh 00:00:50.673 + cat autorun-spdk.conf 00:00:50.673 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:50.673 SPDK_TEST_BLOCKDEV=1 00:00:50.673 SPDK_TEST_ISAL=1 00:00:50.673 SPDK_TEST_CRYPTO=1 00:00:50.673 SPDK_TEST_REDUCE=1 00:00:50.673 SPDK_TEST_VBDEV_COMPRESS=1 00:00:50.673 SPDK_RUN_UBSAN=1 00:00:50.682 RUN_NIGHTLY=0 00:00:50.687 [Pipeline] readFile 00:00:50.719 [Pipeline] withEnv 00:00:50.721 [Pipeline] { 00:00:50.741 [Pipeline] sh 00:00:51.030 + set -ex 00:00:51.030 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]] 00:00:51.030 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:00:51.030 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:51.030 ++ SPDK_TEST_BLOCKDEV=1 00:00:51.030 ++ SPDK_TEST_ISAL=1 00:00:51.030 ++ SPDK_TEST_CRYPTO=1 00:00:51.030 ++ SPDK_TEST_REDUCE=1 00:00:51.030 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:00:51.030 ++ SPDK_RUN_UBSAN=1 00:00:51.030 ++ RUN_NIGHTLY=0 00:00:51.030 + case $SPDK_TEST_NVMF_NICS in 00:00:51.030 + DRIVERS= 00:00:51.030 + [[ -n '' ]] 00:00:51.030 + exit 0 00:00:51.040 [Pipeline] } 00:00:51.061 [Pipeline] // withEnv 00:00:51.067 [Pipeline] 
} 00:00:51.086 [Pipeline] // stage 00:00:51.098 [Pipeline] catchError 00:00:51.100 [Pipeline] { 00:00:51.118 [Pipeline] timeout 00:00:51.118 Timeout set to expire in 40 min 00:00:51.120 [Pipeline] { 00:00:51.137 [Pipeline] stage 00:00:51.139 [Pipeline] { (Tests) 00:00:51.157 [Pipeline] sh 00:00:51.446 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest 00:00:51.446 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest 00:00:51.446 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest 00:00:51.446 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]] 00:00:51.446 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:51.446 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output 00:00:51.446 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]] 00:00:51.446 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:00:51.446 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output 00:00:51.446 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:00:51.446 + [[ crypto-phy-autotest == pkgdep-* ]] 00:00:51.446 + cd /var/jenkins/workspace/crypto-phy-autotest 00:00:51.446 + source /etc/os-release 00:00:51.446 ++ NAME='Fedora Linux' 00:00:51.446 ++ VERSION='38 (Cloud Edition)' 00:00:51.446 ++ ID=fedora 00:00:51.446 ++ VERSION_ID=38 00:00:51.446 ++ VERSION_CODENAME= 00:00:51.446 ++ PLATFORM_ID=platform:f38 00:00:51.446 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)' 00:00:51.446 ++ ANSI_COLOR='0;38;2;60;110;180' 00:00:51.446 ++ LOGO=fedora-logo-icon 00:00:51.446 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38 00:00:51.446 ++ HOME_URL=https://fedoraproject.org/ 00:00:51.446 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/ 00:00:51.446 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:00:51.446 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:00:51.446 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:00:51.446 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38 00:00:51.446 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:00:51.446 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38 00:00:51.446 ++ SUPPORT_END=2024-05-14 00:00:51.446 ++ VARIANT='Cloud Edition' 00:00:51.446 ++ VARIANT_ID=cloud 00:00:51.446 + uname -a 00:00:51.446 Linux spdk-CYP-06 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux 00:00:51.446 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:00:54.761 Hugepages 00:00:54.761 node hugesize free / total 00:00:54.761 node0 1048576kB 0 / 0 00:00:54.761 node0 2048kB 0 / 0 00:00:54.761 node1 1048576kB 0 / 0 00:00:54.761 node1 2048kB 0 / 0 00:00:54.761 00:00:54.761 Type BDF Vendor Device NUMA Driver Device Block devices 00:00:55.022 I/OAT 0000:00:01.0 8086 0b00 0 ioatdma - - 00:00:55.022 I/OAT 0000:00:01.1 8086 0b00 0 ioatdma - - 00:00:55.022 I/OAT 0000:00:01.2 8086 0b00 0 ioatdma - - 00:00:55.022 I/OAT 0000:00:01.3 8086 0b00 0 ioatdma - - 00:00:55.022 I/OAT 0000:00:01.4 8086 0b00 0 ioatdma - - 00:00:55.022 I/OAT 0000:00:01.5 8086 0b00 0 ioatdma - - 00:00:55.022 I/OAT 0000:00:01.6 8086 0b00 0 ioatdma - - 00:00:55.022 I/OAT 0000:00:01.7 8086 0b00 0 ioatdma - - 00:00:55.022 NVMe 0000:65:00.0 8086 0a54 0 nvme nvme0 nvme0n1 00:00:55.022 I/OAT 0000:80:01.0 8086 0b00 1 ioatdma - - 00:00:55.022 I/OAT 0000:80:01.1 8086 0b00 1 ioatdma - - 00:00:55.022 I/OAT 0000:80:01.2 8086 0b00 1 ioatdma - - 00:00:55.022 I/OAT 0000:80:01.3 8086 0b00 1 ioatdma - - 00:00:55.022 I/OAT 0000:80:01.4 8086 0b00 1 ioatdma - - 00:00:55.022 I/OAT 0000:80:01.5 8086 0b00 1 
ioatdma - - 00:00:55.022 I/OAT 0000:80:01.6 8086 0b00 1 ioatdma - - 00:00:55.022 I/OAT 0000:80:01.7 8086 0b00 1 ioatdma - - 00:00:55.022 + rm -f /tmp/spdk-ld-path 00:00:55.022 + source autorun-spdk.conf 00:00:55.022 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:55.022 ++ SPDK_TEST_BLOCKDEV=1 00:00:55.022 ++ SPDK_TEST_ISAL=1 00:00:55.022 ++ SPDK_TEST_CRYPTO=1 00:00:55.022 ++ SPDK_TEST_REDUCE=1 00:00:55.022 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:00:55.022 ++ SPDK_RUN_UBSAN=1 00:00:55.022 ++ RUN_NIGHTLY=0 00:00:55.022 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:00:55.022 + [[ -n '' ]] 00:00:55.022 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:55.022 + for M in /var/spdk/build-*-manifest.txt 00:00:55.022 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:00:55.022 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:00:55.022 + for M in /var/spdk/build-*-manifest.txt 00:00:55.022 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:00:55.022 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:00:55.022 ++ uname 00:00:55.022 + [[ Linux == \L\i\n\u\x ]] 00:00:55.022 + sudo dmesg -T 00:00:55.022 + sudo dmesg --clear 00:00:55.284 + dmesg_pid=2562801 00:00:55.284 + [[ Fedora Linux == FreeBSD ]] 00:00:55.284 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:00:55.284 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:00:55.284 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:00:55.284 + [[ -x /usr/src/fio-static/fio ]] 00:00:55.284 + export FIO_BIN=/usr/src/fio-static/fio 00:00:55.284 + FIO_BIN=/usr/src/fio-static/fio 00:00:55.284 + sudo dmesg -Tw 00:00:55.284 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:00:55.284 + [[ ! 
-v VFIO_QEMU_BIN ]] 00:00:55.284 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:00:55.284 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:00:55.284 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:00:55.284 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:00:55.284 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:00:55.284 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:00:55.284 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:00:55.284 Test configuration: 00:00:55.284 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:55.284 SPDK_TEST_BLOCKDEV=1 00:00:55.284 SPDK_TEST_ISAL=1 00:00:55.284 SPDK_TEST_CRYPTO=1 00:00:55.284 SPDK_TEST_REDUCE=1 00:00:55.284 SPDK_TEST_VBDEV_COMPRESS=1 00:00:55.284 SPDK_RUN_UBSAN=1 00:00:55.284 RUN_NIGHTLY=0 17:13:06 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:00:55.284 17:13:06 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:00:55.284 17:13:06 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:00:55.284 17:13:06 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:00:55.284 17:13:06 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:55.284 17:13:06 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:55.284 17:13:06 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:55.284 17:13:06 -- paths/export.sh@5 -- $ export PATH 00:00:55.284 17:13:06 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:55.284 17:13:06 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:00:55.284 17:13:06 -- common/autobuild_common.sh@444 -- $ date +%s 00:00:55.284 17:13:06 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1721056386.XXXXXX 00:00:55.284 17:13:06 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1721056386.muDfzG 00:00:55.284 17:13:06 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]] 00:00:55.284 17:13:06 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']' 
00:00:55.284 17:13:06 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:00:55.284 17:13:06 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:00:55.284 17:13:06 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:00:55.284 17:13:06 -- common/autobuild_common.sh@460 -- $ get_config_params 00:00:55.284 17:13:06 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:00:55.284 17:13:06 -- common/autotest_common.sh@10 -- $ set +x 00:00:55.284 17:13:06 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:00:55.284 17:13:06 -- common/autobuild_common.sh@462 -- $ start_monitor_resources 00:00:55.284 17:13:06 -- pm/common@17 -- $ local monitor 00:00:55.284 17:13:06 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:55.284 17:13:06 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:55.284 17:13:06 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:55.284 17:13:06 -- pm/common@21 -- $ date +%s 00:00:55.284 17:13:06 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:55.284 17:13:06 -- pm/common@25 -- $ sleep 1 00:00:55.284 17:13:06 -- pm/common@21 -- $ date +%s 00:00:55.284 17:13:06 -- pm/common@21 -- $ date +%s 00:00:55.284 17:13:06 -- pm/common@21 -- $ date +%s 00:00:55.284 17:13:06 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721056386 00:00:55.284 17:13:06 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721056386 00:00:55.284 17:13:06 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721056386 00:00:55.284 17:13:06 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721056386 00:00:55.284 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721056386_collect-vmstat.pm.log 00:00:55.284 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721056386_collect-cpu-load.pm.log 00:00:55.284 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721056386_collect-cpu-temp.pm.log 00:00:55.284 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721056386_collect-bmc-pm.bmc.pm.log 00:00:56.227 17:13:07 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT 00:00:56.227 17:13:07 -- spdk/autobuild.sh@11 -- $ 
SPDK_TEST_AUTOBUILD= 00:00:56.227 17:13:07 -- spdk/autobuild.sh@12 -- $ umask 022 00:00:56.227 17:13:07 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:56.227 17:13:07 -- spdk/autobuild.sh@16 -- $ date -u 00:00:56.227 Mon Jul 15 03:13:07 PM UTC 2024 00:00:56.227 17:13:07 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:00:56.488 v24.09-pre-208-g248c547d0 00:00:56.488 17:13:07 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:00:56.488 17:13:07 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:00:56.488 17:13:07 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:00:56.488 17:13:07 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']' 00:00:56.488 17:13:07 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:00:56.488 17:13:07 -- common/autotest_common.sh@10 -- $ set +x 00:00:56.488 ************************************ 00:00:56.488 START TEST ubsan 00:00:56.488 ************************************ 00:00:56.488 17:13:07 ubsan -- common/autotest_common.sh@1123 -- $ echo 'using ubsan' 00:00:56.488 using ubsan 00:00:56.488 00:00:56.488 real 0m0.001s 00:00:56.488 user 0m0.001s 00:00:56.488 sys 0m0.000s 00:00:56.488 17:13:07 ubsan -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:00:56.488 17:13:07 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:00:56.488 ************************************ 00:00:56.488 END TEST ubsan 00:00:56.488 ************************************ 00:00:56.488 17:13:07 -- common/autotest_common.sh@1142 -- $ return 0 00:00:56.488 17:13:07 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']' 00:00:56.488 17:13:07 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:00:56.488 17:13:07 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:00:56.488 17:13:07 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]] 00:00:56.488 17:13:07 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:00:56.488 17:13:07 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:00:56.488 17:13:07 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:00:56.488 17:13:07 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]] 00:00:56.488 17:13:07 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared 00:00:56.488 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:00:56.488 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:00:57.059 Using 'verbs' RDMA provider 00:01:12.960 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done. 00:01:25.192 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:01:25.453 Creating mk/config.mk...done. 00:01:25.453 Creating mk/cc.flags.mk...done. 00:01:25.453 Type 'make' to build. 
00:01:25.453 17:13:36 -- spdk/autobuild.sh@69 -- $ run_test make make -j128 00:01:25.453 17:13:36 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']' 00:01:25.453 17:13:36 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:01:25.453 17:13:36 -- common/autotest_common.sh@10 -- $ set +x 00:01:25.714 ************************************ 00:01:25.714 START TEST make 00:01:25.714 ************************************ 00:01:25.714 17:13:36 make -- common/autotest_common.sh@1123 -- $ make -j128 00:01:25.974 make[1]: Nothing to be done for 'all'. 00:01:58.084 The Meson build system 00:01:58.084 Version: 1.3.1 00:01:58.084 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk 00:01:58.084 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp 00:01:58.084 Build type: native build 00:01:58.084 Program cat found: YES (/usr/bin/cat) 00:01:58.084 Project name: DPDK 00:01:58.084 Project version: 24.03.0 00:01:58.084 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)") 00:01:58.084 C linker for the host machine: cc ld.bfd 2.39-16 00:01:58.084 Host machine cpu family: x86_64 00:01:58.084 Host machine cpu: x86_64 00:01:58.084 Message: ## Building in Developer Mode ## 00:01:58.084 Program pkg-config found: YES (/usr/bin/pkg-config) 00:01:58.084 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh) 00:01:58.084 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh) 00:01:58.084 Program python3 found: YES (/usr/bin/python3) 00:01:58.084 Program cat found: YES (/usr/bin/cat) 00:01:58.084 Compiler for C supports arguments -march=native: YES 00:01:58.084 Checking for size of "void *" : 8 00:01:58.084 Checking for size of "void *" : 8 (cached) 00:01:58.084 Compiler for C supports link arguments -Wl,--undefined-version: NO 00:01:58.084 Library m found: YES 00:01:58.084 Library numa found: YES 00:01:58.084 Has header "numaif.h" : YES 00:01:58.084 Library fdt found: NO 00:01:58.084 Library execinfo found: NO 00:01:58.084 Has header "execinfo.h" : YES 00:01:58.084 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:01:58.084 Run-time dependency libarchive found: NO (tried pkgconfig) 00:01:58.084 Run-time dependency libbsd found: NO (tried pkgconfig) 00:01:58.084 Run-time dependency jansson found: NO (tried pkgconfig) 00:01:58.084 Run-time dependency openssl found: YES 3.0.9 00:01:58.084 Run-time dependency libpcap found: YES 1.10.4 00:01:58.084 Has header "pcap.h" with dependency libpcap: YES 00:01:58.084 Compiler for C supports arguments -Wcast-qual: YES 00:01:58.084 Compiler for C supports arguments -Wdeprecated: YES 00:01:58.084 Compiler for C supports arguments -Wformat: YES 00:01:58.084 Compiler for C supports arguments -Wformat-nonliteral: NO 00:01:58.084 Compiler for C supports arguments -Wformat-security: NO 00:01:58.084 Compiler for C supports arguments -Wmissing-declarations: YES 00:01:58.084 Compiler for C supports arguments -Wmissing-prototypes: YES 00:01:58.084 Compiler for C supports arguments -Wnested-externs: YES 00:01:58.084 Compiler for C supports arguments -Wold-style-definition: YES 00:01:58.084 Compiler for C supports arguments -Wpointer-arith: YES 00:01:58.084 Compiler for C supports arguments -Wsign-compare: YES 00:01:58.084 Compiler for C supports arguments -Wstrict-prototypes: YES 00:01:58.084 Compiler for C supports arguments -Wundef: YES 00:01:58.084 Compiler for C 
supports arguments -Wwrite-strings: YES 00:01:58.084 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:01:58.084 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:01:58.084 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:01:58.084 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:01:58.084 Program objdump found: YES (/usr/bin/objdump) 00:01:58.084 Compiler for C supports arguments -mavx512f: YES 00:01:58.084 Checking if "AVX512 checking" compiles: YES 00:01:58.084 Fetching value of define "__SSE4_2__" : 1 00:01:58.084 Fetching value of define "__AES__" : 1 00:01:58.084 Fetching value of define "__AVX__" : 1 00:01:58.084 Fetching value of define "__AVX2__" : 1 00:01:58.084 Fetching value of define "__AVX512BW__" : 1 00:01:58.084 Fetching value of define "__AVX512CD__" : 1 00:01:58.084 Fetching value of define "__AVX512DQ__" : 1 00:01:58.084 Fetching value of define "__AVX512F__" : 1 00:01:58.084 Fetching value of define "__AVX512VL__" : 1 00:01:58.084 Fetching value of define "__PCLMUL__" : 1 00:01:58.084 Fetching value of define "__RDRND__" : 1 00:01:58.084 Fetching value of define "__RDSEED__" : 1 00:01:58.084 Fetching value of define "__VPCLMULQDQ__" : 1 00:01:58.084 Fetching value of define "__znver1__" : (undefined) 00:01:58.084 Fetching value of define "__znver2__" : (undefined) 00:01:58.084 Fetching value of define "__znver3__" : (undefined) 00:01:58.084 Fetching value of define "__znver4__" : (undefined) 00:01:58.084 Compiler for C supports arguments -Wno-format-truncation: YES 00:01:58.084 Message: lib/log: Defining dependency "log" 00:01:58.084 Message: lib/kvargs: Defining dependency "kvargs" 00:01:58.084 Message: lib/telemetry: Defining dependency "telemetry" 00:01:58.084 Checking for function "getentropy" : NO 00:01:58.084 Message: lib/eal: Defining dependency "eal" 00:01:58.084 Message: lib/ring: Defining dependency "ring" 00:01:58.085 Message: lib/rcu: Defining dependency "rcu" 00:01:58.085 Message: lib/mempool: Defining dependency "mempool" 00:01:58.085 Message: lib/mbuf: Defining dependency "mbuf" 00:01:58.085 Fetching value of define "__PCLMUL__" : 1 (cached) 00:01:58.085 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:58.085 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:58.085 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:58.085 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:58.085 Fetching value of define "__VPCLMULQDQ__" : 1 (cached) 00:01:58.085 Compiler for C supports arguments -mpclmul: YES 00:01:58.085 Compiler for C supports arguments -maes: YES 00:01:58.085 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:58.085 Compiler for C supports arguments -mavx512bw: YES 00:01:58.085 Compiler for C supports arguments -mavx512dq: YES 00:01:58.085 Compiler for C supports arguments -mavx512vl: YES 00:01:58.085 Compiler for C supports arguments -mvpclmulqdq: YES 00:01:58.085 Compiler for C supports arguments -mavx2: YES 00:01:58.085 Compiler for C supports arguments -mavx: YES 00:01:58.085 Message: lib/net: Defining dependency "net" 00:01:58.085 Message: lib/meter: Defining dependency "meter" 00:01:58.085 Message: lib/ethdev: Defining dependency "ethdev" 00:01:58.085 Message: lib/pci: Defining dependency "pci" 00:01:58.085 Message: lib/cmdline: Defining dependency "cmdline" 00:01:58.085 Message: lib/hash: Defining dependency "hash" 00:01:58.085 Message: lib/timer: Defining dependency "timer" 00:01:58.085 Message: lib/compressdev: 
Defining dependency "compressdev" 00:01:58.085 Message: lib/cryptodev: Defining dependency "cryptodev" 00:01:58.085 Message: lib/dmadev: Defining dependency "dmadev" 00:01:58.085 Compiler for C supports arguments -Wno-cast-qual: YES 00:01:58.085 Message: lib/power: Defining dependency "power" 00:01:58.085 Message: lib/reorder: Defining dependency "reorder" 00:01:58.085 Message: lib/security: Defining dependency "security" 00:01:58.085 Has header "linux/userfaultfd.h" : YES 00:01:58.085 Has header "linux/vduse.h" : YES 00:01:58.085 Message: lib/vhost: Defining dependency "vhost" 00:01:58.085 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:01:58.085 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary" 00:01:58.085 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:01:58.085 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:01:58.085 Compiler for C supports arguments -std=c11: YES 00:01:58.085 Compiler for C supports arguments -Wno-strict-prototypes: YES 00:01:58.085 Compiler for C supports arguments -D_BSD_SOURCE: YES 00:01:58.085 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES 00:01:58.085 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES 00:01:58.085 Run-time dependency libmlx5 found: YES 1.24.44.0 00:01:58.085 Run-time dependency libibverbs found: YES 1.14.44.0 00:01:58.085 Library mtcr_ul found: NO 00:01:58.085 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES 00:01:58.085 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES 00:01:58.085 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES 00:01:58.085 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES 00:01:58.085 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES 00:01:58.085 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES 00:01:58.085 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES 00:01:58.085 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES 00:01:58.085 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES 00:01:58.085 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES 00:01:58.085 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES 00:01:58.085 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO 00:01:58.085 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO 00:01:58.085 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES 00:01:58.085 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES 00:01:58.085 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES 00:01:58.347 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: 
YES 00:01:58.348 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO 00:01:58.348 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO 00:01:58.348 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO 00:01:58.348 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO 00:01:58.348 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header 
"infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:01:58.348 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:01:58.348 Configuring mlx5_autoconf.h using configuration 00:01:58.348 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:01:58.348 Run-time dependency libcrypto found: YES 3.0.9 00:01:58.348 Library IPSec_MB found: YES 00:01:58.348 Fetching value of define "IMB_VERSION_STR" : "1.5.0" 00:01:58.348 Message: drivers/common/qat: Defining dependency "common_qat" 00:01:58.348 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:01:58.348 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:01:58.348 Library IPSec_MB found: YES 00:01:58.348 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached) 00:01:58.348 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:01:58.348 Compiler for C supports arguments -std=c11: YES (cached) 00:01:58.348 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:01:58.348 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:01:58.348 
Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:01:58.348 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:01:58.348 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:01:58.348 Run-time dependency libisal found: NO (tried pkgconfig) 00:01:58.348 Library libisal found: NO 00:01:58.348 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:01:58.348 Compiler for C supports arguments -std=c11: YES (cached) 00:01:58.348 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:01:58.348 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:01:58.348 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:01:58.348 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:01:58.348 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:01:58.348 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:01:58.348 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:01:58.348 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:01:58.348 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:01:58.348 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:01:58.348 Program doxygen found: YES (/usr/bin/doxygen) 00:01:58.348 Configuring doxy-api-html.conf using configuration 00:01:58.348 Configuring doxy-api-man.conf using configuration 00:01:58.348 Program mandb found: YES (/usr/bin/mandb) 00:01:58.348 Program sphinx-build found: NO 00:01:58.348 Configuring rte_build_config.h using configuration 00:01:58.348 Message: 00:01:58.348 ================= 00:01:58.348 Applications Enabled 00:01:58.348 ================= 00:01:58.348 00:01:58.348 apps: 00:01:58.348 00:01:58.348 00:01:58.348 Message: 00:01:58.348 ================= 00:01:58.348 Libraries Enabled 00:01:58.348 ================= 00:01:58.348 00:01:58.348 libs: 00:01:58.348 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:01:58.348 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:01:58.348 cryptodev, dmadev, power, reorder, security, vhost, 00:01:58.348 00:01:58.348 Message: 00:01:58.348 =============== 00:01:58.348 Drivers Enabled 00:01:58.348 =============== 00:01:58.348 00:01:58.348 common: 00:01:58.348 mlx5, qat, 00:01:58.348 bus: 00:01:58.348 auxiliary, pci, vdev, 00:01:58.348 mempool: 00:01:58.348 ring, 00:01:58.348 dma: 00:01:58.348 00:01:58.348 net: 00:01:58.348 00:01:58.348 crypto: 00:01:58.348 ipsec_mb, mlx5, 00:01:58.348 compress: 00:01:58.348 isal, mlx5, 00:01:58.348 vdpa: 00:01:58.348 00:01:58.348 00:01:58.348 Message: 00:01:58.348 ================= 00:01:58.348 Content Skipped 00:01:58.348 ================= 00:01:58.348 00:01:58.348 apps: 00:01:58.348 dumpcap: explicitly disabled via build config 00:01:58.348 graph: explicitly disabled via build config 00:01:58.348 pdump: explicitly disabled via build config 00:01:58.348 proc-info: explicitly disabled via build config 00:01:58.348 test-acl: explicitly disabled via build config 00:01:58.348 test-bbdev: explicitly disabled via build config 00:01:58.348 test-cmdline: explicitly disabled via build config 00:01:58.348 test-compress-perf: explicitly disabled via build config 00:01:58.348 test-crypto-perf: explicitly disabled via build config 00:01:58.348 test-dma-perf: explicitly disabled via build config 00:01:58.348 test-eventdev: explicitly disabled via build config 00:01:58.348 test-fib: explicitly disabled via 
build config 00:01:58.348 test-flow-perf: explicitly disabled via build config 00:01:58.348 test-gpudev: explicitly disabled via build config 00:01:58.348 test-mldev: explicitly disabled via build config 00:01:58.348 test-pipeline: explicitly disabled via build config 00:01:58.348 test-pmd: explicitly disabled via build config 00:01:58.348 test-regex: explicitly disabled via build config 00:01:58.348 test-sad: explicitly disabled via build config 00:01:58.348 test-security-perf: explicitly disabled via build config 00:01:58.349 00:01:58.349 libs: 00:01:58.349 argparse: explicitly disabled via build config 00:01:58.349 metrics: explicitly disabled via build config 00:01:58.349 acl: explicitly disabled via build config 00:01:58.349 bbdev: explicitly disabled via build config 00:01:58.349 bitratestats: explicitly disabled via build config 00:01:58.349 bpf: explicitly disabled via build config 00:01:58.349 cfgfile: explicitly disabled via build config 00:01:58.349 distributor: explicitly disabled via build config 00:01:58.349 efd: explicitly disabled via build config 00:01:58.349 eventdev: explicitly disabled via build config 00:01:58.349 dispatcher: explicitly disabled via build config 00:01:58.349 gpudev: explicitly disabled via build config 00:01:58.349 gro: explicitly disabled via build config 00:01:58.349 gso: explicitly disabled via build config 00:01:58.349 ip_frag: explicitly disabled via build config 00:01:58.349 jobstats: explicitly disabled via build config 00:01:58.349 latencystats: explicitly disabled via build config 00:01:58.349 lpm: explicitly disabled via build config 00:01:58.349 member: explicitly disabled via build config 00:01:58.349 pcapng: explicitly disabled via build config 00:01:58.349 rawdev: explicitly disabled via build config 00:01:58.349 regexdev: explicitly disabled via build config 00:01:58.349 mldev: explicitly disabled via build config 00:01:58.349 rib: explicitly disabled via build config 00:01:58.349 sched: explicitly disabled via build config 00:01:58.349 stack: explicitly disabled via build config 00:01:58.349 ipsec: explicitly disabled via build config 00:01:58.349 pdcp: explicitly disabled via build config 00:01:58.349 fib: explicitly disabled via build config 00:01:58.349 port: explicitly disabled via build config 00:01:58.349 pdump: explicitly disabled via build config 00:01:58.349 table: explicitly disabled via build config 00:01:58.349 pipeline: explicitly disabled via build config 00:01:58.349 graph: explicitly disabled via build config 00:01:58.349 node: explicitly disabled via build config 00:01:58.349 00:01:58.349 drivers: 00:01:58.349 common/cpt: not in enabled drivers build config 00:01:58.349 common/dpaax: not in enabled drivers build config 00:01:58.349 common/iavf: not in enabled drivers build config 00:01:58.349 common/idpf: not in enabled drivers build config 00:01:58.349 common/ionic: not in enabled drivers build config 00:01:58.349 common/mvep: not in enabled drivers build config 00:01:58.349 common/octeontx: not in enabled drivers build config 00:01:58.349 bus/cdx: not in enabled drivers build config 00:01:58.349 bus/dpaa: not in enabled drivers build config 00:01:58.349 bus/fslmc: not in enabled drivers build config 00:01:58.349 bus/ifpga: not in enabled drivers build config 00:01:58.349 bus/platform: not in enabled drivers build config 00:01:58.349 bus/uacce: not in enabled drivers build config 00:01:58.349 bus/vmbus: not in enabled drivers build config 00:01:58.349 common/cnxk: not in enabled drivers build config 00:01:58.349 
common/nfp: not in enabled drivers build config 00:01:58.349 common/nitrox: not in enabled drivers build config 00:01:58.349 common/sfc_efx: not in enabled drivers build config 00:01:58.349 mempool/bucket: not in enabled drivers build config 00:01:58.349 mempool/cnxk: not in enabled drivers build config 00:01:58.349 mempool/dpaa: not in enabled drivers build config 00:01:58.349 mempool/dpaa2: not in enabled drivers build config 00:01:58.349 mempool/octeontx: not in enabled drivers build config 00:01:58.349 mempool/stack: not in enabled drivers build config 00:01:58.349 dma/cnxk: not in enabled drivers build config 00:01:58.349 dma/dpaa: not in enabled drivers build config 00:01:58.349 dma/dpaa2: not in enabled drivers build config 00:01:58.349 dma/hisilicon: not in enabled drivers build config 00:01:58.349 dma/idxd: not in enabled drivers build config 00:01:58.349 dma/ioat: not in enabled drivers build config 00:01:58.349 dma/skeleton: not in enabled drivers build config 00:01:58.349 net/af_packet: not in enabled drivers build config 00:01:58.349 net/af_xdp: not in enabled drivers build config 00:01:58.349 net/ark: not in enabled drivers build config 00:01:58.349 net/atlantic: not in enabled drivers build config 00:01:58.349 net/avp: not in enabled drivers build config 00:01:58.349 net/axgbe: not in enabled drivers build config 00:01:58.349 net/bnx2x: not in enabled drivers build config 00:01:58.349 net/bnxt: not in enabled drivers build config 00:01:58.349 net/bonding: not in enabled drivers build config 00:01:58.349 net/cnxk: not in enabled drivers build config 00:01:58.349 net/cpfl: not in enabled drivers build config 00:01:58.349 net/cxgbe: not in enabled drivers build config 00:01:58.349 net/dpaa: not in enabled drivers build config 00:01:58.349 net/dpaa2: not in enabled drivers build config 00:01:58.349 net/e1000: not in enabled drivers build config 00:01:58.349 net/ena: not in enabled drivers build config 00:01:58.349 net/enetc: not in enabled drivers build config 00:01:58.349 net/enetfec: not in enabled drivers build config 00:01:58.349 net/enic: not in enabled drivers build config 00:01:58.349 net/failsafe: not in enabled drivers build config 00:01:58.349 net/fm10k: not in enabled drivers build config 00:01:58.349 net/gve: not in enabled drivers build config 00:01:58.349 net/hinic: not in enabled drivers build config 00:01:58.349 net/hns3: not in enabled drivers build config 00:01:58.349 net/i40e: not in enabled drivers build config 00:01:58.349 net/iavf: not in enabled drivers build config 00:01:58.349 net/ice: not in enabled drivers build config 00:01:58.349 net/idpf: not in enabled drivers build config 00:01:58.349 net/igc: not in enabled drivers build config 00:01:58.349 net/ionic: not in enabled drivers build config 00:01:58.349 net/ipn3ke: not in enabled drivers build config 00:01:58.349 net/ixgbe: not in enabled drivers build config 00:01:58.349 net/mana: not in enabled drivers build config 00:01:58.349 net/memif: not in enabled drivers build config 00:01:58.349 net/mlx4: not in enabled drivers build config 00:01:58.349 net/mlx5: not in enabled drivers build config 00:01:58.349 net/mvneta: not in enabled drivers build config 00:01:58.349 net/mvpp2: not in enabled drivers build config 00:01:58.349 net/netvsc: not in enabled drivers build config 00:01:58.349 net/nfb: not in enabled drivers build config 00:01:58.349 net/nfp: not in enabled drivers build config 00:01:58.349 net/ngbe: not in enabled drivers build config 00:01:58.349 net/null: not in enabled drivers build config 
00:01:58.349 net/octeontx: not in enabled drivers build config 00:01:58.349 net/octeon_ep: not in enabled drivers build config 00:01:58.349 net/pcap: not in enabled drivers build config 00:01:58.349 net/pfe: not in enabled drivers build config 00:01:58.349 net/qede: not in enabled drivers build config 00:01:58.349 net/ring: not in enabled drivers build config 00:01:58.349 net/sfc: not in enabled drivers build config 00:01:58.349 net/softnic: not in enabled drivers build config 00:01:58.349 net/tap: not in enabled drivers build config 00:01:58.349 net/thunderx: not in enabled drivers build config 00:01:58.349 net/txgbe: not in enabled drivers build config 00:01:58.349 net/vdev_netvsc: not in enabled drivers build config 00:01:58.349 net/vhost: not in enabled drivers build config 00:01:58.349 net/virtio: not in enabled drivers build config 00:01:58.349 net/vmxnet3: not in enabled drivers build config 00:01:58.349 raw/*: missing internal dependency, "rawdev" 00:01:58.349 crypto/armv8: not in enabled drivers build config 00:01:58.349 crypto/bcmfs: not in enabled drivers build config 00:01:58.349 crypto/caam_jr: not in enabled drivers build config 00:01:58.349 crypto/ccp: not in enabled drivers build config 00:01:58.349 crypto/cnxk: not in enabled drivers build config 00:01:58.349 crypto/dpaa_sec: not in enabled drivers build config 00:01:58.349 crypto/dpaa2_sec: not in enabled drivers build config 00:01:58.349 crypto/mvsam: not in enabled drivers build config 00:01:58.349 crypto/nitrox: not in enabled drivers build config 00:01:58.349 crypto/null: not in enabled drivers build config 00:01:58.349 crypto/octeontx: not in enabled drivers build config 00:01:58.349 crypto/openssl: not in enabled drivers build config 00:01:58.349 crypto/scheduler: not in enabled drivers build config 00:01:58.349 crypto/uadk: not in enabled drivers build config 00:01:58.349 crypto/virtio: not in enabled drivers build config 00:01:58.349 compress/nitrox: not in enabled drivers build config 00:01:58.349 compress/octeontx: not in enabled drivers build config 00:01:58.349 compress/zlib: not in enabled drivers build config 00:01:58.349 regex/*: missing internal dependency, "regexdev" 00:01:58.349 ml/*: missing internal dependency, "mldev" 00:01:58.349 vdpa/ifc: not in enabled drivers build config 00:01:58.349 vdpa/mlx5: not in enabled drivers build config 00:01:58.349 vdpa/nfp: not in enabled drivers build config 00:01:58.349 vdpa/sfc: not in enabled drivers build config 00:01:58.349 event/*: missing internal dependency, "eventdev" 00:01:58.349 baseband/*: missing internal dependency, "bbdev" 00:01:58.349 gpu/*: missing internal dependency, "gpudev" 00:01:58.349 00:01:58.349 00:01:58.920 Build targets in project: 114 00:01:58.920 00:01:58.920 DPDK 24.03.0 00:01:58.920 00:01:58.920 User defined options 00:01:58.920 buildtype : debug 00:01:58.920 default_library : shared 00:01:58.920 libdir : lib 00:01:58.920 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:01:58.920 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds -I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror 00:01:58.920 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal 00:01:58.920 cpu_instruction_set: native 00:01:58.920 disable_apps : 
proc-info,test-fib,graph,test-dma-perf,test-mldev,test,test-regex,dumpcap,test-cmdline,test-acl,test-pipeline,test-flow-perf,pdump,test-sad,test-gpudev,test-security-perf,test-crypto-perf,test-bbdev,test-pmd,test-compress-perf,test-eventdev 00:01:58.920 disable_libs : bbdev,fib,dispatcher,distributor,bpf,latencystats,graph,mldev,efd,eventdev,gso,gpudev,acl,pipeline,stack,jobstats,ipsec,argparse,rib,pdcp,table,pdump,cfgfile,gro,pcapng,bitratestats,ip_frag,member,sched,node,port,metrics,lpm,regexdev,rawdev 00:01:58.920 enable_docs : false 00:01:58.920 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5 00:01:58.920 enable_kmods : false 00:01:58.920 max_lcores : 128 00:01:58.920 tests : false 00:01:58.920 00:01:58.920 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:59.497 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp' 00:01:59.771 [1/377] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:01:59.771 [2/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:01:59.771 [3/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:01:59.771 [4/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:01:59.771 [5/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:01:59.771 [6/377] Linking static target lib/librte_kvargs.a 00:01:59.771 [7/377] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:01:59.771 [8/377] Compiling C object lib/librte_log.a.p/log_log.c.o 00:01:59.771 [9/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:01:59.771 [10/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:01:59.771 [11/377] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:01:59.771 [12/377] Linking static target lib/librte_log.a 00:01:59.771 [13/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:01:59.771 [14/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:01:59.771 [15/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:01:59.771 [16/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:00.040 [17/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:00.040 [18/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:00.040 [19/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:00.040 [20/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:00.040 [21/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:00.040 [22/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:00.040 [23/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:00.040 [24/377] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:02:00.040 [25/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:00.040 [26/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:00.040 [27/377] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:00.040 [28/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:00.040 [29/377] Compiling C object 
lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:00.040 [30/377] Linking static target lib/librte_pci.a 00:02:00.040 [31/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:00.305 [32/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:00.305 [33/377] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:00.305 [34/377] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:00.305 [35/377] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:00.566 [36/377] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:00.566 [37/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:00.566 [38/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:00.567 [39/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:00.567 [40/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:00.567 [41/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:00.567 [42/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:00.567 [43/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:00.567 [44/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:00.567 [45/377] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:00.567 [46/377] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:00.567 [47/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:00.567 [48/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:00.567 [49/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:00.567 [50/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:00.567 [51/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:00.567 [52/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:00.567 [53/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:00.567 [54/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:00.567 [55/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:00.567 [56/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:00.567 [57/377] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:00.567 [58/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:00.567 [59/377] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:00.567 [60/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:00.567 [61/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:00.567 [62/377] Compiling C object lib/librte_net.a.p/net_net_crc_avx512.c.o 00:02:00.567 [63/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:00.567 [64/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:00.567 [65/377] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:00.567 [66/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:00.837 [67/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:00.837 [68/377] Linking static target lib/librte_telemetry.a 00:02:00.837 [69/377] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:00.837 [70/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:00.837 [71/377] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:00.837 [72/377] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:02:00.837 [73/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:00.837 [74/377] Linking static target lib/librte_meter.a 00:02:00.837 [75/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:00.837 [76/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:00.837 [77/377] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:00.837 [78/377] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:02:00.837 [79/377] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:00.837 [80/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:00.837 [81/377] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:00.837 [82/377] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:02:00.837 [83/377] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:00.837 [84/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:00.837 [85/377] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:00.837 [86/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:00.837 [87/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:00.837 [88/377] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:00.837 [89/377] Linking static target lib/librte_ring.a 00:02:00.837 [90/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:00.837 [91/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:00.837 [92/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:00.837 [93/377] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:00.837 [94/377] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:00.837 [95/377] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:00.837 [96/377] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:00.837 [97/377] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:00.837 [98/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:00.837 [99/377] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:00.837 [100/377] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:00.837 [101/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:00.837 [102/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:00.837 [103/377] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:00.837 [104/377] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:00.837 [105/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:00.837 [106/377] Linking static target lib/librte_timer.a 00:02:00.837 [107/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:00.837 [108/377] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:02:00.837 [109/377] Compiling C object 
lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:01.102 [110/377] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.102 [111/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:01.102 [112/377] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:02:01.102 [113/377] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:01.102 [114/377] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:01.102 [115/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:01.102 [116/377] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:02:01.102 [117/377] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:01.102 [118/377] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:01.102 [119/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:02:01.102 [120/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:01.102 [121/377] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:01.102 [122/377] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:01.102 [123/377] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:01.102 [124/377] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:01.102 [125/377] Linking static target lib/librte_dmadev.a 00:02:01.102 [126/377] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:01.102 [127/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:02:01.102 [128/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:01.102 [129/377] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:01.102 [130/377] Linking static target lib/librte_mempool.a 00:02:01.102 [131/377] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:01.102 [132/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:01.102 [133/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:01.102 [134/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:01.102 [135/377] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:01.102 [136/377] Linking static target lib/librte_rcu.a 00:02:01.102 [137/377] Linking static target lib/librte_net.a 00:02:01.102 [138/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:01.102 [139/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:02:01.102 [140/377] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:01.102 [141/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:01.102 [142/377] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:01.102 [143/377] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:01.102 [144/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:01.102 [145/377] Linking static target lib/librte_reorder.a 00:02:01.360 [146/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:01.360 [147/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:01.360 [148/377] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:02:01.360 [149/377] Compiling C object 
drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:02:01.360 [150/377] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:02:01.360 [151/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:01.360 [152/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:02:01.360 [153/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:01.360 [154/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:01.360 [155/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:01.360 [156/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:01.360 [157/377] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:01.360 [158/377] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.360 [159/377] Linking static target lib/librte_mbuf.a 00:02:01.360 [160/377] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:01.360 [161/377] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:01.360 [162/377] Linking static target lib/librte_security.a 00:02:01.360 [163/377] Linking static target lib/librte_power.a 00:02:01.360 [164/377] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:01.360 [165/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:01.360 [166/377] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.360 [167/377] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.360 [168/377] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:01.360 [169/377] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:01.360 [170/377] Linking static target lib/librte_cryptodev.a 00:02:01.360 [171/377] Linking target lib/librte_log.so.24.1 00:02:01.360 [172/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:02:01.360 [173/377] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:01.360 [174/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:02:01.360 [175/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:02:01.360 [176/377] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.360 [177/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:01.360 [178/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:02:01.360 [179/377] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:02:01.360 [180/377] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.360 [181/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:02:01.621 [182/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:02:01.621 [183/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:02:01.621 [184/377] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.621 [185/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:02:01.621 [186/377] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:01.621 [187/377] Compiling C 
object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:01.621 [188/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:01.621 [189/377] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:01.621 [190/377] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:01.621 [191/377] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.621 [192/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:02:01.621 [193/377] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:01.621 [194/377] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:01.621 [195/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:02:01.621 [196/377] Linking static target lib/librte_hash.a 00:02:01.621 [197/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:02:01.621 [198/377] Linking static target drivers/librte_bus_auxiliary.a 00:02:01.621 [199/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:02:01.621 [200/377] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:01.621 [201/377] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:01.621 [202/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:01.621 [203/377] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:01.621 [204/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:02:01.621 [205/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:01.621 [206/377] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.621 [207/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:02:01.621 [208/377] Linking static target lib/librte_eal.a 00:02:01.621 [209/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:02:01.621 [210/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:02:01.621 [211/377] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:02:01.621 [212/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:02:01.621 [213/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:02:01.621 [214/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:02:01.621 [215/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:02:01.621 [216/377] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:02:01.621 [217/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:02:01.621 [218/377] Linking target lib/librte_telemetry.so.24.1 00:02:01.621 [219/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:02:01.621 [220/377] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:01.621 [221/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:02:01.621 [222/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 
00:02:01.621 [223/377] Linking target lib/librte_kvargs.so.24.1 00:02:01.621 [224/377] Linking static target lib/librte_compressdev.a 00:02:01.621 [225/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:02:01.621 [226/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:01.621 [227/377] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:02:01.621 [228/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:02:01.621 [229/377] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:02:01.621 [230/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:02:01.621 [231/377] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:02:01.621 [232/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:02:01.621 [233/377] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:01.621 [234/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:02:01.621 [235/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:02:01.621 [236/377] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:01.621 [237/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:02:01.621 [238/377] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.621 [239/377] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:01.621 [240/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:02:01.621 [241/377] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:01.621 [242/377] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:01.621 [243/377] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:01.621 [244/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:01.621 [245/377] Linking static target drivers/librte_bus_vdev.a 00:02:01.621 [246/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:02:01.621 [247/377] Linking static target lib/librte_cmdline.a 00:02:01.621 [248/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:02:01.879 [249/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:02:01.879 [250/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:01.879 [251/377] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:02:01.879 [252/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:02:01.879 [253/377] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.879 [254/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:02:01.879 [255/377] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:02:01.879 [256/377] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:02:01.879 [257/377] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 
00:02:01.879 [258/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:02:01.879 [259/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:02:01.879 [260/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:02:01.879 [261/377] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:02:01.879 [262/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:01.879 [263/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:02:01.879 [264/377] Linking static target drivers/libtmp_rte_compress_isal.a 00:02:01.879 [265/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:02:01.879 [266/377] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:01.879 [267/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:01.879 [268/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:02:01.880 [269/377] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.880 [270/377] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:01.880 [271/377] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:01.880 [272/377] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.880 [273/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:02:01.880 [274/377] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:02:01.880 [275/377] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:02:01.880 [276/377] Linking static target drivers/librte_mempool_ring.a 00:02:01.880 [277/377] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:02:01.880 [278/377] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:01.880 [279/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:02:01.880 [280/377] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:01.880 [281/377] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:02:01.880 [282/377] Linking static target drivers/librte_compress_mlx5.a 00:02:01.880 [283/377] Compiling C object drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:01.880 [284/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:02:02.139 [285/377] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.139 [286/377] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:02:02.139 [287/377] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:02.139 [288/377] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:02.139 [289/377] Linking static target drivers/librte_compress_isal.a 00:02:02.139 [290/377] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.139 [291/377] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:02.139 [292/377] Compiling C object 
drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:02:02.139 [293/377] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:02.139 [294/377] Linking static target drivers/libtmp_rte_common_mlx5.a 00:02:02.139 [295/377] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:02.140 [296/377] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:02.140 [297/377] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:02:02.140 [298/377] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:02:02.140 [299/377] Linking static target drivers/librte_bus_pci.a 00:02:02.140 [300/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:02.140 [301/377] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:02.140 [302/377] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:02.140 [303/377] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:02.140 [304/377] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:02.140 [305/377] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.140 [306/377] Linking static target lib/librte_ethdev.a 00:02:02.140 [307/377] Linking static target drivers/librte_crypto_mlx5.a 00:02:02.140 [308/377] Linking static target drivers/librte_crypto_ipsec_mb.a 00:02:02.398 [309/377] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:02:02.398 [310/377] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.398 [311/377] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:02.398 [312/377] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:02.398 [313/377] Linking static target drivers/librte_common_mlx5.a 00:02:02.398 [314/377] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.658 [315/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:02:02.658 [316/377] Linking static target drivers/libtmp_rte_common_qat.a 00:02:02.918 [317/377] Generating drivers/rte_common_qat.pmd.c with a custom command 00:02:02.918 [318/377] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:02.918 [319/377] Compiling C object drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:02.918 [320/377] Linking static target drivers/librte_common_qat.a 00:02:02.918 [321/377] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:03.178 [322/377] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:03.439 [323/377] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:03.439 [324/377] Linking static target lib/librte_vhost.a 00:02:03.439 [325/377] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.048 [326/377] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.590 [327/377] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output) 00:02:11.889 [328/377] 
Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:13.797 [329/377] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:13.797 [330/377] Linking target lib/librte_eal.so.24.1 00:02:13.797 [331/377] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:02:13.797 [332/377] Linking target lib/librte_ring.so.24.1 00:02:13.797 [333/377] Linking target lib/librte_dmadev.so.24.1 00:02:13.797 [334/377] Linking target lib/librte_pci.so.24.1 00:02:13.797 [335/377] Linking target lib/librte_meter.so.24.1 00:02:13.797 [336/377] Linking target lib/librte_timer.so.24.1 00:02:13.797 [337/377] Linking target drivers/librte_bus_auxiliary.so.24.1 00:02:13.797 [338/377] Linking target drivers/librte_bus_vdev.so.24.1 00:02:13.797 [339/377] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:02:13.797 [340/377] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:02:13.797 [341/377] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:02:13.797 [342/377] Generating symbol file drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:02:13.797 [343/377] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:02:13.797 [344/377] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:02:13.797 [345/377] Linking target lib/librte_rcu.so.24.1 00:02:13.797 [346/377] Linking target lib/librte_mempool.so.24.1 00:02:14.084 [347/377] Linking target drivers/librte_bus_pci.so.24.1 00:02:14.084 [348/377] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:02:14.084 [349/377] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:02:14.084 [350/377] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:02:14.084 [351/377] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:02:14.084 [352/377] Linking target lib/librte_mbuf.so.24.1 00:02:14.084 [353/377] Linking target drivers/librte_mempool_ring.so.24.1 00:02:14.345 [354/377] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:02:14.345 [355/377] Linking target lib/librte_cryptodev.so.24.1 00:02:14.345 [356/377] Linking target lib/librte_compressdev.so.24.1 00:02:14.345 [357/377] Linking target lib/librte_net.so.24.1 00:02:14.345 [358/377] Linking target lib/librte_reorder.so.24.1 00:02:14.605 [359/377] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:02:14.605 [360/377] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:02:14.605 [361/377] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:02:14.605 [362/377] Linking target lib/librte_hash.so.24.1 00:02:14.605 [363/377] Linking target lib/librte_cmdline.so.24.1 00:02:14.605 [364/377] Linking target lib/librte_security.so.24.1 00:02:14.605 [365/377] Linking target drivers/librte_compress_isal.so.24.1 00:02:14.605 [366/377] Linking target lib/librte_ethdev.so.24.1 00:02:14.605 [367/377] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:02:14.605 [368/377] Generating symbol file lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:02:14.864 [369/377] Generating symbol file lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:02:14.864 [370/377] 
Linking target drivers/librte_common_mlx5.so.24.1 00:02:14.864 [371/377] Linking target lib/librte_power.so.24.1 00:02:14.864 [372/377] Linking target lib/librte_vhost.so.24.1 00:02:14.864 [373/377] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:02:15.123 [374/377] Linking target drivers/librte_common_qat.so.24.1 00:02:15.123 [375/377] Linking target drivers/librte_compress_mlx5.so.24.1 00:02:15.123 [376/377] Linking target drivers/librte_crypto_mlx5.so.24.1 00:02:15.383 [377/377] Linking target drivers/librte_crypto_ipsec_mb.so.24.1 00:02:15.383 INFO: autodetecting backend as ninja 00:02:15.383 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 128 00:02:16.763 CC lib/ut_mock/mock.o 00:02:16.763 CC lib/ut/ut.o 00:02:16.763 CC lib/log/log.o 00:02:16.763 CC lib/log/log_flags.o 00:02:16.763 CC lib/log/log_deprecated.o 00:02:16.763 LIB libspdk_ut_mock.a 00:02:16.763 LIB libspdk_log.a 00:02:16.763 LIB libspdk_ut.a 00:02:16.763 SO libspdk_ut_mock.so.6.0 00:02:16.763 SO libspdk_ut.so.2.0 00:02:16.763 SO libspdk_log.so.7.0 00:02:16.763 SYMLINK libspdk_ut_mock.so 00:02:16.763 SYMLINK libspdk_ut.so 00:02:16.763 SYMLINK libspdk_log.so 00:02:17.334 CXX lib/trace_parser/trace.o 00:02:17.334 CC lib/ioat/ioat.o 00:02:17.334 CC lib/util/base64.o 00:02:17.334 CC lib/dma/dma.o 00:02:17.334 CC lib/util/bit_array.o 00:02:17.334 CC lib/util/cpuset.o 00:02:17.334 CC lib/util/crc16.o 00:02:17.334 CC lib/util/crc32.o 00:02:17.334 CC lib/util/crc32_ieee.o 00:02:17.334 CC lib/util/crc32c.o 00:02:17.334 CC lib/util/crc64.o 00:02:17.334 CC lib/util/dif.o 00:02:17.334 CC lib/util/fd.o 00:02:17.334 CC lib/util/file.o 00:02:17.334 CC lib/util/hexlify.o 00:02:17.334 CC lib/util/iov.o 00:02:17.334 CC lib/util/math.o 00:02:17.334 CC lib/util/pipe.o 00:02:17.334 CC lib/util/strerror_tls.o 00:02:17.334 CC lib/util/string.o 00:02:17.334 CC lib/util/uuid.o 00:02:17.334 CC lib/util/fd_group.o 00:02:17.334 CC lib/util/xor.o 00:02:17.334 CC lib/util/zipf.o 00:02:17.334 CC lib/vfio_user/host/vfio_user_pci.o 00:02:17.334 CC lib/vfio_user/host/vfio_user.o 00:02:17.595 LIB libspdk_dma.a 00:02:17.595 SO libspdk_dma.so.4.0 00:02:17.595 LIB libspdk_ioat.a 00:02:17.595 SYMLINK libspdk_dma.so 00:02:17.595 SO libspdk_ioat.so.7.0 00:02:17.595 SYMLINK libspdk_ioat.so 00:02:17.595 LIB libspdk_vfio_user.a 00:02:17.855 LIB libspdk_util.a 00:02:17.855 SO libspdk_vfio_user.so.5.0 00:02:17.855 SO libspdk_util.so.9.1 00:02:17.855 SYMLINK libspdk_vfio_user.so 00:02:17.855 SYMLINK libspdk_util.so 00:02:18.116 LIB libspdk_trace_parser.a 00:02:18.116 SO libspdk_trace_parser.so.5.0 00:02:18.116 SYMLINK libspdk_trace_parser.so 00:02:18.375 CC lib/rdma_provider/common.o 00:02:18.375 CC lib/conf/conf.o 00:02:18.375 CC lib/rdma_provider/rdma_provider_verbs.o 00:02:18.375 CC lib/env_dpdk/env.o 00:02:18.376 CC lib/reduce/reduce.o 00:02:18.376 CC lib/env_dpdk/memory.o 00:02:18.376 CC lib/env_dpdk/pci.o 00:02:18.376 CC lib/json/json_parse.o 00:02:18.376 CC lib/rdma_utils/rdma_utils.o 00:02:18.376 CC lib/env_dpdk/init.o 00:02:18.376 CC lib/json/json_util.o 00:02:18.376 CC lib/env_dpdk/threads.o 00:02:18.376 CC lib/env_dpdk/pci_ioat.o 00:02:18.376 CC lib/json/json_write.o 00:02:18.376 CC lib/env_dpdk/pci_virtio.o 00:02:18.376 CC lib/vmd/vmd.o 00:02:18.376 CC lib/env_dpdk/pci_vmd.o 00:02:18.376 CC lib/vmd/led.o 00:02:18.376 CC lib/env_dpdk/pci_idxd.o 00:02:18.376 CC lib/env_dpdk/pci_event.o 00:02:18.376 CC 
lib/env_dpdk/sigbus_handler.o 00:02:18.376 CC lib/idxd/idxd.o 00:02:18.376 CC lib/env_dpdk/pci_dpdk.o 00:02:18.376 CC lib/idxd/idxd_user.o 00:02:18.376 CC lib/idxd/idxd_kernel.o 00:02:18.376 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:18.376 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:18.636 LIB libspdk_rdma_provider.a 00:02:18.636 LIB libspdk_conf.a 00:02:18.636 LIB libspdk_rdma_utils.a 00:02:18.636 SO libspdk_conf.so.6.0 00:02:18.636 SO libspdk_rdma_provider.so.6.0 00:02:18.636 LIB libspdk_json.a 00:02:18.636 SO libspdk_rdma_utils.so.1.0 00:02:18.636 SO libspdk_json.so.6.0 00:02:18.636 SYMLINK libspdk_conf.so 00:02:18.636 SYMLINK libspdk_rdma_provider.so 00:02:18.636 SYMLINK libspdk_rdma_utils.so 00:02:18.896 SYMLINK libspdk_json.so 00:02:18.896 LIB libspdk_idxd.a 00:02:18.896 SO libspdk_idxd.so.12.0 00:02:18.896 LIB libspdk_vmd.a 00:02:18.896 SO libspdk_vmd.so.6.0 00:02:18.896 LIB libspdk_reduce.a 00:02:19.157 SO libspdk_reduce.so.6.0 00:02:19.157 SYMLINK libspdk_vmd.so 00:02:19.157 SYMLINK libspdk_idxd.so 00:02:19.157 SYMLINK libspdk_reduce.so 00:02:19.157 CC lib/jsonrpc/jsonrpc_server.o 00:02:19.157 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:19.157 CC lib/jsonrpc/jsonrpc_client.o 00:02:19.157 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:19.418 LIB libspdk_env_dpdk.a 00:02:19.679 SO libspdk_env_dpdk.so.14.1 00:02:19.679 LIB libspdk_jsonrpc.a 00:02:19.679 SYMLINK libspdk_env_dpdk.so 00:02:19.679 SO libspdk_jsonrpc.so.6.0 00:02:19.940 SYMLINK libspdk_jsonrpc.so 00:02:20.200 CC lib/rpc/rpc.o 00:02:20.461 LIB libspdk_rpc.a 00:02:20.461 SO libspdk_rpc.so.6.0 00:02:20.461 SYMLINK libspdk_rpc.so 00:02:21.032 CC lib/trace/trace.o 00:02:21.032 CC lib/trace/trace_flags.o 00:02:21.032 CC lib/trace/trace_rpc.o 00:02:21.032 CC lib/notify/notify.o 00:02:21.032 CC lib/notify/notify_rpc.o 00:02:21.032 CC lib/keyring/keyring.o 00:02:21.032 CC lib/keyring/keyring_rpc.o 00:02:21.032 LIB libspdk_notify.a 00:02:21.032 SO libspdk_notify.so.6.0 00:02:21.032 LIB libspdk_keyring.a 00:02:21.032 LIB libspdk_trace.a 00:02:21.032 SO libspdk_keyring.so.1.0 00:02:21.032 SO libspdk_trace.so.10.0 00:02:21.032 SYMLINK libspdk_notify.so 00:02:21.293 SYMLINK libspdk_keyring.so 00:02:21.293 SYMLINK libspdk_trace.so 00:02:21.553 CC lib/thread/thread.o 00:02:21.553 CC lib/thread/iobuf.o 00:02:21.553 CC lib/sock/sock.o 00:02:21.553 CC lib/sock/sock_rpc.o 00:02:21.813 LIB libspdk_sock.a 00:02:22.077 SO libspdk_sock.so.10.0 00:02:22.077 SYMLINK libspdk_sock.so 00:02:22.338 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:22.338 CC lib/nvme/nvme_ctrlr.o 00:02:22.338 CC lib/nvme/nvme_fabric.o 00:02:22.338 CC lib/nvme/nvme_ns_cmd.o 00:02:22.338 CC lib/nvme/nvme_ns.o 00:02:22.338 CC lib/nvme/nvme_pcie_common.o 00:02:22.338 CC lib/nvme/nvme_pcie.o 00:02:22.338 CC lib/nvme/nvme_qpair.o 00:02:22.338 CC lib/nvme/nvme.o 00:02:22.338 CC lib/nvme/nvme_quirks.o 00:02:22.338 CC lib/nvme/nvme_transport.o 00:02:22.338 CC lib/nvme/nvme_discovery.o 00:02:22.338 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:22.338 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:22.338 CC lib/nvme/nvme_tcp.o 00:02:22.338 CC lib/nvme/nvme_opal.o 00:02:22.338 CC lib/nvme/nvme_io_msg.o 00:02:22.338 CC lib/nvme/nvme_poll_group.o 00:02:22.338 CC lib/nvme/nvme_zns.o 00:02:22.338 CC lib/nvme/nvme_stubs.o 00:02:22.338 CC lib/nvme/nvme_auth.o 00:02:22.338 CC lib/nvme/nvme_cuse.o 00:02:22.338 CC lib/nvme/nvme_rdma.o 00:02:22.910 LIB libspdk_thread.a 00:02:22.910 SO libspdk_thread.so.10.1 00:02:22.910 SYMLINK libspdk_thread.so 00:02:23.483 CC lib/accel/accel.o 00:02:23.483 CC lib/accel/accel_rpc.o 00:02:23.483 CC 
lib/accel/accel_sw.o 00:02:23.483 CC lib/init/json_config.o 00:02:23.483 CC lib/init/subsystem.o 00:02:23.483 CC lib/init/subsystem_rpc.o 00:02:23.483 CC lib/init/rpc.o 00:02:23.483 CC lib/virtio/virtio.o 00:02:23.483 CC lib/virtio/virtio_vhost_user.o 00:02:23.483 CC lib/blob/blobstore.o 00:02:23.483 CC lib/virtio/virtio_vfio_user.o 00:02:23.483 CC lib/virtio/virtio_pci.o 00:02:23.483 CC lib/blob/request.o 00:02:23.483 CC lib/blob/zeroes.o 00:02:23.483 CC lib/blob/blob_bs_dev.o 00:02:23.483 LIB libspdk_init.a 00:02:23.743 SO libspdk_init.so.5.0 00:02:23.743 LIB libspdk_virtio.a 00:02:23.743 SO libspdk_virtio.so.7.0 00:02:23.743 SYMLINK libspdk_init.so 00:02:23.743 SYMLINK libspdk_virtio.so 00:02:24.004 CC lib/event/app.o 00:02:24.004 CC lib/event/reactor.o 00:02:24.004 CC lib/event/log_rpc.o 00:02:24.004 CC lib/event/app_rpc.o 00:02:24.004 CC lib/event/scheduler_static.o 00:02:24.004 LIB libspdk_accel.a 00:02:24.340 SO libspdk_accel.so.15.1 00:02:24.341 SYMLINK libspdk_accel.so 00:02:24.341 LIB libspdk_event.a 00:02:24.341 LIB libspdk_nvme.a 00:02:24.601 SO libspdk_event.so.14.0 00:02:24.601 SYMLINK libspdk_event.so 00:02:24.601 SO libspdk_nvme.so.13.1 00:02:24.601 CC lib/bdev/bdev.o 00:02:24.601 CC lib/bdev/bdev_rpc.o 00:02:24.601 CC lib/bdev/bdev_zone.o 00:02:24.601 CC lib/bdev/part.o 00:02:24.601 CC lib/bdev/scsi_nvme.o 00:02:24.862 SYMLINK libspdk_nvme.so 00:02:25.805 LIB libspdk_blob.a 00:02:25.805 SO libspdk_blob.so.11.0 00:02:25.805 SYMLINK libspdk_blob.so 00:02:26.377 CC lib/blobfs/blobfs.o 00:02:26.377 CC lib/lvol/lvol.o 00:02:26.377 CC lib/blobfs/tree.o 00:02:26.639 LIB libspdk_bdev.a 00:02:26.901 SO libspdk_bdev.so.15.1 00:02:26.901 SYMLINK libspdk_bdev.so 00:02:26.901 LIB libspdk_blobfs.a 00:02:26.901 SO libspdk_blobfs.so.10.0 00:02:27.162 LIB libspdk_lvol.a 00:02:27.162 SYMLINK libspdk_blobfs.so 00:02:27.162 SO libspdk_lvol.so.10.0 00:02:27.162 SYMLINK libspdk_lvol.so 00:02:27.162 CC lib/nvmf/ctrlr.o 00:02:27.162 CC lib/ftl/ftl_core.o 00:02:27.162 CC lib/nvmf/ctrlr_discovery.o 00:02:27.162 CC lib/ftl/ftl_init.o 00:02:27.162 CC lib/nvmf/ctrlr_bdev.o 00:02:27.162 CC lib/ftl/ftl_layout.o 00:02:27.162 CC lib/nvmf/subsystem.o 00:02:27.162 CC lib/ftl/ftl_debug.o 00:02:27.162 CC lib/nvmf/nvmf.o 00:02:27.162 CC lib/ftl/ftl_io.o 00:02:27.162 CC lib/ublk/ublk.o 00:02:27.162 CC lib/nvmf/nvmf_rpc.o 00:02:27.162 CC lib/ftl/ftl_sb.o 00:02:27.162 CC lib/ublk/ublk_rpc.o 00:02:27.162 CC lib/nvmf/transport.o 00:02:27.162 CC lib/ftl/ftl_l2p.o 00:02:27.162 CC lib/nvmf/tcp.o 00:02:27.162 CC lib/ftl/ftl_l2p_flat.o 00:02:27.162 CC lib/nvmf/stubs.o 00:02:27.162 CC lib/ftl/ftl_nv_cache.o 00:02:27.162 CC lib/nvmf/mdns_server.o 00:02:27.162 CC lib/scsi/dev.o 00:02:27.162 CC lib/ftl/ftl_band.o 00:02:27.162 CC lib/nvmf/auth.o 00:02:27.162 CC lib/nbd/nbd.o 00:02:27.162 CC lib/nvmf/rdma.o 00:02:27.162 CC lib/ftl/ftl_band_ops.o 00:02:27.162 CC lib/scsi/lun.o 00:02:27.162 CC lib/nbd/nbd_rpc.o 00:02:27.162 CC lib/ftl/ftl_writer.o 00:02:27.162 CC lib/scsi/port.o 00:02:27.162 CC lib/scsi/scsi.o 00:02:27.162 CC lib/ftl/ftl_rq.o 00:02:27.162 CC lib/ftl/ftl_reloc.o 00:02:27.162 CC lib/scsi/scsi_bdev.o 00:02:27.162 CC lib/ftl/ftl_l2p_cache.o 00:02:27.162 CC lib/scsi/scsi_pr.o 00:02:27.162 CC lib/scsi/scsi_rpc.o 00:02:27.162 CC lib/ftl/ftl_p2l.o 00:02:27.162 CC lib/ftl/mngt/ftl_mngt.o 00:02:27.162 CC lib/scsi/task.o 00:02:27.162 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:27.162 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:27.162 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:27.162 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:27.162 CC 
lib/ftl/mngt/ftl_mngt_misc.o 00:02:27.162 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:27.162 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:27.162 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:27.162 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:27.162 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:27.162 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:27.162 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:27.421 CC lib/ftl/utils/ftl_conf.o 00:02:27.421 CC lib/ftl/utils/ftl_md.o 00:02:27.421 CC lib/ftl/utils/ftl_mempool.o 00:02:27.421 CC lib/ftl/utils/ftl_bitmap.o 00:02:27.421 CC lib/ftl/utils/ftl_property.o 00:02:27.421 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:27.421 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:27.421 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:27.421 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:27.421 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:27.421 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:27.421 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:27.421 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:27.421 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:02:27.421 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:27.421 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:27.421 CC lib/ftl/base/ftl_base_dev.o 00:02:27.421 CC lib/ftl/base/ftl_base_bdev.o 00:02:27.421 CC lib/ftl/ftl_trace.o 00:02:27.991 LIB libspdk_scsi.a 00:02:27.991 SO libspdk_scsi.so.9.0 00:02:27.991 LIB libspdk_ublk.a 00:02:27.991 SYMLINK libspdk_scsi.so 00:02:27.991 SO libspdk_ublk.so.3.0 00:02:28.251 SYMLINK libspdk_ublk.so 00:02:28.251 LIB libspdk_nbd.a 00:02:28.251 LIB libspdk_ftl.a 00:02:28.511 SO libspdk_nbd.so.7.0 00:02:28.511 CC lib/vhost/vhost.o 00:02:28.511 CC lib/vhost/vhost_rpc.o 00:02:28.511 CC lib/vhost/vhost_scsi.o 00:02:28.511 CC lib/iscsi/conn.o 00:02:28.511 CC lib/vhost/vhost_blk.o 00:02:28.511 CC lib/iscsi/init_grp.o 00:02:28.511 CC lib/vhost/rte_vhost_user.o 00:02:28.511 CC lib/iscsi/iscsi.o 00:02:28.511 CC lib/iscsi/md5.o 00:02:28.511 CC lib/iscsi/param.o 00:02:28.511 CC lib/iscsi/portal_grp.o 00:02:28.511 CC lib/iscsi/tgt_node.o 00:02:28.511 CC lib/iscsi/iscsi_subsystem.o 00:02:28.511 CC lib/iscsi/iscsi_rpc.o 00:02:28.511 CC lib/iscsi/task.o 00:02:28.511 SYMLINK libspdk_nbd.so 00:02:28.511 SO libspdk_ftl.so.9.0 00:02:29.450 LIB libspdk_nvmf.a 00:02:29.450 SYMLINK libspdk_ftl.so 00:02:29.450 SO libspdk_nvmf.so.19.0 00:02:29.450 SYMLINK libspdk_nvmf.so 00:02:29.710 LIB libspdk_iscsi.a 00:02:29.710 SO libspdk_iscsi.so.8.0 00:02:29.710 SYMLINK libspdk_iscsi.so 00:02:30.650 LIB libspdk_vhost.a 00:02:30.650 SO libspdk_vhost.so.8.0 00:02:30.650 SYMLINK libspdk_vhost.so 00:02:31.224 CC module/env_dpdk/env_dpdk_rpc.o 00:02:31.484 LIB libspdk_env_dpdk_rpc.a 00:02:31.484 CC module/accel/ioat/accel_ioat.o 00:02:31.484 CC module/sock/posix/posix.o 00:02:31.484 CC module/accel/ioat/accel_ioat_rpc.o 00:02:31.484 CC module/blob/bdev/blob_bdev.o 00:02:31.484 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:02:31.484 CC module/accel/error/accel_error.o 00:02:31.484 CC module/accel/error/accel_error_rpc.o 00:02:31.484 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:02:31.484 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:02:31.484 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:02:31.484 CC module/keyring/linux/keyring.o 00:02:31.484 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:31.484 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:31.484 CC module/keyring/linux/keyring_rpc.o 00:02:31.484 CC module/keyring/file/keyring.o 00:02:31.484 CC module/accel/iaa/accel_iaa_rpc.o 00:02:31.484 CC module/accel/iaa/accel_iaa.o 00:02:31.485 CC 
module/accel/dsa/accel_dsa.o 00:02:31.485 CC module/keyring/file/keyring_rpc.o 00:02:31.485 CC module/accel/dsa/accel_dsa_rpc.o 00:02:31.485 CC module/scheduler/gscheduler/gscheduler.o 00:02:31.485 SO libspdk_env_dpdk_rpc.so.6.0 00:02:31.485 SYMLINK libspdk_env_dpdk_rpc.so 00:02:31.745 LIB libspdk_keyring_linux.a 00:02:31.745 LIB libspdk_keyring_file.a 00:02:31.745 LIB libspdk_scheduler_dpdk_governor.a 00:02:31.745 LIB libspdk_accel_error.a 00:02:31.745 LIB libspdk_scheduler_gscheduler.a 00:02:31.745 LIB libspdk_scheduler_dynamic.a 00:02:31.745 SO libspdk_keyring_linux.so.1.0 00:02:31.745 SO libspdk_keyring_file.so.1.0 00:02:31.745 SO libspdk_accel_error.so.2.0 00:02:31.745 SO libspdk_scheduler_dpdk_governor.so.4.0 00:02:31.745 LIB libspdk_accel_iaa.a 00:02:31.745 SO libspdk_scheduler_gscheduler.so.4.0 00:02:31.745 SO libspdk_scheduler_dynamic.so.4.0 00:02:31.745 LIB libspdk_blob_bdev.a 00:02:31.745 LIB libspdk_accel_dsa.a 00:02:31.745 SYMLINK libspdk_keyring_file.so 00:02:31.745 SO libspdk_accel_iaa.so.3.0 00:02:31.745 SYMLINK libspdk_scheduler_dpdk_governor.so 00:02:31.745 SYMLINK libspdk_accel_error.so 00:02:31.745 SYMLINK libspdk_keyring_linux.so 00:02:31.745 SO libspdk_blob_bdev.so.11.0 00:02:31.745 LIB libspdk_accel_ioat.a 00:02:31.745 SYMLINK libspdk_scheduler_gscheduler.so 00:02:31.745 SO libspdk_accel_dsa.so.5.0 00:02:31.745 SYMLINK libspdk_scheduler_dynamic.so 00:02:31.745 SYMLINK libspdk_accel_iaa.so 00:02:31.745 SO libspdk_accel_ioat.so.6.0 00:02:32.006 SYMLINK libspdk_blob_bdev.so 00:02:32.006 SYMLINK libspdk_accel_dsa.so 00:02:32.006 SYMLINK libspdk_accel_ioat.so 00:02:32.268 LIB libspdk_sock_posix.a 00:02:32.268 SO libspdk_sock_posix.so.6.0 00:02:32.268 SYMLINK libspdk_sock_posix.so 00:02:32.527 LIB libspdk_accel_dpdk_compressdev.a 00:02:32.527 CC module/bdev/gpt/gpt.o 00:02:32.527 CC module/bdev/gpt/vbdev_gpt.o 00:02:32.527 CC module/bdev/error/vbdev_error.o 00:02:32.527 CC module/bdev/delay/vbdev_delay.o 00:02:32.527 CC module/bdev/error/vbdev_error_rpc.o 00:02:32.527 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:32.527 CC module/bdev/lvol/vbdev_lvol.o 00:02:32.527 CC module/blobfs/bdev/blobfs_bdev.o 00:02:32.527 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:32.527 CC module/bdev/split/vbdev_split.o 00:02:32.527 CC module/bdev/malloc/bdev_malloc.o 00:02:32.527 SO libspdk_accel_dpdk_compressdev.so.3.0 00:02:32.527 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:32.527 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:32.527 CC module/bdev/passthru/vbdev_passthru.o 00:02:32.527 CC module/bdev/raid/bdev_raid.o 00:02:32.527 CC module/bdev/split/vbdev_split_rpc.o 00:02:32.527 CC module/bdev/raid/bdev_raid_rpc.o 00:02:32.527 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:32.527 CC module/bdev/raid/bdev_raid_sb.o 00:02:32.527 CC module/bdev/null/bdev_null.o 00:02:32.527 CC module/bdev/nvme/bdev_nvme.o 00:02:32.527 CC module/bdev/null/bdev_null_rpc.o 00:02:32.527 CC module/bdev/raid/raid0.o 00:02:32.527 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:32.527 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:32.527 CC module/bdev/raid/raid1.o 00:02:32.527 CC module/bdev/raid/concat.o 00:02:32.527 CC module/bdev/nvme/nvme_rpc.o 00:02:32.527 CC module/bdev/compress/vbdev_compress.o 00:02:32.527 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:32.527 CC module/bdev/nvme/bdev_mdns_client.o 00:02:32.527 CC module/bdev/compress/vbdev_compress_rpc.o 00:02:32.527 CC module/bdev/nvme/vbdev_opal.o 00:02:32.527 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:32.527 CC 
module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:32.527 CC module/bdev/aio/bdev_aio_rpc.o 00:02:32.527 CC module/bdev/aio/bdev_aio.o 00:02:32.527 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:32.527 CC module/bdev/iscsi/bdev_iscsi.o 00:02:32.527 CC module/bdev/ftl/bdev_ftl.o 00:02:32.527 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:32.527 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:32.527 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:02:32.527 CC module/bdev/crypto/vbdev_crypto.o 00:02:32.527 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:32.527 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:32.527 SYMLINK libspdk_accel_dpdk_compressdev.so 00:02:32.527 LIB libspdk_accel_dpdk_cryptodev.a 00:02:32.786 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:02:32.786 LIB libspdk_blobfs_bdev.a 00:02:32.786 SO libspdk_blobfs_bdev.so.6.0 00:02:32.786 LIB libspdk_bdev_split.a 00:02:32.786 LIB libspdk_bdev_null.a 00:02:32.786 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:02:32.786 LIB libspdk_bdev_error.a 00:02:32.786 LIB libspdk_bdev_gpt.a 00:02:32.786 SO libspdk_bdev_null.so.6.0 00:02:32.786 SO libspdk_bdev_split.so.6.0 00:02:32.786 SO libspdk_bdev_error.so.6.0 00:02:32.786 SYMLINK libspdk_blobfs_bdev.so 00:02:32.786 SO libspdk_bdev_gpt.so.6.0 00:02:32.786 LIB libspdk_bdev_passthru.a 00:02:32.786 LIB libspdk_bdev_ftl.a 00:02:32.786 LIB libspdk_bdev_malloc.a 00:02:32.786 SO libspdk_bdev_passthru.so.6.0 00:02:32.786 LIB libspdk_bdev_aio.a 00:02:32.786 SYMLINK libspdk_bdev_null.so 00:02:32.786 LIB libspdk_bdev_crypto.a 00:02:32.786 SYMLINK libspdk_bdev_split.so 00:02:32.786 SO libspdk_bdev_ftl.so.6.0 00:02:32.787 SYMLINK libspdk_bdev_error.so 00:02:33.048 SYMLINK libspdk_bdev_gpt.so 00:02:33.048 SO libspdk_bdev_malloc.so.6.0 00:02:33.048 LIB libspdk_bdev_compress.a 00:02:33.048 SO libspdk_bdev_aio.so.6.0 00:02:33.048 SO libspdk_bdev_crypto.so.6.0 00:02:33.048 LIB libspdk_bdev_iscsi.a 00:02:33.048 SYMLINK libspdk_bdev_passthru.so 00:02:33.048 SO libspdk_bdev_compress.so.6.0 00:02:33.048 SYMLINK libspdk_bdev_ftl.so 00:02:33.048 SO libspdk_bdev_iscsi.so.6.0 00:02:33.048 SYMLINK libspdk_bdev_malloc.so 00:02:33.048 SYMLINK libspdk_bdev_aio.so 00:02:33.048 SYMLINK libspdk_bdev_crypto.so 00:02:33.048 LIB libspdk_bdev_virtio.a 00:02:33.048 SYMLINK libspdk_bdev_compress.so 00:02:33.048 SYMLINK libspdk_bdev_iscsi.so 00:02:33.048 SO libspdk_bdev_virtio.so.6.0 00:02:33.048 SYMLINK libspdk_bdev_virtio.so 00:02:33.310 LIB libspdk_bdev_zone_block.a 00:02:33.310 SO libspdk_bdev_zone_block.so.6.0 00:02:33.310 LIB libspdk_bdev_delay.a 00:02:33.310 LIB libspdk_bdev_raid.a 00:02:33.310 SO libspdk_bdev_delay.so.6.0 00:02:33.310 SYMLINK libspdk_bdev_zone_block.so 00:02:33.310 SO libspdk_bdev_raid.so.6.0 00:02:33.570 SYMLINK libspdk_bdev_delay.so 00:02:33.570 SYMLINK libspdk_bdev_raid.so 00:02:33.570 LIB libspdk_bdev_lvol.a 00:02:33.831 SO libspdk_bdev_lvol.so.6.0 00:02:33.831 SYMLINK libspdk_bdev_lvol.so 00:02:34.401 LIB libspdk_bdev_nvme.a 00:02:34.401 SO libspdk_bdev_nvme.so.7.0 00:02:34.662 SYMLINK libspdk_bdev_nvme.so 00:02:35.235 CC module/event/subsystems/sock/sock.o 00:02:35.235 CC module/event/subsystems/vmd/vmd.o 00:02:35.235 CC module/event/subsystems/iobuf/iobuf.o 00:02:35.235 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:35.235 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:35.235 CC module/event/subsystems/scheduler/scheduler.o 00:02:35.235 CC module/event/subsystems/keyring/keyring.o 00:02:35.235 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:35.499 LIB libspdk_event_vmd.a 00:02:35.499 LIB libspdk_event_keyring.a 
00:02:35.499 LIB libspdk_event_sock.a 00:02:35.499 LIB libspdk_event_vhost_blk.a 00:02:35.499 LIB libspdk_event_scheduler.a 00:02:35.499 SO libspdk_event_vmd.so.6.0 00:02:35.499 SO libspdk_event_keyring.so.1.0 00:02:35.499 SO libspdk_event_sock.so.5.0 00:02:35.499 SO libspdk_event_vhost_blk.so.3.0 00:02:35.499 SO libspdk_event_scheduler.so.4.0 00:02:35.499 SYMLINK libspdk_event_vmd.so 00:02:35.499 SYMLINK libspdk_event_keyring.so 00:02:35.499 SYMLINK libspdk_event_sock.so 00:02:35.499 SYMLINK libspdk_event_vhost_blk.so 00:02:35.499 SYMLINK libspdk_event_scheduler.so 00:02:35.759 LIB libspdk_event_iobuf.a 00:02:35.759 SO libspdk_event_iobuf.so.3.0 00:02:35.759 SYMLINK libspdk_event_iobuf.so 00:02:36.331 CC module/event/subsystems/accel/accel.o 00:02:36.331 LIB libspdk_event_accel.a 00:02:36.331 SO libspdk_event_accel.so.6.0 00:02:36.331 SYMLINK libspdk_event_accel.so 00:02:36.903 CC module/event/subsystems/bdev/bdev.o 00:02:36.903 LIB libspdk_event_bdev.a 00:02:36.903 SO libspdk_event_bdev.so.6.0 00:02:37.164 SYMLINK libspdk_event_bdev.so 00:02:37.425 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:37.425 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:37.425 CC module/event/subsystems/nbd/nbd.o 00:02:37.425 CC module/event/subsystems/scsi/scsi.o 00:02:37.425 CC module/event/subsystems/ublk/ublk.o 00:02:37.685 LIB libspdk_event_nbd.a 00:02:37.685 LIB libspdk_event_ublk.a 00:02:37.685 LIB libspdk_event_scsi.a 00:02:37.685 SO libspdk_event_nbd.so.6.0 00:02:37.685 SO libspdk_event_scsi.so.6.0 00:02:37.685 SO libspdk_event_ublk.so.3.0 00:02:37.685 LIB libspdk_event_nvmf.a 00:02:37.685 SYMLINK libspdk_event_nbd.so 00:02:37.685 SYMLINK libspdk_event_ublk.so 00:02:37.685 SYMLINK libspdk_event_scsi.so 00:02:37.685 SO libspdk_event_nvmf.so.6.0 00:02:37.685 SYMLINK libspdk_event_nvmf.so 00:02:37.946 CC module/event/subsystems/iscsi/iscsi.o 00:02:38.206 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:38.206 LIB libspdk_event_vhost_scsi.a 00:02:38.206 LIB libspdk_event_iscsi.a 00:02:38.206 SO libspdk_event_vhost_scsi.so.3.0 00:02:38.206 SO libspdk_event_iscsi.so.6.0 00:02:38.466 SYMLINK libspdk_event_vhost_scsi.so 00:02:38.466 SYMLINK libspdk_event_iscsi.so 00:02:38.466 SO libspdk.so.6.0 00:02:38.466 SYMLINK libspdk.so 00:02:39.054 CC app/spdk_nvme_discover/discovery_aer.o 00:02:39.054 CC app/spdk_lspci/spdk_lspci.o 00:02:39.054 CC app/trace_record/trace_record.o 00:02:39.054 CC app/spdk_nvme_perf/perf.o 00:02:39.054 CC app/spdk_nvme_identify/identify.o 00:02:39.054 CXX app/trace/trace.o 00:02:39.054 CC app/spdk_top/spdk_top.o 00:02:39.054 TEST_HEADER include/spdk/accel.h 00:02:39.054 CC test/rpc_client/rpc_client_test.o 00:02:39.054 TEST_HEADER include/spdk/accel_module.h 00:02:39.054 TEST_HEADER include/spdk/assert.h 00:02:39.054 TEST_HEADER include/spdk/base64.h 00:02:39.054 TEST_HEADER include/spdk/barrier.h 00:02:39.054 TEST_HEADER include/spdk/bdev_module.h 00:02:39.054 TEST_HEADER include/spdk/bdev.h 00:02:39.054 TEST_HEADER include/spdk/bdev_zone.h 00:02:39.054 TEST_HEADER include/spdk/bit_pool.h 00:02:39.054 TEST_HEADER include/spdk/bit_array.h 00:02:39.054 TEST_HEADER include/spdk/blob_bdev.h 00:02:39.054 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:39.054 TEST_HEADER include/spdk/blobfs.h 00:02:39.054 TEST_HEADER include/spdk/blob.h 00:02:39.054 TEST_HEADER include/spdk/conf.h 00:02:39.054 TEST_HEADER include/spdk/cpuset.h 00:02:39.054 TEST_HEADER include/spdk/config.h 00:02:39.054 TEST_HEADER include/spdk/crc16.h 00:02:39.054 TEST_HEADER include/spdk/crc64.h 00:02:39.054 
TEST_HEADER include/spdk/crc32.h 00:02:39.054 TEST_HEADER include/spdk/dif.h 00:02:39.054 TEST_HEADER include/spdk/dma.h 00:02:39.054 TEST_HEADER include/spdk/endian.h 00:02:39.054 TEST_HEADER include/spdk/env_dpdk.h 00:02:39.054 TEST_HEADER include/spdk/env.h 00:02:39.054 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:39.054 TEST_HEADER include/spdk/event.h 00:02:39.054 TEST_HEADER include/spdk/fd.h 00:02:39.054 TEST_HEADER include/spdk/fd_group.h 00:02:39.054 TEST_HEADER include/spdk/file.h 00:02:39.054 TEST_HEADER include/spdk/ftl.h 00:02:39.054 CC app/nvmf_tgt/nvmf_main.o 00:02:39.054 TEST_HEADER include/spdk/hexlify.h 00:02:39.054 TEST_HEADER include/spdk/gpt_spec.h 00:02:39.054 TEST_HEADER include/spdk/histogram_data.h 00:02:39.054 CC app/spdk_dd/spdk_dd.o 00:02:39.054 TEST_HEADER include/spdk/idxd.h 00:02:39.054 TEST_HEADER include/spdk/init.h 00:02:39.054 TEST_HEADER include/spdk/idxd_spec.h 00:02:39.054 CC app/iscsi_tgt/iscsi_tgt.o 00:02:39.054 TEST_HEADER include/spdk/ioat_spec.h 00:02:39.055 TEST_HEADER include/spdk/ioat.h 00:02:39.055 TEST_HEADER include/spdk/json.h 00:02:39.055 TEST_HEADER include/spdk/iscsi_spec.h 00:02:39.055 TEST_HEADER include/spdk/jsonrpc.h 00:02:39.055 TEST_HEADER include/spdk/keyring.h 00:02:39.055 TEST_HEADER include/spdk/likely.h 00:02:39.055 TEST_HEADER include/spdk/keyring_module.h 00:02:39.055 TEST_HEADER include/spdk/log.h 00:02:39.055 CC app/spdk_tgt/spdk_tgt.o 00:02:39.055 TEST_HEADER include/spdk/lvol.h 00:02:39.055 TEST_HEADER include/spdk/memory.h 00:02:39.055 TEST_HEADER include/spdk/mmio.h 00:02:39.055 TEST_HEADER include/spdk/nbd.h 00:02:39.055 TEST_HEADER include/spdk/notify.h 00:02:39.055 TEST_HEADER include/spdk/nvme.h 00:02:39.055 TEST_HEADER include/spdk/nvme_intel.h 00:02:39.055 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:39.055 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:39.055 TEST_HEADER include/spdk/nvme_spec.h 00:02:39.055 TEST_HEADER include/spdk/nvme_zns.h 00:02:39.055 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:39.055 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:39.055 TEST_HEADER include/spdk/nvmf.h 00:02:39.055 TEST_HEADER include/spdk/nvmf_spec.h 00:02:39.055 TEST_HEADER include/spdk/nvmf_transport.h 00:02:39.055 TEST_HEADER include/spdk/opal.h 00:02:39.055 TEST_HEADER include/spdk/opal_spec.h 00:02:39.055 TEST_HEADER include/spdk/pci_ids.h 00:02:39.055 TEST_HEADER include/spdk/pipe.h 00:02:39.055 TEST_HEADER include/spdk/queue.h 00:02:39.055 TEST_HEADER include/spdk/reduce.h 00:02:39.055 TEST_HEADER include/spdk/rpc.h 00:02:39.055 TEST_HEADER include/spdk/scheduler.h 00:02:39.055 TEST_HEADER include/spdk/scsi.h 00:02:39.055 TEST_HEADER include/spdk/scsi_spec.h 00:02:39.055 TEST_HEADER include/spdk/sock.h 00:02:39.055 TEST_HEADER include/spdk/string.h 00:02:39.055 TEST_HEADER include/spdk/stdinc.h 00:02:39.055 TEST_HEADER include/spdk/trace.h 00:02:39.055 TEST_HEADER include/spdk/thread.h 00:02:39.055 TEST_HEADER include/spdk/trace_parser.h 00:02:39.055 TEST_HEADER include/spdk/tree.h 00:02:39.055 TEST_HEADER include/spdk/util.h 00:02:39.055 TEST_HEADER include/spdk/ublk.h 00:02:39.055 TEST_HEADER include/spdk/uuid.h 00:02:39.055 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:39.055 TEST_HEADER include/spdk/version.h 00:02:39.055 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:39.055 TEST_HEADER include/spdk/vhost.h 00:02:39.055 TEST_HEADER include/spdk/vmd.h 00:02:39.055 TEST_HEADER include/spdk/xor.h 00:02:39.055 TEST_HEADER include/spdk/zipf.h 00:02:39.055 CXX test/cpp_headers/accel.o 00:02:39.055 CXX 
test/cpp_headers/accel_module.o 00:02:39.055 CXX test/cpp_headers/assert.o 00:02:39.055 CXX test/cpp_headers/barrier.o 00:02:39.055 CXX test/cpp_headers/base64.o 00:02:39.055 CXX test/cpp_headers/bdev.o 00:02:39.055 CXX test/cpp_headers/bdev_module.o 00:02:39.055 CXX test/cpp_headers/bdev_zone.o 00:02:39.055 CXX test/cpp_headers/bit_array.o 00:02:39.055 CXX test/cpp_headers/blobfs_bdev.o 00:02:39.055 CXX test/cpp_headers/bit_pool.o 00:02:39.055 CXX test/cpp_headers/blob_bdev.o 00:02:39.055 CXX test/cpp_headers/blobfs.o 00:02:39.055 CXX test/cpp_headers/blob.o 00:02:39.055 CXX test/cpp_headers/crc16.o 00:02:39.055 CXX test/cpp_headers/conf.o 00:02:39.055 CXX test/cpp_headers/config.o 00:02:39.055 CXX test/cpp_headers/cpuset.o 00:02:39.055 CXX test/cpp_headers/crc32.o 00:02:39.055 CXX test/cpp_headers/crc64.o 00:02:39.055 CXX test/cpp_headers/dif.o 00:02:39.055 CXX test/cpp_headers/dma.o 00:02:39.055 CXX test/cpp_headers/endian.o 00:02:39.055 CXX test/cpp_headers/env.o 00:02:39.055 CXX test/cpp_headers/env_dpdk.o 00:02:39.055 CXX test/cpp_headers/event.o 00:02:39.055 CXX test/cpp_headers/fd_group.o 00:02:39.055 CXX test/cpp_headers/file.o 00:02:39.055 CXX test/cpp_headers/fd.o 00:02:39.055 CXX test/cpp_headers/ftl.o 00:02:39.055 CXX test/cpp_headers/hexlify.o 00:02:39.055 CXX test/cpp_headers/gpt_spec.o 00:02:39.055 CXX test/cpp_headers/idxd.o 00:02:39.055 CXX test/cpp_headers/histogram_data.o 00:02:39.055 CXX test/cpp_headers/init.o 00:02:39.055 CXX test/cpp_headers/idxd_spec.o 00:02:39.055 CXX test/cpp_headers/iscsi_spec.o 00:02:39.055 CXX test/cpp_headers/ioat.o 00:02:39.055 CXX test/cpp_headers/jsonrpc.o 00:02:39.055 CXX test/cpp_headers/ioat_spec.o 00:02:39.055 CXX test/cpp_headers/json.o 00:02:39.055 CXX test/cpp_headers/keyring.o 00:02:39.055 CXX test/cpp_headers/keyring_module.o 00:02:39.055 CXX test/cpp_headers/lvol.o 00:02:39.055 CXX test/cpp_headers/likely.o 00:02:39.055 CXX test/cpp_headers/log.o 00:02:39.055 CXX test/cpp_headers/mmio.o 00:02:39.055 CXX test/cpp_headers/memory.o 00:02:39.055 CC examples/ioat/perf/perf.o 00:02:39.055 CXX test/cpp_headers/notify.o 00:02:39.055 CXX test/cpp_headers/nvme.o 00:02:39.055 CXX test/cpp_headers/nbd.o 00:02:39.055 CXX test/cpp_headers/nvme_ocssd.o 00:02:39.055 CXX test/cpp_headers/nvme_spec.o 00:02:39.055 CXX test/cpp_headers/nvme_intel.o 00:02:39.055 CXX test/cpp_headers/nvme_zns.o 00:02:39.318 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:39.318 CXX test/cpp_headers/nvmf.o 00:02:39.318 CXX test/cpp_headers/nvmf_cmd.o 00:02:39.318 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:39.318 CC test/env/vtophys/vtophys.o 00:02:39.318 CXX test/cpp_headers/nvmf_spec.o 00:02:39.318 CXX test/cpp_headers/nvmf_transport.o 00:02:39.318 CXX test/cpp_headers/opal_spec.o 00:02:39.318 CC examples/util/zipf/zipf.o 00:02:39.318 CC test/env/pci/pci_ut.o 00:02:39.318 LINK spdk_lspci 00:02:39.318 CXX test/cpp_headers/pci_ids.o 00:02:39.318 CC examples/ioat/verify/verify.o 00:02:39.318 CXX test/cpp_headers/pipe.o 00:02:39.318 CXX test/cpp_headers/opal.o 00:02:39.318 CC test/thread/poller_perf/poller_perf.o 00:02:39.318 CXX test/cpp_headers/queue.o 00:02:39.318 CXX test/cpp_headers/reduce.o 00:02:39.318 CXX test/cpp_headers/rpc.o 00:02:39.318 CXX test/cpp_headers/scheduler.o 00:02:39.318 CXX test/cpp_headers/scsi.o 00:02:39.318 CXX test/cpp_headers/scsi_spec.o 00:02:39.318 CXX test/cpp_headers/sock.o 00:02:39.318 CXX test/cpp_headers/stdinc.o 00:02:39.318 CXX test/cpp_headers/string.o 00:02:39.318 CXX test/cpp_headers/thread.o 00:02:39.318 CXX test/cpp_headers/trace.o 
00:02:39.319 CXX test/cpp_headers/trace_parser.o 00:02:39.319 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:39.319 CXX test/cpp_headers/tree.o 00:02:39.319 CXX test/cpp_headers/ublk.o 00:02:39.319 CXX test/cpp_headers/util.o 00:02:39.319 CXX test/cpp_headers/uuid.o 00:02:39.319 CXX test/cpp_headers/vfio_user_pci.o 00:02:39.319 CC test/env/memory/memory_ut.o 00:02:39.319 CXX test/cpp_headers/version.o 00:02:39.319 CC app/fio/nvme/fio_plugin.o 00:02:39.319 LINK spdk_nvme_discover 00:02:39.319 CXX test/cpp_headers/xor.o 00:02:39.319 CXX test/cpp_headers/vhost.o 00:02:39.319 CXX test/cpp_headers/vfio_user_spec.o 00:02:39.319 CC test/app/jsoncat/jsoncat.o 00:02:39.319 CXX test/cpp_headers/vmd.o 00:02:39.319 CXX test/cpp_headers/zipf.o 00:02:39.319 CC test/dma/test_dma/test_dma.o 00:02:39.319 CC test/app/stub/stub.o 00:02:39.319 CC test/app/histogram_perf/histogram_perf.o 00:02:39.319 CC app/fio/bdev/fio_plugin.o 00:02:39.319 LINK rpc_client_test 00:02:39.589 CC test/app/bdev_svc/bdev_svc.o 00:02:39.589 LINK interrupt_tgt 00:02:39.589 LINK vtophys 00:02:39.589 LINK spdk_trace_record 00:02:39.589 LINK spdk_tgt 00:02:39.589 LINK nvmf_tgt 00:02:39.856 LINK histogram_perf 00:02:39.856 LINK iscsi_tgt 00:02:39.856 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:39.856 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:39.856 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:39.856 LINK spdk_dd 00:02:39.856 CC test/env/mem_callbacks/mem_callbacks.o 00:02:39.856 LINK zipf 00:02:39.856 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:40.115 LINK poller_perf 00:02:40.115 LINK jsoncat 00:02:40.115 LINK verify 00:02:40.375 LINK ioat_perf 00:02:40.375 LINK env_dpdk_post_init 00:02:40.375 LINK stub 00:02:40.375 LINK bdev_svc 00:02:40.376 LINK spdk_trace 00:02:40.635 LINK pci_ut 00:02:40.635 LINK spdk_nvme_identify 00:02:40.635 LINK test_dma 00:02:40.635 CC examples/vmd/led/led.o 00:02:40.635 CC examples/idxd/perf/perf.o 00:02:40.635 LINK spdk_bdev 00:02:40.635 CC examples/vmd/lsvmd/lsvmd.o 00:02:40.635 CC examples/sock/hello_world/hello_sock.o 00:02:40.635 CC examples/thread/thread/thread_ex.o 00:02:40.635 LINK nvme_fuzz 00:02:40.635 LINK vhost_fuzz 00:02:40.635 CC test/event/event_perf/event_perf.o 00:02:40.635 CC test/event/reactor_perf/reactor_perf.o 00:02:40.635 LINK spdk_nvme_perf 00:02:40.636 CC test/event/reactor/reactor.o 00:02:40.636 CC test/event/app_repeat/app_repeat.o 00:02:40.895 CC test/event/scheduler/scheduler.o 00:02:40.895 LINK lsvmd 00:02:40.895 LINK mem_callbacks 00:02:40.895 LINK spdk_top 00:02:40.895 CC app/vhost/vhost.o 00:02:40.895 LINK spdk_nvme 00:02:40.895 LINK hello_sock 00:02:40.895 LINK reactor_perf 00:02:40.895 LINK event_perf 00:02:40.895 LINK reactor 00:02:40.895 LINK thread 00:02:40.895 LINK idxd_perf 00:02:40.895 LINK app_repeat 00:02:40.895 LINK led 00:02:41.155 LINK scheduler 00:02:41.155 LINK vhost 00:02:41.155 CC test/nvme/overhead/overhead.o 00:02:41.155 CC test/nvme/sgl/sgl.o 00:02:41.155 CC test/nvme/reset/reset.o 00:02:41.155 CC test/nvme/err_injection/err_injection.o 00:02:41.155 CC test/nvme/e2edp/nvme_dp.o 00:02:41.155 CC test/blobfs/mkfs/mkfs.o 00:02:41.155 CC test/nvme/startup/startup.o 00:02:41.155 CC test/nvme/cuse/cuse.o 00:02:41.155 CC test/nvme/connect_stress/connect_stress.o 00:02:41.155 CC test/nvme/aer/aer.o 00:02:41.155 CC test/nvme/boot_partition/boot_partition.o 00:02:41.155 CC test/nvme/reserve/reserve.o 00:02:41.155 CC test/nvme/simple_copy/simple_copy.o 00:02:41.155 CC test/nvme/fdp/fdp.o 00:02:41.155 LINK memory_ut 00:02:41.155 CC 
test/nvme/compliance/nvme_compliance.o 00:02:41.155 CC test/nvme/fused_ordering/fused_ordering.o 00:02:41.155 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:41.155 CC test/accel/dif/dif.o 00:02:41.415 CC test/lvol/esnap/esnap.o 00:02:41.415 LINK boot_partition 00:02:41.415 LINK connect_stress 00:02:41.415 LINK err_injection 00:02:41.415 LINK doorbell_aers 00:02:41.415 LINK startup 00:02:41.415 LINK fused_ordering 00:02:41.415 LINK mkfs 00:02:41.415 CC examples/nvme/hello_world/hello_world.o 00:02:41.415 LINK reserve 00:02:41.415 CC examples/nvme/arbitration/arbitration.o 00:02:41.415 CC examples/nvme/reconnect/reconnect.o 00:02:41.415 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:41.415 CC examples/nvme/abort/abort.o 00:02:41.415 CC examples/nvme/hotplug/hotplug.o 00:02:41.415 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:41.415 LINK nvme_dp 00:02:41.415 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:41.415 LINK simple_copy 00:02:41.415 LINK sgl 00:02:41.415 LINK reset 00:02:41.415 LINK overhead 00:02:41.415 LINK aer 00:02:41.415 LINK nvme_compliance 00:02:41.415 LINK fdp 00:02:41.415 LINK iscsi_fuzz 00:02:41.689 CC examples/blob/hello_world/hello_blob.o 00:02:41.689 CC examples/accel/perf/accel_perf.o 00:02:41.689 CC examples/blob/cli/blobcli.o 00:02:41.689 LINK dif 00:02:41.689 LINK pmr_persistence 00:02:41.689 LINK hotplug 00:02:41.689 LINK cmb_copy 00:02:41.689 LINK hello_world 00:02:41.689 LINK arbitration 00:02:41.689 LINK reconnect 00:02:41.965 LINK abort 00:02:41.965 LINK hello_blob 00:02:41.965 LINK nvme_manage 00:02:41.965 LINK accel_perf 00:02:41.965 LINK blobcli 00:02:42.225 LINK cuse 00:02:42.485 CC test/bdev/bdevio/bdevio.o 00:02:42.746 CC examples/bdev/bdevperf/bdevperf.o 00:02:42.746 CC examples/bdev/hello_world/hello_bdev.o 00:02:42.746 LINK bdevio 00:02:43.008 LINK hello_bdev 00:02:43.270 LINK bdevperf 00:02:43.841 CC examples/nvmf/nvmf/nvmf.o 00:02:44.412 LINK nvmf 00:02:45.353 LINK esnap 00:02:45.925 00:02:45.925 real 1m20.195s 00:02:45.925 user 13m57.154s 00:02:45.925 sys 6m55.278s 00:02:45.925 17:14:56 make -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:02:45.925 17:14:56 make -- common/autotest_common.sh@10 -- $ set +x 00:02:45.925 ************************************ 00:02:45.925 END TEST make 00:02:45.925 ************************************ 00:02:45.925 17:14:57 -- common/autotest_common.sh@1142 -- $ return 0 00:02:45.925 17:14:57 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:02:45.925 17:14:57 -- pm/common@29 -- $ signal_monitor_resources TERM 00:02:45.925 17:14:57 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:02:45.925 17:14:57 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:45.925 17:14:57 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:02:45.925 17:14:57 -- pm/common@44 -- $ pid=2562847 00:02:45.925 17:14:57 -- pm/common@50 -- $ kill -TERM 2562847 00:02:45.925 17:14:57 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:45.925 17:14:57 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:02:45.925 17:14:57 -- pm/common@44 -- $ pid=2562848 00:02:45.925 17:14:57 -- pm/common@50 -- $ kill -TERM 2562848 00:02:45.925 17:14:57 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:45.925 17:14:57 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:02:45.925 17:14:57 -- 
pm/common@44 -- $ pid=2562850 00:02:45.925 17:14:57 -- pm/common@50 -- $ kill -TERM 2562850 00:02:45.925 17:14:57 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:45.925 17:14:57 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:02:45.925 17:14:57 -- pm/common@44 -- $ pid=2562874 00:02:45.925 17:14:57 -- pm/common@50 -- $ sudo -E kill -TERM 2562874 00:02:45.925 17:14:57 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:02:45.925 17:14:57 -- nvmf/common.sh@7 -- # uname -s 00:02:45.925 17:14:57 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:02:45.925 17:14:57 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:02:45.925 17:14:57 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:02:45.925 17:14:57 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:02:45.925 17:14:57 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:02:45.925 17:14:57 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:02:45.925 17:14:57 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:02:45.925 17:14:57 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:02:45.925 17:14:57 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:02:45.925 17:14:57 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:02:45.925 17:14:57 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:02:45.926 17:14:57 -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:02:45.926 17:14:57 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:02:45.926 17:14:57 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:02:45.926 17:14:57 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:02:45.926 17:14:57 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:02:45.926 17:14:57 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:02:45.926 17:14:57 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:02:45.926 17:14:57 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:45.926 17:14:57 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:45.926 17:14:57 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:45.926 17:14:57 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:45.926 17:14:57 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:45.926 17:14:57 -- paths/export.sh@5 -- # export PATH 00:02:45.926 17:14:57 -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:45.926 17:14:57 -- nvmf/common.sh@47 -- # : 0 00:02:45.926 17:14:57 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:02:45.926 17:14:57 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:02:45.926 17:14:57 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:02:45.926 17:14:57 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:02:45.926 17:14:57 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:02:45.926 17:14:57 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:02:45.926 17:14:57 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:02:45.926 17:14:57 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:02:45.926 17:14:57 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:02:45.926 17:14:57 -- spdk/autotest.sh@32 -- # uname -s 00:02:45.926 17:14:57 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:02:45.926 17:14:57 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:02:45.926 17:14:57 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:02:45.926 17:14:57 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:02:45.926 17:14:57 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:02:45.926 17:14:57 -- spdk/autotest.sh@44 -- # modprobe nbd 00:02:45.926 17:14:57 -- spdk/autotest.sh@46 -- # type -P udevadm 00:02:45.926 17:14:57 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:02:45.926 17:14:57 -- spdk/autotest.sh@48 -- # udevadm_pid=2631225 00:02:45.926 17:14:57 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:02:45.926 17:14:57 -- pm/common@17 -- # local monitor 00:02:45.926 17:14:57 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:02:45.926 17:14:57 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:45.926 17:14:57 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:45.926 17:14:57 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:45.926 17:14:57 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:45.926 17:14:57 -- pm/common@21 -- # date +%s 00:02:45.926 17:14:57 -- pm/common@25 -- # sleep 1 00:02:45.926 17:14:57 -- pm/common@21 -- # date +%s 00:02:45.926 17:14:57 -- pm/common@21 -- # date +%s 00:02:45.926 17:14:57 -- pm/common@21 -- # date +%s 00:02:45.926 17:14:57 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721056497 00:02:45.926 17:14:57 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721056497 00:02:45.926 17:14:57 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721056497 00:02:45.926 17:14:57 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721056497 00:02:46.187 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721056497_collect-vmstat.pm.log 00:02:46.187 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721056497_collect-cpu-load.pm.log 00:02:46.187 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721056497_collect-cpu-temp.pm.log 00:02:46.187 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721056497_collect-bmc-pm.bmc.pm.log 00:02:47.128 17:14:58 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:02:47.128 17:14:58 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:02:47.128 17:14:58 -- common/autotest_common.sh@722 -- # xtrace_disable 00:02:47.128 17:14:58 -- common/autotest_common.sh@10 -- # set +x 00:02:47.128 17:14:58 -- spdk/autotest.sh@59 -- # create_test_list 00:02:47.128 17:14:58 -- common/autotest_common.sh@746 -- # xtrace_disable 00:02:47.128 17:14:58 -- common/autotest_common.sh@10 -- # set +x 00:02:47.128 17:14:58 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:02:47.128 17:14:58 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:47.128 17:14:58 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:47.128 17:14:58 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:02:47.128 17:14:58 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:47.128 17:14:58 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:02:47.128 17:14:58 -- common/autotest_common.sh@1455 -- # uname 00:02:47.128 17:14:58 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:02:47.128 17:14:58 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:02:47.128 17:14:58 -- common/autotest_common.sh@1475 -- # uname 00:02:47.128 17:14:58 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:02:47.128 17:14:58 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:02:47.128 17:14:58 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:02:47.128 17:14:58 -- spdk/autotest.sh@72 -- # hash lcov 00:02:47.128 17:14:58 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:02:47.128 17:14:58 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:02:47.128 --rc lcov_branch_coverage=1 00:02:47.128 --rc lcov_function_coverage=1 00:02:47.128 --rc genhtml_branch_coverage=1 00:02:47.128 --rc genhtml_function_coverage=1 00:02:47.128 --rc genhtml_legend=1 00:02:47.128 --rc geninfo_all_blocks=1 00:02:47.128 ' 00:02:47.128 17:14:58 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:02:47.128 --rc lcov_branch_coverage=1 00:02:47.128 --rc lcov_function_coverage=1 00:02:47.128 --rc genhtml_branch_coverage=1 00:02:47.128 --rc genhtml_function_coverage=1 00:02:47.128 --rc genhtml_legend=1 00:02:47.128 --rc geninfo_all_blocks=1 00:02:47.128 ' 00:02:47.128 17:14:58 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:02:47.128 --rc lcov_branch_coverage=1 00:02:47.128 --rc lcov_function_coverage=1 00:02:47.128 --rc genhtml_branch_coverage=1 00:02:47.128 --rc genhtml_function_coverage=1 00:02:47.128 --rc genhtml_legend=1 00:02:47.128 --rc geninfo_all_blocks=1 00:02:47.128 --no-external' 00:02:47.128 17:14:58 -- spdk/autotest.sh@81 -- # 
LCOV='lcov 00:02:47.128 --rc lcov_branch_coverage=1 00:02:47.128 --rc lcov_function_coverage=1 00:02:47.128 --rc genhtml_branch_coverage=1 00:02:47.128 --rc genhtml_function_coverage=1 00:02:47.128 --rc genhtml_legend=1 00:02:47.128 --rc geninfo_all_blocks=1 00:02:47.128 --no-external' 00:02:47.128 17:14:58 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:02:47.128 lcov: LCOV version 1.14 00:02:47.128 17:14:58 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:02:57.122 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:02:57.122 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:03:05.262 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:03:05.262 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:03:05.262 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:03:05.262 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:03:05.262 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:03:05.262 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:03:05.262 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:03:05.262 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:03:05.262 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:03:05.262 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:03:05.262 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:03:05.262 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:03:05.262 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:03:05.262 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:03:05.262 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:03:05.262 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:03:05.262 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:03:05.262 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:03:05.262 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:03:05.262 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:03:05.262 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:03:05.262 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:03:05.262 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:03:05.262 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:03:05.262 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:03:05.262 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:03:05.262 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:03:05.262 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:03:05.262 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:03:05.262 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:03:05.262 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:03:05.262 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:03:05.262 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:03:05.262 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:03:05.262 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:03:05.262 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:03:05.262 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:03:05.262 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:03:05.262 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:03:05.262 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:03:05.262 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:03:05.262 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:03:05.262 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:03:05.262 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:03:05.262 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:03:05.262 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:03:05.262 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:03:05.263 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:03:05.263 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:03:05.263 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:03:05.263 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:03:15.299 17:15:25 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:03:15.299 17:15:25 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:15.299 17:15:25 -- common/autotest_common.sh@10 -- # set +x 00:03:15.299 17:15:25 -- spdk/autotest.sh@91 -- # rm -f 00:03:15.299 17:15:25 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:18.598 0000:80:01.6 (8086 0b00): Already using the ioatdma driver 00:03:18.598 0000:80:01.7 (8086 0b00): Already using the ioatdma driver 00:03:18.598 0000:80:01.4 (8086 0b00): Already using the ioatdma driver 00:03:18.598 0000:80:01.5 (8086 0b00): Already using the ioatdma driver 00:03:18.598 0000:80:01.2 (8086 0b00): Already using the ioatdma driver 00:03:18.598 0000:80:01.3 (8086 0b00): Already using the ioatdma driver 00:03:18.598 0000:80:01.0 (8086 0b00): Already using the ioatdma driver 00:03:18.598 0000:80:01.1 (8086 0b00): Already using the ioatdma driver 00:03:18.598 0000:65:00.0 (8086 0a54): Already using the nvme driver 00:03:18.598 0000:00:01.6 (8086 0b00): Already using the ioatdma driver 00:03:18.598 0000:00:01.7 (8086 0b00): Already using the ioatdma driver 
00:03:18.598 0000:00:01.4 (8086 0b00): Already using the ioatdma driver 00:03:18.859 0000:00:01.5 (8086 0b00): Already using the ioatdma driver 00:03:18.859 0000:00:01.2 (8086 0b00): Already using the ioatdma driver 00:03:18.859 0000:00:01.3 (8086 0b00): Already using the ioatdma driver 00:03:18.859 0000:00:01.0 (8086 0b00): Already using the ioatdma driver 00:03:18.859 0000:00:01.1 (8086 0b00): Already using the ioatdma driver 00:03:18.859 17:15:30 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:03:18.859 17:15:30 -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:03:18.859 17:15:30 -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:03:18.859 17:15:30 -- common/autotest_common.sh@1670 -- # local nvme bdf 00:03:18.859 17:15:30 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:18.859 17:15:30 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:03:18.859 17:15:30 -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:03:18.859 17:15:30 -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:18.859 17:15:30 -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:03:18.859 17:15:30 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:03:18.859 17:15:30 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:03:18.859 17:15:30 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:03:18.859 17:15:30 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:03:18.859 17:15:30 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:03:18.859 17:15:30 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:18.859 No valid GPT data, bailing 00:03:18.859 17:15:30 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:18.859 17:15:30 -- scripts/common.sh@391 -- # pt= 00:03:18.859 17:15:30 -- scripts/common.sh@392 -- # return 1 00:03:18.859 17:15:30 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:18.859 1+0 records in 00:03:18.859 1+0 records out 00:03:18.859 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00703704 s, 149 MB/s 00:03:18.859 17:15:30 -- spdk/autotest.sh@118 -- # sync 00:03:18.859 17:15:30 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:18.859 17:15:30 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:18.859 17:15:30 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:27.077 17:15:37 -- spdk/autotest.sh@124 -- # uname -s 00:03:27.077 17:15:37 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:03:27.077 17:15:37 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:27.077 17:15:37 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:27.077 17:15:37 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:27.077 17:15:37 -- common/autotest_common.sh@10 -- # set +x 00:03:27.077 ************************************ 00:03:27.077 START TEST setup.sh 00:03:27.077 ************************************ 00:03:27.077 17:15:37 setup.sh -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:27.077 * Looking for test storage... 
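The block above shows autotest's pre-cleanup wiping the unclaimed test namespace: spdk-gpt.py and blkid find no partition table on /dev/nvme0n1 ("No valid GPT data, bailing"), so the drive's first megabyte is zero-filled before the functional tests start. Below is a minimal standalone sketch of that check-then-wipe pattern; it reuses the same blkid/dd/sync calls the log records, and the device path and 1 MiB fill size are taken from this particular run rather than being general defaults.

    #!/usr/bin/env bash
    # Wipe a test NVMe namespace only if it carries no partition table.
    dev=/dev/nvme0n1   # device under test in this run

    # blkid prints the partition-table type (e.g. "gpt") or nothing at all;
    # it exits non-zero when it finds nothing, hence the "|| true".
    pt=$(blkid -s PTTYPE -o value "$dev" || true)

    if [[ -z "$pt" ]]; then
        # No valid GPT/MBR metadata: zero the first 1 MiB so stale data
        # cannot confuse the block-device tests that follow.
        dd if=/dev/zero of="$dev" bs=1M count=1
        sync
    else
        echo "refusing to wipe $dev: found $pt partition table" >&2
    fi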
00:03:27.077 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:27.077 17:15:37 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:03:27.077 17:15:37 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:27.077 17:15:37 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:27.077 17:15:37 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:27.077 17:15:37 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:27.077 17:15:37 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:27.077 ************************************ 00:03:27.077 START TEST acl 00:03:27.077 ************************************ 00:03:27.077 17:15:37 setup.sh.acl -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:27.077 * Looking for test storage... 00:03:27.077 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:27.077 17:15:37 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:03:27.077 17:15:37 setup.sh.acl -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:03:27.077 17:15:37 setup.sh.acl -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:03:27.077 17:15:37 setup.sh.acl -- common/autotest_common.sh@1670 -- # local nvme bdf 00:03:27.077 17:15:37 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:27.077 17:15:37 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:03:27.077 17:15:37 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:03:27.077 17:15:37 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:27.077 17:15:37 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:03:27.077 17:15:37 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:03:27.077 17:15:37 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:03:27.077 17:15:37 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:03:27.077 17:15:37 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:03:27.077 17:15:37 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:03:27.077 17:15:37 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:27.077 17:15:37 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:31.282 17:15:41 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:03:31.282 17:15:41 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:03:31.282 17:15:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:31.282 17:15:41 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:03:31.282 17:15:41 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:03:31.282 17:15:41 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:03:34.601 Hugepages 00:03:34.601 node hugesize free / total 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 
00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.601 00:03:34.601 Type BDF Vendor Device NUMA Driver Device Block devices 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.0 == *:*:*.* ]] 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.1 == *:*:*.* ]] 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.2 == *:*:*.* ]] 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.3 == *:*:*.* ]] 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.4 == *:*:*.* ]] 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.5 == *:*:*.* ]] 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.6 == *:*:*.* ]] 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.7 == *:*:*.* ]] 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:65:00.0 == *:*:*.* ]] 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:34.601 17:15:45 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\6\5\:\0\0\.\0* ]] 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:34.602 17:15:45 setup.sh.acl -- 
setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.0 == *:*:*.* ]] 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.1 == *:*:*.* ]] 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.2 == *:*:*.* ]] 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.3 == *:*:*.* ]] 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.4 == *:*:*.* ]] 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.5 == *:*:*.* ]] 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.6 == *:*:*.* ]] 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.7 == *:*:*.* ]] 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:34.602 17:15:45 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:03:34.602 17:15:45 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:34.602 17:15:45 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:34.602 17:15:45 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:34.602 ************************************ 00:03:34.602 START TEST denied 00:03:34.602 ************************************ 00:03:34.602 17:15:45 setup.sh.acl.denied -- common/autotest_common.sh@1123 -- # denied 00:03:34.602 17:15:45 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:65:00.0' 00:03:34.602 17:15:45 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 
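The denied/allowed ACL tests that follow drive SPDK's scripts/setup.sh through environment variables: PCI_BLOCKED keeps a controller away from the userspace drivers (the test greps for the resulting "Skipping denied controller" line, visible just below), while PCI_ALLOWED later restricts binding to that one controller (nvme -> vfio-pci). A hedged sketch of that pattern, reusing the BDF from this run and assuming setup.sh is invoked from the repository root as in this log:

    # Deny the NVMe controller at 0000:65:00.0, then confirm setup.sh skipped it.
    export PCI_BLOCKED=' 0000:65:00.0'
    ./scripts/setup.sh config | grep 'Skipping denied controller at 0000:65:00.0'
    unset PCI_BLOCKED

    # Allow only that controller, so setup.sh rebinds it and nothing else.
    export PCI_ALLOWED='0000:65:00.0'
    ./scripts/setup.sh config
    ./scripts/setup.sh reset   # return devices to their kernel drivers afterwards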
00:03:34.602 17:15:45 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:65:00.0' 00:03:34.602 17:15:45 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:03:34.602 17:15:45 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:38.805 0000:65:00.0 (8086 0a54): Skipping denied controller at 0000:65:00.0 00:03:38.805 17:15:49 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:65:00.0 00:03:38.805 17:15:49 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:03:38.805 17:15:49 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:03:38.805 17:15:49 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:65:00.0 ]] 00:03:38.805 17:15:49 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:65:00.0/driver 00:03:38.805 17:15:49 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:38.805 17:15:49 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:38.805 17:15:49 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:03:38.805 17:15:49 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:38.805 17:15:49 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:44.090 00:03:44.090 real 0m9.311s 00:03:44.090 user 0m3.037s 00:03:44.090 sys 0m5.544s 00:03:44.090 17:15:55 setup.sh.acl.denied -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:44.090 17:15:55 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:03:44.090 ************************************ 00:03:44.090 END TEST denied 00:03:44.090 ************************************ 00:03:44.090 17:15:55 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:03:44.090 17:15:55 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:03:44.090 17:15:55 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:44.090 17:15:55 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:44.090 17:15:55 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:44.090 ************************************ 00:03:44.090 START TEST allowed 00:03:44.090 ************************************ 00:03:44.090 17:15:55 setup.sh.acl.allowed -- common/autotest_common.sh@1123 -- # allowed 00:03:44.090 17:15:55 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:65:00.0 00:03:44.090 17:15:55 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:03:44.090 17:15:55 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:65:00.0 .*: nvme -> .*' 00:03:44.090 17:15:55 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:03:44.090 17:15:55 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:50.665 0000:65:00.0 (8086 0a54): nvme -> vfio-pci 00:03:50.665 17:16:01 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:03:50.665 17:16:01 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:03:50.665 17:16:01 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:03:50.665 17:16:01 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:50.665 17:16:01 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:54.886 00:03:54.886 real 0m10.054s 00:03:54.886 
user 0m3.051s 00:03:54.886 sys 0m5.362s 00:03:54.886 17:16:05 setup.sh.acl.allowed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:54.886 17:16:05 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:03:54.886 ************************************ 00:03:54.886 END TEST allowed 00:03:54.886 ************************************ 00:03:54.886 17:16:05 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:03:54.886 00:03:54.886 real 0m27.892s 00:03:54.886 user 0m9.130s 00:03:54.886 sys 0m16.617s 00:03:54.886 17:16:05 setup.sh.acl -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:54.886 17:16:05 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:54.886 ************************************ 00:03:54.886 END TEST acl 00:03:54.886 ************************************ 00:03:54.886 17:16:05 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:03:54.886 17:16:05 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:03:54.886 17:16:05 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:54.886 17:16:05 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:54.886 17:16:05 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:54.886 ************************************ 00:03:54.886 START TEST hugepages 00:03:54.886 ************************************ 00:03:54.886 17:16:05 setup.sh.hugepages -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:03:54.886 * Looking for test storage... 00:03:54.886 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:54.886 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:03:54.886 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:03:54.886 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:03:54.886 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:03:54.886 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:03:54.886 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:03:54.886 17:16:05 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:03:54.886 17:16:05 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:03:54.886 17:16:05 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:03:54.886 17:16:05 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:03:54.886 17:16:05 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 108252684 kB' 'MemAvailable: 111470324 kB' 'Buffers: 2704 kB' 'Cached: 9622632 kB' 'SwapCached: 0 kB' 'Active: 6657616 kB' 'Inactive: 3510668 kB' 'Active(anon): 6257628 kB' 
'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 546204 kB' 'Mapped: 173052 kB' 'Shmem: 5714680 kB' 'KReclaimable: 262656 kB' 'Slab: 929600 kB' 'SReclaimable: 262656 kB' 'SUnreclaim: 666944 kB' 'KernelStack: 24896 kB' 'PageTables: 8372 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 69463468 kB' 'Committed_AS: 7797528 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229652 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 
setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- 
setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.887 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:03:54.888 
17:16:05 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:54.888 17:16:05 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:03:54.888 17:16:05 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:54.888 17:16:05 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:54.888 17:16:05 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:54.888 ************************************ 00:03:54.888 START TEST default_setup 00:03:54.888 ************************************ 00:03:54.888 17:16:05 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1123 -- # default_setup 00:03:54.888 17:16:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:03:54.888 17:16:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:03:54.888 17:16:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:54.888 17:16:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:03:54.888 17:16:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:54.888 17:16:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 00:03:54.888 17:16:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:54.888 17:16:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:54.888 17:16:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:54.888 17:16:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:54.888 17:16:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:03:54.888 17:16:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:54.888 17:16:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:54.888 17:16:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:54.889 17:16:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:54.889 17:16:05 
setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:54.889 17:16:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:54.889 17:16:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:54.889 17:16:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:03:54.889 17:16:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:03:54.889 17:16:05 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:03:54.889 17:16:05 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:58.253 0000:80:01.6 (8086 0b00): ioatdma -> vfio-pci 00:03:58.253 0000:80:01.7 (8086 0b00): ioatdma -> vfio-pci 00:03:58.513 0000:80:01.4 (8086 0b00): ioatdma -> vfio-pci 00:03:58.513 0000:80:01.5 (8086 0b00): ioatdma -> vfio-pci 00:03:58.513 0000:80:01.2 (8086 0b00): ioatdma -> vfio-pci 00:03:58.513 0000:80:01.3 (8086 0b00): ioatdma -> vfio-pci 00:03:58.513 0000:80:01.0 (8086 0b00): ioatdma -> vfio-pci 00:03:58.513 0000:80:01.1 (8086 0b00): ioatdma -> vfio-pci 00:03:58.513 0000:00:01.6 (8086 0b00): ioatdma -> vfio-pci 00:03:58.513 0000:00:01.7 (8086 0b00): ioatdma -> vfio-pci 00:03:58.513 0000:00:01.4 (8086 0b00): ioatdma -> vfio-pci 00:03:58.513 0000:00:01.5 (8086 0b00): ioatdma -> vfio-pci 00:03:58.513 0000:00:01.2 (8086 0b00): ioatdma -> vfio-pci 00:03:58.513 0000:00:01.3 (8086 0b00): ioatdma -> vfio-pci 00:03:58.513 0000:00:01.0 (8086 0b00): ioatdma -> vfio-pci 00:03:58.513 0000:00:01.1 (8086 0b00): ioatdma -> vfio-pci 00:04:00.427 0000:65:00.0 (8086 0a54): nvme -> vfio-pci 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:00.427 17:16:11 
setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110434104 kB' 'MemAvailable: 113651708 kB' 'Buffers: 2704 kB' 'Cached: 9622780 kB' 'SwapCached: 0 kB' 'Active: 6674616 kB' 'Inactive: 3510668 kB' 'Active(anon): 6274628 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563412 kB' 'Mapped: 171892 kB' 'Shmem: 5714828 kB' 'KReclaimable: 262584 kB' 'Slab: 926696 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664112 kB' 'KernelStack: 24896 kB' 'PageTables: 8500 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7817772 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229604 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB' 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.427 17:16:11 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.427 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ 
Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 
00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110435700 kB' 'MemAvailable: 113653304 kB' 'Buffers: 2704 kB' 'Cached: 9622780 kB' 'SwapCached: 0 kB' 'Active: 6674796 kB' 'Inactive: 3510668 kB' 'Active(anon): 6274808 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563632 kB' 'Mapped: 171892 kB' 'Shmem: 5714828 kB' 'KReclaimable: 262584 kB' 'Slab: 926688 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664104 kB' 'KernelStack: 24864 kB' 'PageTables: 8388 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7832804 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229588 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup 
-- setup/common.sh@32 -- # continue 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.428 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
[xtrace condensed: the same continue / IFS=': ' / read -r var val _ cycle repeats for every remaining /proc/meminfo field until the requested HugePages_Surp line is reached]
00:04:00.429 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.429 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:00.429 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0
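The scan condensed above is just a keyed lookup over /proc/meminfo: split each "Field: value" line on ': ' and print the value of the requested field. A minimal sketch of such a get_meminfo-style helper, reconstructed from the xtrace rather than copied from SPDK's setup/common.sh (the node handling and names are illustrative):

    #!/usr/bin/env bash
    # Sketch of a get_meminfo-style lookup, reconstructed from the trace above.
    # Usage: get_meminfo <field> [node]    e.g. get_meminfo HugePages_Surp 0
    shopt -s extglob

    get_meminfo() {
        local get=$1 node=$2
        local var val _ line
        local mem_f=/proc/meminfo mem

        # With a node argument, prefer that node's own meminfo file if present.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi

        # Slurp the file and strip the "Node <n> " prefix that per-node files carry.
        mapfile -t mem <"$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")

        # Walk the "Field: value [kB]" pairs; print the value of the requested field.
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<<"$line"
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done
        return 1
    }

    # On this host the trace shows: get_meminfo HugePages_Surp  ->  0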
00:04:00.429 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0 00:04:00.429 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:00.429 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:00.429 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:00.429 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:00.429 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:00.429 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:00.429 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:00.429 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:00.429 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:00.429 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:00.429 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.429 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.429 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110432356 kB' 'MemAvailable: 113649960 kB' 'Buffers: 2704 kB' 'Cached: 9622800 kB' 'SwapCached: 0 kB' 'Active: 6675232 kB' 'Inactive: 3510668 kB' 'Active(anon): 6275244 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563580 kB' 'Mapped: 171948 kB' 'Shmem: 5714848 kB' 'KReclaimable: 262584 kB' 'Slab: 926796 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664212 kB' 'KernelStack: 24864 kB' 'PageTables: 8420 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7817444 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229588 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB' 00:04:00.429 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:00.429 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue
[xtrace condensed: the continue / IFS=': ' / read -r var val _ cycle repeats for every remaining /proc/meminfo field until HugePages_Rsvd is reached]
00:04:00.430 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:00.430 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:00.430 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:00.430 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:04:00.430 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:00.430 nr_hugepages=1024 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:00.430 resv_hugepages=0 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:00.430 surplus_hugepages=0 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:00.430 anon_hugepages=0 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:00.430 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
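The hugepages.sh assertions above check that the 1024 pages requested for the test pool are all accounted for by the static pool, with nothing surplus or reserved. A standalone sketch of that sanity check, assuming a get_meminfo helper like the one sketched earlier (NR_HUGE and verify_hugepage_pool are illustrative names, not the script's own):

    # Sketch of the default_setup sanity check seen in the trace: the configured
    # pool must be fully present, with no surplus and no reserved pages.
    NR_HUGE=1024   # illustrative; the trace shows nr_hugepages=1024

    verify_hugepage_pool() {
        local anon surp resv total

        anon=$(get_meminfo AnonHugePages)     # kB of anonymous (transparent) huge pages
        surp=$(get_meminfo HugePages_Surp)    # pages allocated beyond the static pool
        resv=$(get_meminfo HugePages_Rsvd)    # pages reserved but not yet faulted in
        total=$(get_meminfo HugePages_Total)  # size of the static pool

        echo "nr_hugepages=$NR_HUGE"
        echo "resv_hugepages=$resv"
        echo "surplus_hugepages=$surp"
        echo "anon_hugepages=$anon"

        # Healthy default setup: the static pool alone accounts for everything.
        (( total == NR_HUGE + surp + resv )) || return 1
        (( total == NR_HUGE )) || return 1
    }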
00:04:00.430 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:00.430 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:00.430 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:00.430 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:00.430 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:00.430 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:00.430 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:00.430 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:00.430 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:00.430 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:00.430 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.430 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.430 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110436776 kB' 'MemAvailable: 113654380 kB' 'Buffers: 2704 kB' 'Cached: 9622840 kB' 'SwapCached: 0 kB' 'Active: 6674336 kB' 'Inactive: 3510668 kB' 'Active(anon): 6274348 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563024 kB' 'Mapped: 171948 kB' 'Shmem: 5714888 kB' 'KReclaimable: 262584 kB' 'Slab: 926796 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664212 kB' 'KernelStack: 24912 kB' 'PageTables: 8488 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7817596 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229588 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB' 00:04:00.430 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:00.430 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue
[xtrace condensed: the continue / IFS=': ' / read -r var val _ cycle repeats for every remaining /proc/meminfo field until HugePages_Total is reached]
00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node 
+([0-9]) }") 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 60331916 kB' 'MemUsed: 5330084 kB' 'SwapCached: 0 kB' 'Active: 1601580 kB' 'Inactive: 186104 kB' 'Active(anon): 1282268 kB' 'Inactive(anon): 0 kB' 'Active(file): 319312 kB' 'Inactive(file): 186104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1494728 kB' 'Mapped: 103552 kB' 'AnonPages: 296324 kB' 'Shmem: 989312 kB' 'KernelStack: 14024 kB' 'PageTables: 5264 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 137992 kB' 'Slab: 453136 kB' 'SReclaimable: 137992 kB' 'SUnreclaim: 315144 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
read -r var val _ 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.431 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 
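(The pass running through here is the same helper again, now invoked as get_meminfo HugePages_Surp 0, so it reads /sys/devices/system/node/node0/meminfo instead of the global file. Just before it, hugepages.sh's get_nodes walked the node directories and recorded 1024 pages on node0 and 0 on node1. A small sketch of that enumeration, assuming the per-node count is read from each node's hugepages-2048kB/nr_hugepages file, which matches the 1024/0 values in the trace; the real get_nodes may source it differently:

    shopt -s extglob                     # for the +([0-9]) glob on the node directories
    declare -a nodes_sys=()

    get_nodes() {
        local node
        for node in /sys/devices/system/node/node+([0-9]); do
            # index by the numeric suffix, e.g. nodes_sys[0]=1024, nodes_sys[1]=0
            nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
        done
        no_nodes=${#nodes_sys[@]}        # 2 on this machine
        (( no_nodes > 0 ))               # bail out if no NUMA nodes were found
    }
)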
00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # IFS=': ' 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:00.432 node0=1024 expecting 1024 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:00.432 00:04:00.432 real 0m6.047s 00:04:00.432 user 0m1.695s 00:04:00.432 sys 0m2.648s 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:00.432 17:16:11 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:04:00.432 ************************************ 00:04:00.432 END TEST default_setup 00:04:00.432 ************************************ 00:04:00.432 17:16:11 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:00.432 17:16:11 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:04:00.432 17:16:11 setup.sh.hugepages -- 
common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:00.432 17:16:11 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:00.432 17:16:11 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:00.692 ************************************ 00:04:00.692 START TEST per_node_1G_alloc 00:04:00.692 ************************************ 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1123 -- # per_node_1G_alloc 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:00.692 17:16:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:04.895 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 
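(In the START TEST per_node_1G_alloc block just above, get_test_nr_hugepages is called with size 1048576 kB and node ids 0 and 1: with the 2048 kB default hugepage size that is 512 pages per requested node, which is why the trace ends up with nodes_test[0]=nodes_test[1]=512 and exports NRHUGE=512 HUGENODE=0,1 before running scripts/setup.sh. A rough sketch of that arithmetic, with names mirroring the trace; the real helper in hugepages.sh handles more cases:

    default_hugepagesize_kb=2048                     # Hugepagesize reported in /proc/meminfo here
    declare -a nodes_test=()

    get_test_nr_hugepages() {                        # e.g. get_test_nr_hugepages 1048576 0 1
        local size_kb=$1; shift
        local user_nodes=("$@")                      # (0 1) in this test
        local nr_hugepages=$((size_kb / default_hugepagesize_kb))   # 1048576 / 2048 = 512
        local node
        for node in "${user_nodes[@]}"; do
            nodes_test[node]=$nr_hugepages           # 512 pages requested on each listed node
        done
        NRHUGE=$nr_hugepages                         # consumed by scripts/setup.sh
        HUGENODE=$(IFS=,; echo "${user_nodes[*]}")   # "0,1"
    }
)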
00:04:04.895 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:04.895 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:04.895 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:04.895 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:04.895 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:04.895 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:04.895 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:04.895 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:04.895 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:04.895 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:04.895 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:04.895 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:04.895 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:04.895 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:04.895 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:04.895 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:04.895 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:04:04.895 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:04:04.895 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:04.895 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 
110474256 kB' 'MemAvailable: 113691860 kB' 'Buffers: 2704 kB' 'Cached: 9622948 kB' 'SwapCached: 0 kB' 'Active: 6674696 kB' 'Inactive: 3510668 kB' 'Active(anon): 6274708 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 562356 kB' 'Mapped: 171332 kB' 'Shmem: 5714996 kB' 'KReclaimable: 262584 kB' 'Slab: 927228 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664644 kB' 'KernelStack: 25184 kB' 'PageTables: 9032 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7803720 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229908 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB' 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.896 17:16:15 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.896 17:16:15 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.896 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.897 17:16:15 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110475512 kB' 'MemAvailable: 113693116 kB' 'Buffers: 2704 kB' 'Cached: 9622948 kB' 'SwapCached: 0 kB' 'Active: 6674456 kB' 'Inactive: 3510668 kB' 'Active(anon): 6274468 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 562148 kB' 'Mapped: 171332 kB' 'Shmem: 5714996 kB' 'KReclaimable: 262584 kB' 'Slab: 927212 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664628 kB' 'KernelStack: 24912 kB' 'PageTables: 8140 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7802136 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229748 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 
2097152 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB' 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.897 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.897 17:16:15 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
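(This second scan belongs to verify_nr_hugepages: the trace has already taken anon=0 from AnonHugePages, and what is running through here is get_meminfo HugePages_Surp against the /proc/meminfo snapshot printed a few lines up; the default_setup test earlier closed the same check with (( 1024 == nr_hugepages + surp + resv )). A rough sketch of that verification, reusing the get_meminfo sketch from earlier and assuming nr_hugepages has already been set by the test; the real verify_nr_hugepages additionally cross-checks the per-node counts, as the "node0=1024 expecting 1024" line showed:

    verify_nr_hugepages() {
        local anon=0 surp resv total
        # AnonHugePages only matters when THP is not pinned to [never], per the check at hugepages.sh@96
        if [[ $(< /sys/kernel/mm/transparent_hugepage/enabled) != *"[never]"* ]]; then
            anon=$(get_meminfo AnonHugePages)        # 0 kB in this run
        fi
        surp=$(get_meminfo HugePages_Surp)           # the lookup in progress in the log here
        resv=$(get_meminfo HugePages_Rsvd)
        total=$(get_meminfo HugePages_Total)
        echo "anon=$anon surp=$surp resv=$resv total=$total"
        # the configured pool must be exactly what the kernel reports (1024 pages in both tests)
        (( total == nr_hugepages + surp + resv ))
    }
)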
00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.898 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.899 17:16:15 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.899 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110474836 kB' 'MemAvailable: 113692440 kB' 'Buffers: 2704 kB' 'Cached: 9622972 kB' 'SwapCached: 0 kB' 'Active: 6673500 kB' 'Inactive: 3510668 kB' 'Active(anon): 6273512 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 561696 kB' 'Mapped: 171248 kB' 'Shmem: 5715020 kB' 'KReclaimable: 262584 kB' 'Slab: 927412 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664828 kB' 'KernelStack: 25088 kB' 'PageTables: 9092 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7803760 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229828 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 
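The run just above is get_meminfo (setup/common.sh) scanning a meminfo-style file key by key until it reaches the requested field, then echoing that field's value -- here 0 for HugePages_Surp, which becomes surp=0. A minimal, self-contained bash sketch of that lookup, reconstructed from the xtrace rather than copied from the SPDK script (the name get_meminfo_sketch is illustrative):

  #!/usr/bin/env bash
  shopt -s extglob

  # Illustrative stand-in for the lookup traced above.
  # Usage: get_meminfo_sketch KEY [NUMA_NODE]
  get_meminfo_sketch() {
      local get=$1 node=${2:-}
      local var val _
      local mem_f=/proc/meminfo mem
      # A node-local file is only picked when a node number was passed in.
      if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      mapfile -t mem < "$mem_f"
      # Per-node files prefix every line with "Node 0 ", "Node 1 ", ...;
      # strip that prefix so the same key names work for both files.
      mem=("${mem[@]#Node +([0-9]) }")
      while IFS=': ' read -r var val _; do
          [[ $var == "$get" ]] || continue
          echo "$val"    # 0 for HugePages_Surp, 1024 for HugePages_Total here
          return 0
      done < <(printf '%s\n' "${mem[@]}")
      return 1
  }

  get_meminfo_sketch HugePages_Surp    # prints 0 on the box traced above
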
[xtrace condensed, 00:04:04.899-00:04:04.901: setup/common.sh@31-@32 -- every /proc/meminfo field from MemTotal through HugePages_Free compared against HugePages_Rsvd and skipped with continue]
00:04:04.901 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:04.901 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0
00:04:04.901 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:04:04.901 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0
00:04:04.901 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:04.901 nr_hugepages=1024
17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:04.901 resv_hugepages=0
17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:04.901 surplus_hugepages=0
17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:04.901 anon_hugepages=0
17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:04.901 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:04:04.901 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
[xtrace condensed, 00:04:04.901: setup/common.sh@17-@31 -- same get_meminfo preamble as above, this time with get=HugePages_Total, node unset, mem_f=/proc/meminfo]
00:04:04.901 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110474904 kB' 'MemAvailable: 113692508 kB' 'Buffers: 2704 kB' 'Cached: 9622996 kB' 'SwapCached: 0 kB' 'Active: 6673792 kB' 'Inactive: 3510668 kB' 'Active(anon): 6273804 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 561952 kB' 'Mapped: 171248 kB' 'Shmem: 5715044 kB' 'KReclaimable: 262584 kB' 'Slab: 927412 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664828 kB' 'KernelStack: 25168 kB' 'PageTables: 9112 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7803784 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229844 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB'
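With surp=0 and resv=0 read back and nr_hugepages reported as 1024, the (( ... )) assertions above are plain arithmetic over the snapshot just printed. A short recap under those values (variable names mirror the trace, not the script itself):

  #!/usr/bin/env bash
  # Values echoed by the test just above.
  nr_hugepages=1024   # HugePages_Total read back from /proc/meminfo
  surp=0              # HugePages_Surp
  resv=0              # HugePages_Rsvd

  # Every requested page must be accounted for: 1024 == 1024 + 0 + 0.
  (( 1024 == nr_hugepages + surp + resv )) && echo "hugepage accounting OK"

  # The same snapshot is self-consistent:
  # Hugetlb = HugePages_Total * Hugepagesize = 1024 * 2048 kB
  echo $(( 1024 * 2048 ))   # 2097152, matching 'Hugetlb: 2097152 kB'
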
[xtrace condensed, 00:04:04.901-00:04:04.903: setup/common.sh@31-@32 -- every /proc/meminfo field from MemTotal through Unaccepted compared against HugePages_Total and skipped with continue]
00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024
00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node
00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 61404636 kB' 'MemUsed: 4257364 kB' 'SwapCached: 0 kB' 'Active: 1598004 kB' 'Inactive: 186104 kB' 'Active(anon): 1278692 kB' 'Inactive(anon): 0 kB' 'Active(file): 319312 kB' 'Inactive(file): 186104 
kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1494816 kB' 'Mapped: 103460 kB' 'AnonPages: 292476 kB' 'Shmem: 989400 kB' 'KernelStack: 14024 kB' 'PageTables: 5132 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 137992 kB' 'Slab: 453460 kB' 'SReclaimable: 137992 kB' 'SUnreclaim: 315468 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.903 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # 
continue 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 
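The field scan traced above is the general pattern behind the get_meminfo calls in setup/common.sh: read a meminfo-style file with IFS=': ', compare each key against the requested field, and echo the matching value. A minimal stand-alone sketch of that pattern, assuming a simplified helper (the function name and the simplifications are illustrative, not the SPDK source):

get_meminfo_field() {
    # Scan a meminfo-style file for "<key>: <value> ..." and print the value.
    local get=$1 mem_f=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < "$mem_f"
    return 1
}
# Example: get_meminfo_field HugePages_Surp /sys/devices/system/node/node0/meminfo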
00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:04.904 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:04.905 17:16:15 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60682036 kB' 'MemFree: 49070996 kB' 'MemUsed: 11611040 kB' 'SwapCached: 0 kB' 'Active: 5075556 kB' 'Inactive: 3324564 kB' 'Active(anon): 4994880 kB' 'Inactive(anon): 0 kB' 'Active(file): 80676 kB' 'Inactive(file): 3324564 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8130900 kB' 'Mapped: 67788 kB' 'AnonPages: 269224 kB' 'Shmem: 4725660 kB' 'KernelStack: 11128 kB' 'PageTables: 3840 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124592 kB' 'Slab: 473944 kB' 'SReclaimable: 124592 kB' 'SUnreclaim: 349352 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # 
continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 
17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.905 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
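The node-scoped reads above first strip the "Node <id> " prefix that /sys/devices/system/node/nodeN/meminfo adds to every line, using the extglob substitution visible at common.sh@29 in the trace. A stand-alone reproduction of that step (node number hard-coded for illustration):

shopt -s extglob                      # required for the +([0-9]) pattern
mapfile -t mem < /sys/devices/system/node/node1/meminfo
mem=("${mem[@]#Node +([0-9]) }")      # "Node 1 MemFree: ..." -> "MemFree: ..."
printf '%s\n' "${mem[@]:0:3}"         # show the first few cleaned lines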
00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:04.906 node0=512 expecting 512 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:04.906 
17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:04.906 node1=512 expecting 512 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:04.906 00:04:04.906 real 0m4.145s 00:04:04.906 user 0m1.646s 00:04:04.906 sys 0m2.573s 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:04.906 17:16:15 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:04.906 ************************************ 00:04:04.906 END TEST per_node_1G_alloc 00:04:04.906 ************************************ 00:04:04.906 17:16:15 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:04.906 17:16:15 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:04:04.906 17:16:15 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:04.906 17:16:15 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:04.906 17:16:15 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:04.906 ************************************ 00:04:04.906 START TEST even_2G_alloc 00:04:04.906 ************************************ 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1123 -- # even_2G_alloc 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:04.906 17:16:15 
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:04.906 17:16:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:09.115 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:09.115 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:09.115 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:09.115 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:09.115 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:09.115 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:09.115 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:09.115 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:09.115 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:09.115 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:09.115 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:09.115 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:09.115 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:09.115 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:09.115 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:09.115 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:09.115 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:09.115 17:16:19 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110474288 kB' 'MemAvailable: 113691892 kB' 'Buffers: 2704 kB' 'Cached: 9623132 kB' 'SwapCached: 0 kB' 'Active: 6674328 kB' 'Inactive: 3510668 kB' 'Active(anon): 6274340 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 561984 kB' 'Mapped: 171340 kB' 'Shmem: 5715180 kB' 'KReclaimable: 262584 kB' 'Slab: 927332 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664748 kB' 'KernelStack: 24928 kB' 'PageTables: 8440 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7801576 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229668 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB' 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.115 17:16:19 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
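The even_2G_alloc test set up above requests NRHUGE=1024 with HUGE_EVEN_ALLOC=yes on a two-node machine, so the expected distribution is 512 2MiB pages per node, matching the "nodeN=512 expecting 512" checks traced for the previous test. A hedged sketch of that per-node bookkeeping (variable names are illustrative, not the hugepages.sh internals):

declare -A nodes_test
nr_hugepages=1024
no_nodes=2
per_node=$(( nr_hugepages / no_nodes ))   # 1024 / 2 = 512
for (( node = 0; node < no_nodes; node++ )); do
    nodes_test[$node]=$per_node
done
for node in "${!nodes_test[@]}"; do
    echo "node${node}=${nodes_test[$node]} expecting ${per_node}"
done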
00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.115 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.116 17:16:19 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110476356 kB' 'MemAvailable: 113693960 kB' 'Buffers: 2704 kB' 'Cached: 9623148 kB' 'SwapCached: 0 kB' 'Active: 6674224 kB' 'Inactive: 3510668 kB' 'Active(anon): 6274236 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 561944 kB' 'Mapped: 171316 kB' 'Shmem: 5715196 kB' 'KReclaimable: 262584 kB' 'Slab: 927300 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664716 kB' 'KernelStack: 24880 kB' 'PageTables: 8260 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7801752 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229636 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 
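The block above shows one complete get_meminfo lookup (the AnonHugePages pass ending in anon=0) and the start of the next one for HugePages_Surp: setup/common.sh@16 prints a /proc/meminfo snapshot, @31 splits each line on ': ', @32 skips non-matching fields with "continue", and @33 echoes the value once the requested field is reached. The helper below is a minimal re-creation of that behaviour for readers who want to reproduce the query outside the test suite; the name get_meminfo_sketch and its exact structure are illustrative assumptions, not the SPDK setup/common.sh source.

shopt -s extglob  # needed for the "Node +([0-9]) " prefix strip used below

# get_meminfo_sketch FIELD [NUMA_NODE] -- illustrative stand-in for the helper
# traced above; prints the value column of the requested meminfo field.
get_meminfo_sketch() {
    local get=$1 node=${2:-}
    local var val _
    local mem_f=/proc/meminfo
    # with a node argument, prefer the per-node file (its lines carry a "Node <n> " prefix)
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local -a mem
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # drop the per-node prefix, as in the trace
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then  # e.g. AnonHugePages
            echo "$val"                # value only, e.g. 0 (a trailing "kB" lands in "_")
            return 0
        fi
    done < <(printf '%s\n' "${mem[@]}")
    return 1                           # field not present
}

Run against the snapshot just printed, get_meminfo_sketch HugePages_Free would print 1024 and get_meminfo_sketch AnonHugePages would print 0, matching the anon=0 the test just recorded.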
[setup/common.sh@31-32: the same walk repeats for every /proc/meminfo field from MemTotal through HugePages_Rsvd; none match HugePages_Surp, so each iteration hits "continue", resets IFS=': ' and reads the next field]
00:04:09.118 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:09.118 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:04:09.118 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:04:09.118 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0
00:04:09.118 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:09.118 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:09.118 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:04:09.118 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:04:09.118 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:09.118 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:09.118 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:09.118 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:09.118 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:09.118 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:09.118 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:09.118 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:09.118 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110484484 kB' 'MemAvailable: 113702088 kB' 'Buffers: 2704 kB' 'Cached: 9623164 kB' 'SwapCached: 0 kB' 'Active: 6674244 kB' 'Inactive: 3510668 kB' 'Active(anon): 6274256 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563152 kB' 'Mapped: 171248 kB' 'Shmem: 5715212 kB' 'KReclaimable: 262584 kB' 'Slab: 927288 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664704 kB' 'KernelStack: 24864 kB' 'PageTables: 8204 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7801984 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229572 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB'
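The "local node=" and "[[ -e /sys/devices/system/node/node/meminfo ]]" lines in the trace show the same lookup can also be pointed at a per-NUMA-node meminfo file; with no node given the path degenerates to .../node/meminfo, the check fails, and /proc/meminfo is used instead. Since even_2G_alloc presumably cares about how the 1024 x 2048 kB pool is spread across nodes, a per-node view is the interesting one. A small, hypothetical usage sketch, reusing the illustrative get_meminfo_sketch defined earlier (not part of the test itself):

# hypothetical per-node check: how many 2048 kB pages each node holds right now
for node_dir in /sys/devices/system/node/node[0-9]*; do
    node=${node_dir##*node}                              # "node0" -> "0"
    # per-node files carry a "Node <n> " prefix, which the helper strips
    total=$(get_meminfo_sketch HugePages_Total "$node")
    free=$(get_meminfo_sketch HugePages_Free "$node")
    echo "node${node}: HugePages_Total=${total} HugePages_Free=${free}"
done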
[setup/common.sh@31-32: the walk repeats for every /proc/meminfo field from MemTotal through HugePages_Free; none match HugePages_Rsvd, so each iteration hits "continue", resets IFS=': ' and reads the next field]
00:04:09.119 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:09.119 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:04:09.119 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:04:09.119 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0
00:04:09.119 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:09.119 nr_hugepages=1024
17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:09.119 resv_hugepages=0
17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:09.119 surplus_hugepages=0
17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:09.119 anon_hugepages=0
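At this point the test has collected the three adjustment counters (anon, surp, resv), reported them next to nr_hugepages=1024, and is about to assert that the configured pool adds up. A minimal re-creation of that bookkeeping, again using the illustrative helper; the variable names are hypothetical and the exact expression in setup/hugepages.sh may differ from this sketch:

expected=1024                                   # 1024 x 2048 kB pages = 2 GB, per this run
anon=$(get_meminfo_sketch AnonHugePages)        # 0 in this run
surp=$(get_meminfo_sketch HugePages_Surp)       # 0
resv=$(get_meminfo_sketch HugePages_Rsvd)       # 0
total=$(get_meminfo_sketch HugePages_Total)     # 1024
echo "nr_hugepages=$total resv_hugepages=$resv surplus_hugepages=$surp anon_hugepages=$anon"
# mirrors the checks traced below at setup/hugepages.sh@107-109: the reported pool,
# plus any surplus and reserved pages, must equal the requested size
(( expected == total + surp + resv )) && (( expected == total )) \
    || echo "hugepage pool does not add up" >&2

In this run all three adjustment counters are zero, so both arithmetic tests that follow pass trivially.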
17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:09.119 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:04:09.119 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:09.119 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:09.119 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:04:09.119 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:04:09.119 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:09.119 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:09.119 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:09.119 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:09.119 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:09.119 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:09.119 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:09.119 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:09.119 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110485376 kB' 'MemAvailable: 113702980 kB' 'Buffers: 2704 kB' 'Cached: 9623184 kB' 'SwapCached: 0 kB' 'Active: 6674588 kB' 'Inactive: 3510668 kB' 'Active(anon): 6274600 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563072 kB' 'Mapped: 171240 kB' 'Shmem: 5715232 kB' 'KReclaimable: 262584 kB' 'Slab: 927288 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664704 kB' 'KernelStack: 24832 kB' 'PageTables: 8104 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7802004 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229604 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB'
[setup/common.sh@31-32: the walk starts again for HugePages_Total, stepping through MemTotal, MemFree, MemAvailable, Buffers, Cached, SwapCached, Active, Inactive, Active/Inactive(anon), Active/Inactive(file), Unevictable, Mlocked, SwapTotal, SwapFree, Zswap, Zswapped and Dirty; none match, so each iteration hits "continue", resets IFS=': ' and reads the next field]
00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read
-r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:09.120 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:09.121 17:16:20 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 61409140 kB' 'MemUsed: 4252860 kB' 'SwapCached: 0 kB' 'Active: 1599140 kB' 'Inactive: 186104 kB' 'Active(anon): 1279828 kB' 'Inactive(anon): 0 kB' 'Active(file): 319312 kB' 'Inactive(file): 186104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1494960 kB' 'Mapped: 103460 kB' 'AnonPages: 293804 kB' 'Shmem: 989544 kB' 'KernelStack: 13992 kB' 'PageTables: 5036 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 137992 kB' 'Slab: 453424 kB' 'SReclaimable: 137992 kB' 'SUnreclaim: 315432 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 
-- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 
17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.121 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:09.122 17:16:20 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60682036 kB' 'MemFree: 49076440 kB' 'MemUsed: 11605596 kB' 'SwapCached: 0 kB' 'Active: 5076356 kB' 'Inactive: 3324564 kB' 'Active(anon): 4995680 kB' 'Inactive(anon): 0 kB' 'Active(file): 80676 kB' 'Inactive(file): 3324564 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8130952 kB' 'Mapped: 67780 kB' 'AnonPages: 270272 kB' 'Shmem: 4725712 kB' 'KernelStack: 10872 kB' 'PageTables: 3168 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124592 kB' 'Slab: 473864 kB' 'SReclaimable: 124592 kB' 'SUnreclaim: 349272 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 
-- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 
17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.122 17:16:20 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.122 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:09.123 node0=512 expecting 512 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:09.123 17:16:20 
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:09.123 node1=512 expecting 512 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:09.123 00:04:09.123 real 0m4.168s 00:04:09.123 user 0m1.605s 00:04:09.123 sys 0m2.639s 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:09.123 17:16:20 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:09.123 ************************************ 00:04:09.123 END TEST even_2G_alloc 00:04:09.123 ************************************ 00:04:09.123 17:16:20 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:09.123 17:16:20 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:04:09.123 17:16:20 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:09.123 17:16:20 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:09.123 17:16:20 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:09.123 ************************************ 00:04:09.123 START TEST odd_alloc 00:04:09.123 ************************************ 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1123 -- # odd_alloc 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- 
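
The even_2G_alloc trace that ends above is dominated by the test's get_meminfo helper: it reads /proc/meminfo (or /sys/devices/system/node/nodeN/meminfo when a node index is passed), walks the fields one read at a time with IFS=': ', and prints the value once the requested key matches, which is why every non-matching field appears in the xtrace as a [[ ... ]] / continue pair. The following is a minimal standalone sketch of that pattern, assuming the helper behaves as the trace suggests; get_meminfo_sketch is an illustrative name, not the SPDK function itself.

#!/usr/bin/env bash
# Sketch only: mirrors the meminfo walk seen in the xtrace above.
get_meminfo_sketch() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    # Per-node stats live under /sys/devices/system/node/nodeN/meminfo.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local line var val _
    while IFS= read -r line; do
        # Per-node files prefix each field with "Node N "; drop that prefix.
        [[ $line =~ ^Node\ [0-9]+\ (.*)$ ]] && line=${BASH_REMATCH[1]}
        IFS=': ' read -r var val _ <<< "$line"
        if [[ $var == "$get" ]]; then
            echo "$val"   # e.g. 1024 for HugePages_Total, 512 per node
            return 0
        fi
    done < "$mem_f"
    return 1
}

# Usage matching the values the test checks above (assuming the same host state):
#   get_meminfo_sketch HugePages_Total      -> 1024
#   get_meminfo_sketch HugePages_Surp 0     -> 0
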
setup/hugepages.sh@83 -- # : 0 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:09.123 17:16:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:13.329 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:13.329 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:13.329 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:13.329 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:13.329 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:13.329 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:13.329 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:13.329 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:13.329 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:13.329 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:13.329 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:13.329 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:13.329 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:13.329 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:13.329 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:13.329 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:13.329 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@25 -- # [[ -n '' ]] 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110495692 kB' 'MemAvailable: 113713296 kB' 'Buffers: 2704 kB' 'Cached: 9623316 kB' 'SwapCached: 0 kB' 'Active: 6674652 kB' 'Inactive: 3510668 kB' 'Active(anon): 6274664 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 562012 kB' 'Mapped: 171408 kB' 'Shmem: 5715364 kB' 'KReclaimable: 262584 kB' 'Slab: 927328 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664744 kB' 'KernelStack: 24912 kB' 'PageTables: 8320 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70511020 kB' 'Committed_AS: 7802764 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229652 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB' 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.329 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.330 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
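The trace above is setup/common.sh's get_meminfo scanning every field of /proc/meminfo until it reaches the requested key (AnonHugePages here, which resolves to 0 in the "echo 0" record that follows). A minimal bash sketch of that lookup, reconstructed only from the commands visible in this trace — the real SPDK helper may differ, and the not-found fallback below is an assumption:

#!/usr/bin/env bash
# Sketch of the meminfo lookup traced above (modelled on setup/common.sh get_meminfo).
# Reconstructed from the xtrace only; details of the real helper may differ.
shopt -s extglob

get_meminfo() {
    local get=$1 node=${2:-} var val _
    local mem_f=/proc/meminfo mem

    # Per-node queries read that node's own meminfo file when one exists.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi

    mapfile -t mem < "$mem_f"
    # Per-node files prefix every line with "Node N "; strip it so the keys match.
    mem=("${mem[@]#Node +([0-9]) }")

    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] || continue   # skip fields until the requested key
        echo "$val"
        return 0
    done
    return 1   # key not found (fallback behaviour here is an assumption)
}

# Example: system-wide hugepage count, then free memory on NUMA node 0.
get_meminfo HugePages_Total
get_meminfo MemFree 0
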
00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110496520 kB' 'MemAvailable: 113714124 kB' 'Buffers: 2704 kB' 'Cached: 9623320 kB' 'SwapCached: 0 kB' 'Active: 6675680 kB' 'Inactive: 3510668 kB' 'Active(anon): 6275692 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563032 kB' 'Mapped: 171836 kB' 'Shmem: 5715368 kB' 'KReclaimable: 262584 kB' 'Slab: 927312 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664728 kB' 'KernelStack: 24864 kB' 'PageTables: 8160 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70511020 kB' 'Committed_AS: 7804928 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229620 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB' 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.331 17:16:24 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.331 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- 
# continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:13.332 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110493108 kB' 'MemAvailable: 113710712 kB' 'Buffers: 2704 kB' 'Cached: 9623336 kB' 'SwapCached: 0 kB' 'Active: 6678828 kB' 'Inactive: 3510668 kB' 
'Active(anon): 6278840 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 567200 kB' 'Mapped: 171760 kB' 'Shmem: 5715384 kB' 'KReclaimable: 262584 kB' 'Slab: 927296 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664712 kB' 'KernelStack: 24880 kB' 'PageTables: 8196 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70511020 kB' 'Committed_AS: 7808920 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229608 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB' 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.333 17:16:24 setup.sh.hugepages.odd_alloc -- 
[setup/common.sh@31-@32: the remaining /proc/meminfo keys, SwapFree through HugePages_Free, are each read and skipped via 'continue' until the requested key is reached]
00:04:13.334 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:13.334 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:04:13.334 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:04:13.334 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0
00:04:13.334 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025
00:04:13.334 nr_hugepages=1025
00:04:13.334 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:13.334 resv_hugepages=0
00:04:13.334 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:13.334 surplus_hugepages=0
00:04:13.334 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:13.334 anon_hugepages=0
00:04:13.334 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv ))
00:04:13.334 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages ))
00:04:13.334 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:13.334 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:13.334 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:04:13.334 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:04:13.334 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:13.334 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:13.334 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:13.334 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:13.334 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:13.334 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:13.334 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:13.334 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:13.335 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110499764 kB' 'MemAvailable: 113717368 kB' 'Buffers: 2704 kB' 'Cached: 9623372 kB' 'SwapCached: 0 kB' 'Active: 6673412 kB' 'Inactive: 3510668 kB' 'Active(anon): 6273424 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 561272 kB' 'Mapped: 171604 kB' 'Shmem: 5715420 kB' 'KReclaimable: 262584 kB' 'Slab: 927296 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664712 kB' 'KernelStack: 24848 kB' 'PageTables: 8124 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70511020 kB' 'Committed_AS: 7802824 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229604 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB'
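The snapshot above is what get_meminfo walks one 'key: value' line at a time with IFS=': ' and read, echoing the value once the requested key matches. A minimal standalone sketch of that pattern follows; the function name and the plain-glob prefix strip are this sketch's own simplifications, not the actual setup/common.sh source.

# sketch: return the value for one meminfo key, system-wide or for one NUMA node
get_meminfo_sketch() {
	local get=$1 node=$2
	local var val _ mem mem_f=/proc/meminfo
	# per-node queries read the node's own meminfo file, as at setup/common.sh@23-@24
	if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
		mem_f=/sys/devices/system/node/node$node/meminfo
	fi
	mapfile -t mem < "$mem_f"
	# per-node files prefix every line with "Node <n> "; drop it
	# (plain glob here, the real helper uses the extglob pattern +([0-9]))
	mem=("${mem[@]#Node * }")
	while IFS=': ' read -r var val _; do
		# the trailing "kB" unit, if any, lands in _ and is discarded
		[[ $var == "$get" ]] && { echo "$val"; return 0; }
	done < <(printf '%s\n' "${mem[@]}")
	return 1
}

Called as get_meminfo_sketch HugePages_Total or get_meminfo_sketch HugePages_Surp 0, it would mirror the two kinds of reads made in this stretch of the log.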
[setup/common.sh@31-@32: every /proc/meminfo key from MemTotal through Unaccepted is read and skipped via 'continue' until the requested HugePages_Total entry is reached]
00:04:13.336 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:13.336 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025
00:04:13.336 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:04:13.336 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv ))
00:04:13.336 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:04:13.336 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node
00:04:13.336 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:13.336 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:13.336 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:13.336 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513
00:04:13.336 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:13.336 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
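get_nodes has just recorded 512 pages on node 0 and 513 on node 1, and the hugepages.sh@107/@110 checks amount to verifying that this uneven split adds up to the requested odd total. The same arithmetic as a hypothetical standalone sketch, with values copied from the trace:

# sketch of the odd_alloc bookkeeping: an odd page count (1025) can only be
# satisfied by an uneven per-node split such as 512 + 513
nr_hugepages=1025 surp=0 resv=0
nodes_sys=([0]=512 [1]=513)
total=$(( nodes_sys[0] + nodes_sys[1] ))
(( total == nr_hugepages + surp + resv )) &&
	echo "odd_alloc split OK: node0=${nodes_sys[0]} node1=${nodes_sys[1]} total=$total"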
00:04:13.336 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:13.336 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:13.336 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:13.336 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:13.336 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0
00:04:13.336 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:04:13.336 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:13.336 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:13.336 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:13.336 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:13.336 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:13.336 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:13.336 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:13.336 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 61410768 kB' 'MemUsed: 4251232 kB' 'SwapCached: 0 kB' 'Active: 1599512 kB' 'Inactive: 186104 kB' 'Active(anon): 1280200 kB' 'Inactive(anon): 0 kB' 'Active(file): 319312 kB' 'Inactive(file): 186104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1495092 kB' 'Mapped: 103460 kB' 'AnonPages: 293736 kB' 'Shmem: 989676 kB' 'KernelStack: 13976 kB' 'PageTables: 4984 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 137992 kB' 'Slab: 453480 kB' 'SReclaimable: 137992 kB' 'SUnreclaim: 315488 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
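The mem=(...) expansion a few lines up strips the 'Node <n> ' prefix that per-node meminfo files carry, so the key names line up with /proc/meminfo. A small standalone demonstration of that extglob pattern, with made-up sample lines:

# demonstrate the "Node <n> " prefix strip used at setup/common.sh@29
shopt -s extglob                        # +([0-9]) needs extended globbing
mem=('Node 0 MemTotal: 65662000 kB' 'Node 0 HugePages_Surp: 0')
mem=("${mem[@]#Node +([0-9]) }")        # leaves 'MemTotal: 65662000 kB' 'HugePages_Surp: 0'
printf '%s\n' "${mem[@]}"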
[setup/common.sh@31-@32: every node 0 meminfo key from MemTotal through HugePages_Free is read and skipped via 'continue' until the requested HugePages_Surp entry is reached]
00:04:13.337 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:13.337 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:04:13.337 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:04:13.337 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
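Node 0 therefore contributes 512 pages with no surplus, and the loop moves on to node 1. As an aside, the per-node counters this walk reconstructs from the meminfo files are also exposed directly in sysfs; a hedged sketch, assuming the default 2048 kB hugepage size reported in the snapshots above:

# read each node's 2 MiB hugepage count and surplus straight from sysfs
for node in /sys/devices/system/node/node[0-9]*; do
	echo "${node##*/}:" \
	     "nr=$(cat "$node"/hugepages/hugepages-2048kB/nr_hugepages)" \
	     "surplus=$(cat "$node"/hugepages/hugepages-2048kB/surplus_hugepages)"
done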
00:04:13.337 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:13.337 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:13.337 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:13.337 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:13.337 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1
00:04:13.337 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:04:13.337 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:13.337 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:13.337 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:13.337 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:13.337 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:13.337 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:13.337 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:13.337 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:13.338 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60682036 kB' 'MemFree: 49089840 kB' 'MemUsed: 11592196 kB' 'SwapCached: 0 kB' 'Active: 5074288 kB' 'Inactive: 3324564 kB' 'Active(anon): 4993612 kB' 'Inactive(anon): 0 kB' 'Active(file): 80676 kB' 'Inactive(file): 3324564 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8130988 kB' 'Mapped: 67796 kB' 'AnonPages: 267888 kB' 'Shmem: 4725748 kB' 'KernelStack: 10904 kB' 'PageTables: 3212 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124592 kB' 'Slab: 473800 kB' 'SReclaimable: 124592 kB' 'SUnreclaim: 349208 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
[setup/common.sh@31-@32: the node 1 meminfo keys, MemTotal through Unaccepted, are read and skipped via 'continue' in the same way]
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.339 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.339 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.339 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.339 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.339 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.339 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.339 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.339 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.339 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.339 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:13.339 17:16:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:13.339 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:13.339 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:13.339 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:13.339 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:13.339 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:04:13.339 node0=512 expecting 513 00:04:13.339 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:13.339 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:13.339 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:13.339 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:04:13.339 node1=513 expecting 512 00:04:13.339 17:16:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:04:13.339 00:04:13.339 real 0m4.116s 00:04:13.339 user 0m1.618s 00:04:13.339 sys 0m2.572s 00:04:13.339 17:16:24 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:13.339 17:16:24 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:13.339 ************************************ 00:04:13.339 END TEST odd_alloc 00:04:13.339 ************************************ 00:04:13.339 17:16:24 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:13.339 17:16:24 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:04:13.339 17:16:24 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:13.339 17:16:24 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:13.339 17:16:24 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:13.339 ************************************ 00:04:13.339 START TEST custom_alloc 00:04:13.339 ************************************ 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1123 -- # custom_alloc 00:04:13.339 17:16:24 
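
The odd_alloc case above closes by echoing the observed per-node split ("node0=512 expecting 513", "node1=513 expecting 512") and then checking that the sorted counts match "512 513". As a rough sketch only, and not the harness's own code (the traced script derives these counts from meminfo parsing), the same per-node 2048 kB counts can also be read from the standard sysfs interface and compared against that expectation:

  # Sketch: collect per-node 2048 kB hugepage counts from sysfs and compare
  # the sorted set with the "512 513" expectation printed in the trace above.
  counts=()
  for node_dir in /sys/devices/system/node/node[0-9]*; do
    counts+=( "$(cat "${node_dir}/hugepages/hugepages-2048kB/nr_hugepages")" )
  done
  sorted=$(printf '%s\n' "${counts[@]}" | sort -n | xargs)
  [[ ${sorted} == "512 513" ]] && echo "per-node split OK" || echo "per-node split mismatch"
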
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # 
nr_hugepages=1024 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@78 -- # return 0 00:04:13.339 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:04:13.340 17:16:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:04:13.340 17:16:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:13.340 17:16:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:17.547 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:17.547 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:17.547 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:17.547 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:17.547 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:17.547 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:17.547 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:17.547 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:17.547 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:17.547 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:17.547 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:17.547 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:17.547 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:17.547 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:17.547 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:17.547 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:17.547 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- 
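
Before the verification pass starts, the trace above converts the two requested sizes into page counts at the default 2048 kB hugepage size (1048576 kB becomes 512 pages on node 0, 2097152 kB becomes 1024 pages on node 1), accumulates a total of 1536 pages, and joins the per-node assignments into HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' before calling setup output. The following is a condensed sketch of that bookkeeping using the values from the log, not a verbatim copy of setup/hugepages.sh:

  # Condensed sketch of the traced bookkeeping; sizes and counts mirror the log.
  default_hugepages=2048                                  # kB, Hugepagesize from /proc/meminfo
  declare -a nodes_hp
  nodes_hp[0]=$(( 1048576 / default_hugepages ))          # 512 pages for node 0
  nodes_hp[1]=$(( 2097152 / default_hugepages ))          # 1024 pages for node 1
  _nr_hugepages=0
  HUGENODE=()
  for node in "${!nodes_hp[@]}"; do
    HUGENODE+=( "nodes_hp[$node]=${nodes_hp[node]}" )
    (( _nr_hugepages += nodes_hp[node] ))
  done
  ( IFS=,; echo "HUGENODE='${HUGENODE[*]}' total=${_nr_hugepages}" )
  # -> HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' total=1536
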
setup/common.sh@28 -- # mapfile -t mem 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 109447800 kB' 'MemAvailable: 112665404 kB' 'Buffers: 2704 kB' 'Cached: 9623496 kB' 'SwapCached: 0 kB' 'Active: 6674720 kB' 'Inactive: 3510668 kB' 'Active(anon): 6274732 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 562456 kB' 'Mapped: 171336 kB' 'Shmem: 5715544 kB' 'KReclaimable: 262584 kB' 'Slab: 926896 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664312 kB' 'KernelStack: 24816 kB' 'PageTables: 7988 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 69987756 kB' 'Committed_AS: 7804960 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229620 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB' 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.547 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@32 -- # continue 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.548 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 109448128 kB' 'MemAvailable: 112665732 kB' 'Buffers: 2704 kB' 'Cached: 9623496 kB' 'SwapCached: 0 kB' 'Active: 6674904 kB' 'Inactive: 3510668 kB' 'Active(anon): 6274916 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 562620 kB' 'Mapped: 171304 kB' 'Shmem: 5715544 kB' 'KReclaimable: 262584 kB' 'Slab: 926904 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664320 kB' 'KernelStack: 24848 kB' 'PageTables: 8092 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 69987756 kB' 'Committed_AS: 7804976 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229588 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # 
continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.549 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.550 17:16:28 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:17.550 17:16:28 
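
The long runs of "continue" above are the get_meminfo helper scanning every /proc/meminfo key until it reaches the one requested (AnonHugePages, then HugePages_Surp, then HugePages_Rsvd here); the backslash-escaped right-hand side in the [[ ... == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] lines is only xtrace quoting the literal comparison pattern. Below is a condensed, system-wide-only re-statement of that lookup pattern; the traced helper additionally accepts a per-node meminfo file, which this sketch omits:

  # Sketch: return one value from /proc/meminfo, defaulting to 0 if the key
  # is absent -- the same scan-and-continue pattern seen in the trace above.
  get_meminfo() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
      [[ ${var} == "${get}" ]] && { echo "${val}"; return 0; }
    done < /proc/meminfo
    echo 0
  }

  get_meminfo HugePages_Total   # 1536 on the machine traced above
  get_meminfo HugePages_Surp    # 0
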
setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:17.550 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 109448380 kB' 'MemAvailable: 112665984 kB' 'Buffers: 2704 kB' 'Cached: 9623516 kB' 'SwapCached: 0 kB' 'Active: 6674776 kB' 'Inactive: 3510668 kB' 'Active(anon): 6274788 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 562440 kB' 'Mapped: 171304 kB' 'Shmem: 5715564 kB' 'KReclaimable: 262584 kB' 'Slab: 926904 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664320 kB' 'KernelStack: 24832 kB' 'PageTables: 8040 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 69987756 kB' 'Committed_AS: 7805000 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229572 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB' 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.551 17:16:28 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.551 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.552 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:04:17.553 nr_hugepages=1536 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:17.553 resv_hugepages=0 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:17.553 surplus_hugepages=0 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:17.553 anon_hugepages=0 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 
00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 109448380 kB' 'MemAvailable: 112665984 kB' 'Buffers: 2704 kB' 'Cached: 9623532 kB' 'SwapCached: 0 kB' 'Active: 6674896 kB' 'Inactive: 3510668 kB' 'Active(anon): 6274908 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 562540 kB' 'Mapped: 171304 kB' 'Shmem: 5715580 kB' 'KReclaimable: 262584 kB' 'Slab: 926904 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664320 kB' 'KernelStack: 24864 kB' 'PageTables: 8144 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 69987756 kB' 'Committed_AS: 7805020 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229572 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB' 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.553 17:16:28 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked 
== \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.553 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 
-- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.554 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 61399216 kB' 'MemUsed: 4262784 kB' 'SwapCached: 0 kB' 'Active: 1600236 kB' 'Inactive: 186104 kB' 'Active(anon): 1280924 kB' 'Inactive(anon): 0 kB' 'Active(file): 319312 kB' 'Inactive(file): 186104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1495260 kB' 'Mapped: 103460 kB' 'AnonPages: 294260 kB' 'Shmem: 989844 kB' 'KernelStack: 13976 kB' 'PageTables: 4980 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 137992 kB' 'Slab: 453184 kB' 'SReclaimable: 137992 kB' 'SUnreclaim: 315192 kB' 'AnonHugePages: 0 kB' 
'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.555 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.555 
17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:17.556 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:04:17.556 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:04:17.556 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:17.556 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:17.556 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:17.556 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:17.556 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:17.556 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1
00:04:17.556 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:04:17.556 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:17.556 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:17.556 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:17.556 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:17.556 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:17.556 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:17.556 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:17.556 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:17.556 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60682036 kB' 'MemFree: 48048312 kB' 'MemUsed: 12633724 kB' 'SwapCached: 0 kB' 'Active: 5076476 kB' 'Inactive: 3324564 kB' 'Active(anon): 4995800 kB' 'Inactive(anon): 0 kB' 'Active(file): 80676 kB' 'Inactive(file): 3324564 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8131020 kB' 'Mapped: 68348 kB' 'AnonPages: 270048 kB' 'Shmem: 4725780 kB' 'KernelStack: 10888 kB' 'PageTables: 3164 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124592 kB' 'Slab: 473720 kB' 'SReclaimable: 124592 kB' 'SUnreclaim: 349128 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:17.557 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:17.557 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:04:17.557 17:16:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:04:17.557 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:17.557 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:17.557 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:17.558 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:17.558 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:17.558 node0=512 expecting 512
00:04:17.558 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:17.558 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:17.558 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:17.558 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
00:04:17.558 node1=1024 expecting 1024
00:04:17.558 17:16:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:04:17.558
00:04:17.558 real 0m4.148s
00:04:17.558 user 0m1.664s
00:04:17.558 sys 0m2.565s
00:04:17.558 17:16:28 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable
00:04:17.558 17:16:28 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x
00:04:17.558 ************************************
00:04:17.558 END TEST custom_alloc
00:04:17.558 ************************************
00:04:17.558 17:16:28 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0
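Everything up to the END TEST banner above is the tail of the custom_alloc verification: get_meminfo is called once per NUMA node for HugePages_Surp, the surplus is folded into each node's total, and the test passes because the joined per-node totals match the expected split reported as 'node0=512 expecting 512' and 'node1=1024 expecting 1024' (the final check is the string test [[ 512,1024 == 512,1024 ]]). Below is a minimal sketch of that accounting; read_node_hugepages and the hard-coded expectation are illustrative assumptions, not the SPDK setup/common.sh or setup/hugepages.sh helpers.

```bash
#!/usr/bin/env bash
# Sketch only: mimics the per-node hugepage accounting walked through in the trace.
read_node_hugepages() {
    # Per-node meminfo lines look like "Node 1 HugePages_Surp:     0".
    local field=$1 node=$2
    awk -v f="$field" '$3 == f":" {print $4}' "/sys/devices/system/node/node${node}/meminfo"
}

nodes_test=([0]=512 [1]=1024)      # pages the test requested on each node (hypothetical here)
expected="512,1024"

for node in "${!nodes_test[@]}"; do
    surp=$(read_node_hugepages HugePages_Surp "$node")   # 0 on both nodes in the trace above
    (( nodes_test[node] += surp ))                        # surplus pages count toward the node total
done

got=$(IFS=,; echo "${nodes_test[*]}")                     # joins to e.g. "512,1024"
if [[ $got == "$expected" ]]; then
    echo "custom_alloc layout OK: $got"
else
    echo "custom_alloc layout mismatch: $got (expected $expected)"
fi
```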
00:04:17.558 17:16:28 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:04:17.558 17:16:28 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:04:17.558 17:16:28 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:17.558 17:16:28 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:04:17.558 ************************************
00:04:17.558 START TEST no_shrink_alloc
00:04:17.558 ************************************
00:04:17.558 17:16:28 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1123 -- # no_shrink_alloc
00:04:17.558 17:16:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:04:17.558 17:16:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152
00:04:17.558 17:16:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:04:17.558 17:16:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift
00:04:17.558 17:16:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0')
00:04:17.558 17:16:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids
00:04:17.558 17:16:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:17.558 17:16:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:17.558 17:16:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:04:17.558 17:16:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:04:17.558 17:16:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes
00:04:17.558 17:16:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:17.558 17:16:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:17.558 17:16:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:17.558 17:16:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:17.558 17:16:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:04:17.558 17:16:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:17.558 17:16:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:04:17.558 17:16:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0
00:04:17.558 17:16:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output
00:04:17.558 17:16:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:04:17.558 17:16:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
0000:80:01.6 (8086 0b00): Already using the vfio-pci driver
0000:80:01.7 (8086 0b00): Already using the vfio-pci driver
0000:80:01.4 (8086 0b00): Already using the vfio-pci driver
0000:80:01.5 (8086 0b00): Already using the vfio-pci driver
0000:80:01.2 (8086 0b00): Already using the vfio-pci driver
0000:80:01.3 (8086 0b00): Already using the vfio-pci driver
0000:80:01.0 (8086 0b00): Already using the vfio-pci driver
0000:80:01.1 (8086 0b00): Already using the vfio-pci driver
0000:00:01.6 (8086 0b00): Already using the vfio-pci driver
0000:65:00.0 (8086 0a54): Already using the vfio-pci driver
0000:00:01.7 (8086 0b00): Already using the vfio-pci driver
0000:00:01.4 (8086 0b00): Already using the vfio-pci driver
0000:00:01.5 (8086 0b00): Already using the vfio-pci driver
0000:00:01.2 (8086 0b00): Already using the vfio-pci driver
0000:00:01.3 (8086 0b00): Already using the vfio-pci driver
0000:00:01.0 (8086 0b00): Already using the vfio-pci driver
0000:00:01.1 (8086 0b00): Already using the vfio-pci driver
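Before scripts/setup.sh rebinds the devices above, the trace shows get_test_nr_hugepages turning the requested size of 2097152 kB into nr_hugepages=1024 and assigning all of them to node 0 via nodes_test. The sketch below reproduces that arithmetic under the assumption that the page count is simply the requested size divided by the 2048 kB default hugepage size, which matches the numbers in the trace; the real setup/hugepages.sh may derive it differently.

```bash
#!/usr/bin/env bash
# Sketch only: the size-to-pages arithmetic seen in the no_shrink_alloc trace.
# Variable names mirror the trace for readability; this is not the SPDK script itself.
size_kb=2097152                                  # requested hugepage pool size in kB
default_hugepages=$(awk '$1 == "Hugepagesize:" {print $2}' /proc/meminfo)   # 2048 on this host
node_ids=(0)                                     # only node 0 was requested in the trace

(( size_kb >= default_hugepages )) || { echo "size smaller than one hugepage" >&2; exit 1; }
nr_hugepages=$(( size_kb / default_hugepages ))  # 2097152 / 2048 = 1024 pages

declare -a nodes_test=()
for node in "${node_ids[@]}"; do
    nodes_test[node]=$nr_hugepages               # all 1024 pages pinned to node 0
done
echo "would request ${nodes_test[0]} hugepages on node 0"
```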
00:04:21.761 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages
00:04:21.761 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node
00:04:21.761 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:04:21.761 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:04:21.761 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp
00:04:21.761 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv
00:04:21.761 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon
00:04:21.762 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:21.762 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:21.762 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:21.762 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:21.762 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:21.762 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:21.762 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:21.762 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:21.762 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:21.762 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:21.762 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:21.762 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:21.762 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:21.762 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110484348 kB' 'MemAvailable: 113701952 kB' 'Buffers: 2704 kB' 'Cached: 9623660 kB' 'SwapCached: 0 kB' 'Active: 6676796 kB' 'Inactive: 3510668 kB' 'Active(anon): 6276808 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563936 kB' 'Mapped: 171420 kB' 'Shmem: 5715708 kB' 'KReclaimable: 262584 kB' 'Slab: 927096 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664512 kB' 'KernelStack: 25040 kB' 'PageTables: 8692 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7807548 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229908 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB'
00:04:21.763 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:21.763 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:04:21.763 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:04:21.763 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0
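verify_nr_hugepages first checks the transparent_hugepage setting and then reads AnonHugePages and HugePages_Surp through get_meminfo, which parses /proc/meminfo by default and falls back to /sys/devices/system/node/node<N>/meminfo when a node is given, stripping the "Node <n> " prefix before matching field names. Below is a self-contained sketch of that lookup; get_meminfo_sketch is a hypothetical stand-in, not the actual function in setup/common.sh.

```bash
#!/usr/bin/env bash
# Sketch only: the meminfo field lookup that the get_meminfo trace above performs.
shopt -s extglob                                  # needed for the "Node <n> " prefix strip

get_meminfo_sketch() {
    local get=$1 node=$2 mem_f=/proc/meminfo var val _ line mem
    # Per-node counters live in /sys and carry a "Node <n> " prefix on every line.
    if [[ -n $node && -e /sys/devices/system/node/node${node}/meminfo ]]; then
        mem_f=/sys/devices/system/node/node${node}/meminfo
    fi
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")              # normalize per-node lines to "Field: value"
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        if [[ $var == "$get" ]]; then
            echo "$val"                           # numeric value; the kB unit lands in "_"
            return 0
        fi
    done
    return 1                                      # field not present in this meminfo
}

get_meminfo_sketch AnonHugePages                  # prints 0 on the host in the trace above
get_meminfo_sketch HugePages_Surp 1               # per-node surplus, also 0 in the trace
```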
00:04:21.763 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:21.763 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:21.763 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:21.763 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:21.763 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:21.763 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:21.763 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:21.763 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:21.763 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:21.763 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:21.763 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:21.763 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:21.763 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110487040 kB' 'MemAvailable: 113704644 kB' 'Buffers: 2704 kB' 'Cached: 9623664 kB' 'SwapCached: 0 kB' 'Active: 6676640 kB' 'Inactive: 3510668 kB' 'Active(anon): 6276652 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563756 kB' 'Mapped: 171392 kB' 'Shmem: 5715712 kB' 'KReclaimable: 262584 kB' 'Slab: 927080 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664496 kB' 'KernelStack: 25056 kB' 'PageTables: 8900 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7807564 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229876 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB'
00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- #
[[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.764 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.765 17:16:32 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- 
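The trace above is a field-by-field scan of /proc/meminfo: each line is split on ': ', the key is compared against HugePages_Surp, and the value of the matching line (0) is echoed back to hugepages.sh as surp. A minimal standalone sketch of the same kind of lookup, assuming a standard /proc/meminfo layout; the helper name meminfo_field is illustrative and not part of setup/common.sh:

  #!/usr/bin/env bash
  # Minimal sketch of the lookup traced above: split each /proc/meminfo line
  # on ': ', compare the key, print the value of the first match.
  # meminfo_field is a hypothetical helper, not part of setup/common.sh.
  meminfo_field() {
      local want=$1 var val _
      while IFS=': ' read -r var val _; do
          if [[ $var == "$want" ]]; then
              echo "$val"
              return 0
          fi
      done < /proc/meminfo
      return 1
  }

  meminfo_field HugePages_Surp   # prints 0 on the node traced here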
00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:21.765 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110488436 kB' 'MemAvailable: 113706040 kB' 'Buffers: 2704 kB' 'Cached: 9623684 kB' 'SwapCached: 0 kB' 'Active: 6675992 kB' 'Inactive: 3510668 kB' 'Active(anon): 6276004 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563524 kB' 'Mapped: 171324 kB' 'Shmem: 5715732 kB' 'KReclaimable: 262584 kB' 'Slab: 927068 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664484 kB' 'KernelStack: 24896 kB' 'PageTables: 7960 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7807588 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229796 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB'
[... 00:04:21.765-767: the HugePages_Rsvd scan walks the same key list (MemTotal through HugePages_Free) with the continue / IFS=': ' / read -r var val _ steps ...]
00:04:21.767 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:21.767 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:04:21.767 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:04:21.767 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0
00:04:21.767 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:21.767 nr_hugepages=1024
00:04:21.767 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:21.767 resv_hugepages=0
00:04:21.767 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:21.767 surplus_hugepages=0
00:04:21.767 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:21.767 anon_hugepages=0
00:04:21.767 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:21.767 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
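At this point the harness has nr_hugepages=1024 with zero reserved and surplus pages, and the check above asserts that the value read from /proc/meminfo equals nr_hugepages + surp + resv. A hedged, standalone recomputation of that arithmetic directly from /proc/meminfo (illustrative only, not the hugepages.sh code path; requested=1024 is the value echoed as nr_hugepages above):

  #!/usr/bin/env bash
  # Illustrative recomputation of the accounting check seen in the trace:
  # HugePages_Total should equal the requested page count plus the surplus
  # and reserved pages (all 0 on this node).
  requested=1024   # value echoed as nr_hugepages above
  total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
  surp=$(awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo)
  resv=$(awk '/^HugePages_Rsvd:/ {print $2}' /proc/meminfo)
  if (( total == requested + surp + resv )); then
      echo "hugepage accounting OK: total=$total surp=$surp resv=$resv"
  fi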
00:04:21.767 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:21.767 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:21.767 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:21.767 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:21.767 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:21.767 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:21.767 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:21.767 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:21.767 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:21.767 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:21.767 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:21.767 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:21.767 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110488080 kB' 'MemAvailable: 113705684 kB' 'Buffers: 2704 kB' 'Cached: 9623704 kB' 'SwapCached: 0 kB' 'Active: 6676440 kB' 'Inactive: 3510668 kB' 'Active(anon): 6276452 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563976 kB' 'Mapped: 171324 kB' 'Shmem: 5715752 kB' 'KReclaimable: 262584 kB' 'Slab: 927068 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664484 kB' 'KernelStack: 25120 kB' 'PageTables: 8864 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7807608 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229940 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB'
[... 00:04:21.767-769: the HugePages_Total scan walks the same key list (MemTotal through Unaccepted) with the continue / IFS=': ' / read -r var val _ steps ...]
00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.769 17:16:32 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 60342408 kB' 'MemUsed: 5319592 kB' 'SwapCached: 0 kB' 'Active: 1599072 kB' 'Inactive: 186104 kB' 'Active(anon): 1279760 kB' 'Inactive(anon): 0 kB' 'Active(file): 319312 kB' 'Inactive(file): 186104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1495368 kB' 'Mapped: 103460 kB' 'AnonPages: 293000 kB' 'Shmem: 989952 kB' 'KernelStack: 13960 kB' 'PageTables: 4936 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 137992 kB' 'Slab: 453236 kB' 'SReclaimable: 137992 kB' 'SUnreclaim: 315244 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.769 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.770 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.770 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.770 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.770 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.770 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 17:16:32 
[trace condensed: the same read/continue loop walks the node0 meminfo fields from Active(anon) through HugePages_Free; none of them matches HugePages_Surp]
00:04:21.771 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:21.771 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:04:21.771 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:04:21.771 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:21.771 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:21.771 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:21.771 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:21.771 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
node0=1024 expecting 1024
00:04:21.771 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:21.771 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:04:21.771 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512
00:04:21.771 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output
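The "node0=1024 expecting 1024" line is the per-node half of the check: hugepages.sh builds an expected count per NUMA node (1024 plus any reserved or surplus pages, both 0 in this run) and compares it with what the node actually reports. A rough, self-contained approximation of that comparison, assuming the standard per-node meminfo layout; nodes_sys, nodes_expect and the awk lookup are illustrative names, not the exact hugepages.sh code:

    #!/usr/bin/env bash
    # Rough sketch of the per-node comparison reported as "node0=1024 expecting 1024".
    declare -A nodes_sys nodes_expect
    for node_dir in /sys/devices/system/node/node[0-9]*; do
      node=${node_dir##*node}
      # HugePages_Total as this node's meminfo reports it ("Node <N> HugePages_Total: ...").
      total=$(awk '/HugePages_Total/ {print $NF}' "$node_dir/meminfo")
      nodes_sys[$node]=$total
      # The traced run adds reserved and surplus pages here; both are 0, so expect the same.
      nodes_expect[$node]=$total
    done
    for node in "${!nodes_expect[@]}"; do
      echo "node$node=${nodes_sys[$node]} expecting ${nodes_expect[$node]}"
      [[ ${nodes_sys[$node]} -eq ${nodes_expect[$node]} ]] || exit 1
    done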
00:04:21.771 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:04:21.771 17:16:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:04:26.018 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver
00:04:26.018 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver
00:04:26.018 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver
00:04:26.018 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver
00:04:26.018 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver
00:04:26.018 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver
00:04:26.018 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver
00:04:26.018 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver
00:04:26.018 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver
00:04:26.018 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:26.018 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver
00:04:26.018 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver
00:04:26.018 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver
00:04:26.018 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver
00:04:26.018 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver
00:04:26.018 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver
00:04:26.018 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver
00:04:26.018 INFO: Requested 512 hugepages but 1024 already allocated on node0
00:04:26.018 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:04:26.018 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node
00:04:26.018 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:04:26.018 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:04:26.018 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp
00:04:26.018 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv
00:04:26.018 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon
00:04:26.018 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:26.018 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:26.018 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:26.018 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
[trace condensed: setup/common.sh@19-31 sets up the lookup against /proc/meminfo (no per-node file this time) and starts the field-by-field read loop]
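The INFO line above comes from running scripts/setup.sh with NRHUGE=512 and CLEAR_HUGE=no: the existing 1024-page pool on node0 already covers the request, and because CLEAR_HUGE is not set the pool is left as is, which is what this no_shrink_alloc test expects. Reproducing that step by hand might look like the sketch below; the variables are the ones visible in this log, while the sysfs path for the 2048 kB pool is the standard kernel location rather than something shown here:

    # Ask SPDK's setup script for 512 hugepages without clearing the existing pool.
    # With 1024 pages already allocated on node0, it reports
    # "Requested 512 hugepages but 1024 already allocated on node0" and keeps them.
    cd /var/jenkins/workspace/crypto-phy-autotest/spdk   # repository path taken from this log
    sudo CLEAR_HUGE=no NRHUGE=512 ./scripts/setup.sh

    # Inspect the resulting pool afterwards.
    grep -E 'HugePages_(Total|Free|Surp)' /proc/meminfo
    cat /sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages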
00:04:26.018 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110509376 kB' 'MemAvailable: 113726980 kB' 'Buffers: 2704 kB' 'Cached: 9623816 kB' 'SwapCached: 0 kB' 'Active: 6678100 kB' 'Inactive: 3510668 kB' 'Active(anon): 6278112 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 564980 kB' 'Mapped: 171512 kB' 'Shmem: 5715864 kB' 'KReclaimable: 262584 kB' 'Slab: 927356 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664772 kB' 'KernelStack: 24912 kB' 'PageTables: 8308 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7808180 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229828 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB'
[trace condensed: the read/continue loop skips every field of this snapshot from MemTotal through HardwareCorrupted; none matches AnonHugePages]
00:04:26.018 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:26.018 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:04:26.018 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:04:26.018 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0
00:04:26.018 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:26.018 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:26.018 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
[trace condensed: setup/common.sh@19-31 again reads /proc/meminfo and begins scanning for HugePages_Surp]
00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110510376 kB' 'MemAvailable: 113727980 kB' 'Buffers: 2704 kB' 'Cached: 9623820 kB' 'SwapCached: 0 kB' 'Active: 6677276 kB' 'Inactive: 3510668 kB' 'Active(anon): 6277288 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 564164 kB' 'Mapped: 171448 kB' 'Shmem: 5715868 kB' 'KReclaimable: 262584 kB' 'Slab: 927176 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664592 kB' 'KernelStack: 24816 kB' 'PageTables: 7984 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7805472 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229732 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB'
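Before reading HugePages_Surp, verify_nr_hugepages first confirmed that transparent hugepages are not pinned to [never] (the 'always [madvise] never' test at hugepages.sh@96) and recorded AnonHugePages, since THP-backed anonymous memory could distort the hugepage accounting; both came back 0 here. A small sketch of those checks done directly against the kernel interfaces, assuming the usual /sys/kernel/mm/transparent_hugepage/enabled location for the THP mode string (the path itself is not shown in this log):

    #!/usr/bin/env bash
    # Sketch of the checks traced above: read AnonHugePages only when THP is not
    # disabled outright, and read HugePages_Surp either way.
    thp_mode=$(cat /sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
    anon_kb=0
    if [[ $thp_mode != *"[never]"* ]]; then
      anon_kb=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)
    fi
    surp=$(awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo)
    echo "AnonHugePages=${anon_kb} kB, HugePages_Surp=${surp}"    # 0 kB and 0 in this run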
setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 
17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.019 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.020 17:16:36 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
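(Editor's note) The long run of "# continue" trace entries above comes from setup/common.sh's get_meminfo helper walking the meminfo file one key at a time until it reaches the requested field (HugePages_Surp here), echoing that value and returning. Below is a minimal standalone sketch of that lookup for readability; get_meminfo_sketch is an illustrative name, not the SPDK function itself, and the sed prefix-strip stands in for the script's own "${mem[@]#Node +([0-9]) }" expansion seen in the trace.

    # Sketch of the traced lookup: skip every non-matching key (each skip is one
    # "continue" xtrace line above) and print the value of the requested field.
    get_meminfo_sketch() {
        local get=$1 node=${2:-}
        local mem_f=/proc/meminfo
        # With a node argument the per-node file is used instead, as in the trace.
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        local var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue
            echo "$val"
            return 0
        done < <(sed 's/^Node [0-9]* //' "$mem_f")   # per-node files prefix lines with "Node N"
    }
    # Example: get_meminfo_sketch HugePages_Surp   -> prints 0 on this host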
00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110510376 kB' 'MemAvailable: 113727980 kB' 'Buffers: 2704 kB' 'Cached: 9623836 kB' 'SwapCached: 0 kB' 'Active: 6676776 kB' 'Inactive: 3510668 kB' 'Active(anon): 6276788 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 564160 kB' 'Mapped: 171348 kB' 'Shmem: 5715884 kB' 'KReclaimable: 262584 kB' 'Slab: 927208 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664624 kB' 'KernelStack: 24896 kB' 'PageTables: 8232 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7805496 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229716 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.020 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.021 17:16:36 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.021 
17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.021 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:26.022 nr_hugepages=1024 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:26.022 resv_hugepages=0 
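(Editor's note) At this point the test has surp=0 and resv=0 and reports nr_hugepages=1024 / resv_hugepages=0. The check that follows in hugepages.sh compares the kernel's HugePages_Total against the requested count plus surplus and reserved pages. A self-contained sketch of that arithmetic, using awk in place of the script's own meminfo helper (a substitution made here for brevity, not how the script itself reads the values):

    nr_hugepages=1024                                                # requested hugepage count
    surp=$(awk '$1=="HugePages_Surp:"{print $2}' /proc/meminfo)      # 0 in this run
    resv=$(awk '$1=="HugePages_Rsvd:"{print $2}' /proc/meminfo)      # 0 in this run
    total=$(awk '$1=="HugePages_Total:"{print $2}' /proc/meminfo)    # 1024 in this run
    echo "nr_hugepages=$nr_hugepages" "resv_hugepages=$resv" "surplus_hugepages=$surp"
    # The allocation is considered consistent when the kernel-reported total
    # equals the requested count plus any surplus and reserved pages.
    (( total == nr_hugepages + surp + resv )) || echo "hugepage accounting mismatch" >&2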
00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:26.022 surplus_hugepages=0 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:26.022 anon_hugepages=0 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110510656 kB' 'MemAvailable: 113728260 kB' 'Buffers: 2704 kB' 'Cached: 9623876 kB' 'SwapCached: 0 kB' 'Active: 6676956 kB' 'Inactive: 3510668 kB' 'Active(anon): 6276968 kB' 'Inactive(anon): 0 kB' 'Active(file): 399988 kB' 'Inactive(file): 3510668 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 564308 kB' 'Mapped: 171348 kB' 'Shmem: 5715924 kB' 'KReclaimable: 262584 kB' 'Slab: 927208 kB' 'SReclaimable: 262584 kB' 'SUnreclaim: 664624 kB' 'KernelStack: 24896 kB' 'PageTables: 8224 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7805888 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229716 kB' 'VmallocChunk: 0 kB' 'Percpu: 98816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2613540 kB' 'DirectMap2M: 23281664 kB' 'DirectMap1G: 110100480 kB' 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.022 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.023 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.024 17:16:36 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 
0 )) 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 60366724 kB' 'MemUsed: 5295276 kB' 'SwapCached: 0 kB' 'Active: 1600868 kB' 'Inactive: 186104 kB' 'Active(anon): 1281556 kB' 'Inactive(anon): 0 kB' 'Active(file): 319312 kB' 'Inactive(file): 186104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1495492 kB' 'Mapped: 103496 kB' 'AnonPages: 294652 kB' 'Shmem: 990076 kB' 'KernelStack: 13992 kB' 'PageTables: 5000 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 137992 kB' 'Slab: 452940 kB' 'SReclaimable: 137992 kB' 'SUnreclaim: 314948 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.024 17:16:36 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
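The field-by-field loop running through this block is the get_meminfo helper from setup/common.sh: it reads /sys/devices/system/node/node0/meminfo (falling back to /proc/meminfo when no node is given), strips the "Node N" prefix, splits each line on ": ", and echoes the value once the requested key matches. A condensed, stand-alone sketch of that lookup (the compact form is illustrative, not the verbatim helper):

#!/usr/bin/env bash
# Sketch of the per-node meminfo lookup traced above.
# get_meminfo KEY [NODE] -> prints KEY's value (kB or a page count).
get_meminfo() {
    local key=$1 node=${2:-}
    local mem_f=/proc/meminfo
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local var val _
    # Per-node files prefix every line with "Node N "; strip it, then split on ": ".
    while IFS=': ' read -r var val _; do
        [[ $var == "$key" ]] && { echo "$val"; return 0; }
    done < <(sed 's/^Node [0-9]* //' "$mem_f")
    return 1
}

# e.g. surplus huge pages reserved on NUMA node 0:
#   get_meminfo HugePages_Surp 0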
00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.024 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:26.025 node0=1024 expecting 1024 00:04:26.025 17:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:26.025 00:04:26.025 real 0m8.142s 00:04:26.025 user 0m3.184s 00:04:26.025 sys 0m5.094s 00:04:26.026 17:16:36 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:26.026 17:16:36 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:26.026 ************************************ 00:04:26.026 END TEST no_shrink_alloc 00:04:26.026 ************************************ 00:04:26.026 17:16:36 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:26.026 17:16:36 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:04:26.026 17:16:36 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:04:26.026 17:16:36 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:26.026 17:16:36 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:26.026 17:16:36 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:26.026 17:16:36 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:26.026 17:16:36 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:26.026 17:16:36 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:26.026 17:16:36 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:26.026 17:16:36 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:26.026 17:16:36 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:26.026 17:16:36 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:26.026 17:16:36 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:26.026 17:16:36 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:26.026 00:04:26.026 real 0m31.404s 00:04:26.026 user 0m11.662s 00:04:26.026 sys 0m18.517s 00:04:26.026 17:16:36 setup.sh.hugepages -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:26.026 17:16:36 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:26.026 ************************************ 00:04:26.026 END TEST hugepages 00:04:26.026 ************************************ 00:04:26.026 17:16:36 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:04:26.026 17:16:36 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 
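Before handing off to the driver suite, clear_hp resets the huge page pools the tests allocated: for every NUMA node it visits each hugepages-<size> directory and writes 0 back (xtrace shows only the `echo 0`; the redirect target, presumably nr_hugepages, is not captured), then exports CLEAR_HUGE=yes. A minimal sketch under that assumption; it has to run as root, as the CI job does:

#!/usr/bin/env bash
# Minimal sketch of the hugepage cleanup traced above.
# Assumption: the 0 is written to each pool's nr_hugepages file.
clear_hp() {
    local node hp
    for node in /sys/devices/system/node/node[0-9]*; do
        for hp in "$node"/hugepages/hugepages-*; do
            [[ -e $hp/nr_hugepages ]] || continue
            echo 0 > "$hp/nr_hugepages"   # release this pool (needs root)
        done
    done
    export CLEAR_HUGE=yes   # next setup.sh run starts from a clean slate
}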
00:04:26.026 17:16:36 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:26.026 17:16:36 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:26.026 17:16:36 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:26.026 ************************************ 00:04:26.026 START TEST driver 00:04:26.026 ************************************ 00:04:26.026 17:16:36 setup.sh.driver -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:04:26.026 * Looking for test storage... 00:04:26.026 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:26.026 17:16:37 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:04:26.026 17:16:37 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:26.026 17:16:37 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:31.326 17:16:41 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:31.326 17:16:41 setup.sh.driver -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:31.326 17:16:41 setup.sh.driver -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:31.326 17:16:41 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:31.326 ************************************ 00:04:31.326 START TEST guess_driver 00:04:31.326 ************************************ 00:04:31.326 17:16:41 setup.sh.driver.guess_driver -- common/autotest_common.sh@1123 -- # guess_driver 00:04:31.326 17:16:41 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:31.326 17:16:41 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:04:31.326 17:16:41 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:04:31.326 17:16:41 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:04:31.326 17:16:41 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:04:31.326 17:16:41 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:31.326 17:16:41 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:31.326 17:16:41 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:31.326 17:16:41 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:31.326 17:16:41 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 370 > 0 )) 00:04:31.326 17:16:41 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:31.326 17:16:41 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:04:31.326 17:16:41 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:04:31.326 17:16:41 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:31.326 17:16:41 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:31.326 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:31.326 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:31.326 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:31.326 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:31.326 insmod 
/lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:31.326 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:31.326 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:31.327 17:16:41 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:04:31.327 17:16:41 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:04:31.327 17:16:41 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:31.327 17:16:41 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:31.327 17:16:41 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:31.327 Looking for driver=vfio-pci 00:04:31.327 17:16:41 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:31.327 17:16:41 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output config 00:04:31.327 17:16:41 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:04:31.327 17:16:41 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:34.626 17:16:45 
setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:34.626 17:16:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.537 17:16:47 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.537 17:16:47 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.537 17:16:47 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.798 17:16:47 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:36.798 17:16:47 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:04:36.798 17:16:47 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:36.798 17:16:47 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 
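The guess_driver pass above reduces to a simple decision: vfio-pci is chosen when the host exposes populated IOMMU groups (370 here) or vfio's unsafe no-IOMMU mode is enabled, and `modprobe --show-depends vfio_pci` resolves to real .ko files. A stand-alone sketch of that decision; the uio_pci_generic fallback is an assumption, since only the vfio branch appears in this part of the log:

#!/usr/bin/env bash
# Sketch of the driver pick traced above (illustrative, not the verbatim function).
shopt -s nullglob

is_driver() {
    # True when modprobe can resolve the module chain to actual .ko files.
    modprobe --show-depends "$1" 2>/dev/null | grep -q '\.ko'
}

pick_driver() {
    local unsafe_vfio=N
    if [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]]; then
        unsafe_vfio=$(< /sys/module/vfio/parameters/enable_unsafe_noiommu_mode)
    fi
    local iommu_groups=(/sys/kernel/iommu_groups/*)
    # vfio-pci needs an IOMMU (populated groups) or the unsafe no-IOMMU override.
    if ((${#iommu_groups[@]} > 0)) || [[ $unsafe_vfio == [Yy] ]]; then
        if is_driver vfio_pci; then
            echo vfio-pci
            return 0
        fi
    fi
    # Assumed fallback when vfio is unavailable on the host.
    is_driver uio_pci_generic && echo uio_pci_generic || echo 'No valid driver found'
}

# pick_driver   # -> vfio-pci on this CI host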
00:04:42.093 00:04:42.093 real 0m11.298s 00:04:42.093 user 0m3.041s 00:04:42.093 sys 0m5.607s 00:04:42.093 17:16:53 setup.sh.driver.guess_driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:42.093 17:16:53 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:04:42.093 ************************************ 00:04:42.093 END TEST guess_driver 00:04:42.093 ************************************ 00:04:42.093 17:16:53 setup.sh.driver -- common/autotest_common.sh@1142 -- # return 0 00:04:42.093 00:04:42.093 real 0m16.257s 00:04:42.093 user 0m4.361s 00:04:42.093 sys 0m8.408s 00:04:42.093 17:16:53 setup.sh.driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:42.093 17:16:53 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:42.093 ************************************ 00:04:42.093 END TEST driver 00:04:42.093 ************************************ 00:04:42.093 17:16:53 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:04:42.093 17:16:53 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:04:42.093 17:16:53 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:42.093 17:16:53 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:42.093 17:16:53 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:42.093 ************************************ 00:04:42.093 START TEST devices 00:04:42.093 ************************************ 00:04:42.093 17:16:53 setup.sh.devices -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:04:42.093 * Looking for test storage... 00:04:42.093 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:42.093 17:16:53 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:42.093 17:16:53 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:04:42.093 17:16:53 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:42.093 17:16:53 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:47.380 17:16:57 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:04:47.380 17:16:57 setup.sh.devices -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:04:47.380 17:16:57 setup.sh.devices -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:04:47.380 17:16:57 setup.sh.devices -- common/autotest_common.sh@1670 -- # local nvme bdf 00:04:47.380 17:16:57 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:04:47.380 17:16:57 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:04:47.380 17:16:57 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:04:47.380 17:16:57 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:47.380 17:16:57 setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:04:47.380 17:16:57 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:04:47.380 17:16:57 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:04:47.380 17:16:57 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:47.380 17:16:57 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:47.380 17:16:57 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:47.380 17:16:57 
setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:47.380 17:16:57 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:47.380 17:16:57 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:47.381 17:16:57 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:65:00.0 00:04:47.381 17:16:57 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\6\5\:\0\0\.\0* ]] 00:04:47.381 17:16:57 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:47.381 17:16:57 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:04:47.381 17:16:57 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:47.381 No valid GPT data, bailing 00:04:47.381 17:16:57 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:47.381 17:16:57 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:04:47.381 17:16:57 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:04:47.381 17:16:57 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:47.381 17:16:57 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:47.381 17:16:57 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:47.381 17:16:57 setup.sh.devices -- setup/common.sh@80 -- # echo 2000398934016 00:04:47.381 17:16:57 setup.sh.devices -- setup/devices.sh@204 -- # (( 2000398934016 >= min_disk_size )) 00:04:47.381 17:16:57 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:47.381 17:16:57 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:65:00.0 00:04:47.381 17:16:57 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:47.381 17:16:57 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:47.381 17:16:57 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:47.381 17:16:57 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:47.381 17:16:57 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:47.381 17:16:57 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:47.381 ************************************ 00:04:47.381 START TEST nvme_mount 00:04:47.381 ************************************ 00:04:47.381 17:16:57 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1123 -- # nvme_mount 00:04:47.381 17:16:57 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:47.381 17:16:57 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:47.381 17:16:57 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:47.381 17:16:57 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:47.381 17:16:57 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:47.381 17:16:57 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:47.381 17:16:57 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:04:47.381 17:16:57 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:04:47.381 17:16:57 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 
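Earlier in this block the devices suite picked its test disk: a namespace qualifies only if it is not zoned, is not already in use (spdk-gpt.py and blkid find no partition table), and is at least min_disk_size (3 GiB); nvme0n1 at 2000398934016 bytes behind 0000:65:00.0 passes. A condensed sketch of that filter; the blkid-only in-use check, the sysfs size arithmetic, and the PCI address resolution are illustrative simplifications:

#!/usr/bin/env bash
# Sketch of the test-disk filter traced above (simplified, illustrative).
shopt -s extglob nullglob

min_disk_size=$((3 * 1024 * 1024 * 1024))   # 3221225472 bytes
declare -a blocks
declare -A blocks_to_pci

for block in /sys/block/nvme!(*c*); do
    dev=${block##*/}
    # Zoned namespaces are skipped; the mount tests need a regular block device.
    [[ $(< "$block/queue/zoned") != none ]] && continue
    # Skip devices that already carry a partition table.
    blkid -s PTTYPE -o value "/dev/$dev" &> /dev/null && continue
    # /sys/block/<dev>/size counts 512-byte sectors.
    size=$(( $(< "$block/size") * 512 ))
    ((size >= min_disk_size)) || continue
    blocks+=("$dev")
    # For a PCIe NVMe controller, device/device resolves to the PCI function.
    blocks_to_pci[$dev]=$(basename "$(readlink -f "$block/device/device")")
done

for dev in "${blocks[@]}"; do
    printf 'usable test disk: %s (pci %s)\n' "$dev" "${blocks_to_pci[$dev]}"
done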
00:04:47.381 17:16:57 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:04:47.381 17:16:57 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:04:47.381 17:16:57 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:04:47.381 17:16:57 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:47.381 17:16:57 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:47.381 17:16:57 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:47.381 17:16:57 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:47.381 17:16:57 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:47.381 17:16:57 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:47.381 17:16:57 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:04:47.641 Creating new GPT entries in memory. 00:04:47.641 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:47.641 other utilities. 00:04:47.641 17:16:58 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:04:47.641 17:16:58 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:47.641 17:16:58 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:47.641 17:16:58 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:47.641 17:16:58 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:48.581 Creating new GPT entries in memory. 00:04:48.581 The operation has completed successfully. 
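partition_drive then wipes the disk and carves a single 1 GiB partition: 1073741824 bytes / 512 = 2097152 sectors, starting at sector 2048, hence --new=1:2048:2099199. The sgdisk call runs under flock while sync_dev_uevents.sh waits for the kernel to announce the new partition. A minimal sketch; udevadm settle stands in for the uevent wait and is an assumption:

#!/usr/bin/env bash
# Sketch of the single-partition layout traced above.
set -euo pipefail

disk=/dev/nvme0n1            # the disk selected earlier in the log
size=$((1073741824 / 512))   # partition size in 512-byte sectors (2097152)
part_start=2048
part_end=$((part_start + size - 1))   # 2099199

sgdisk "$disk" --zap-all                                    # drop any old GPT/MBR
flock "$disk" sgdisk "$disk" --new=1:$part_start:$part_end  # create nvme0n1p1
udevadm settle    # assumption: stand-in for sync_dev_uevents.sh's uevent wait
ls -l "${disk}p1"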
00:04:48.581 17:16:59 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:48.581 17:16:59 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:48.581 17:16:59 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 2671466 00:04:48.841 17:16:59 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:48.841 17:16:59 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:48.841 17:16:59 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:48.841 17:16:59 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:48.841 17:16:59 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:48.841 17:16:59 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:48.841 17:16:59 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:65:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:48.841 17:16:59 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:04:48.841 17:16:59 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:48.841 17:16:59 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:48.841 17:16:59 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:48.841 17:16:59 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:48.841 17:16:59 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:48.841 17:16:59 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:04:48.841 17:16:59 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:48.841 17:16:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:48.841 17:16:59 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:04:48.842 17:16:59 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:48.842 17:16:59 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:48.842 17:16:59 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:53.139 
17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:53.139 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.140 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:53.140 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ 
status 00:04:53.140 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:53.140 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:53.140 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:53.140 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:53.140 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:53.140 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:04:53.140 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:53.140 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:53.140 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:53.140 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:53.140 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:53.140 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:53.140 17:17:03 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:53.140 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:53.140 /dev/nvme0n1: 8 bytes were erased at offset 0x1d1c1115e00 (gpt): 45 46 49 20 50 41 52 54 00:04:53.140 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:53.140 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:53.140 17:17:04 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:53.140 17:17:04 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:53.140 17:17:04 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:53.140 17:17:04 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:04:53.140 17:17:04 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:53.140 17:17:04 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:53.140 17:17:04 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:65:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:53.140 17:17:04 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:04:53.140 17:17:04 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:53.140 17:17:04 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 
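The nvme_mount body is a mount / verify / cleanup cycle: make the mountpoint, mkfs.ext4 -qF the partition, mount it, drop a test_nvme file for the verify step, then cleanup_nvme unmounts and wipefs-wipes both the partition and the whole disk before the same cycle repeats against the bare disk with a 1024M filesystem. Summarised as a sketch:

#!/usr/bin/env bash
# Sketch of the mount / verify / cleanup cycle traced above.
set -euo pipefail

part=/dev/nvme0n1p1
mnt=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount
test_file=$mnt/test_nvme

mkdir -p "$mnt"
mkfs.ext4 -qF "$part"      # fresh ext4 on the 1 GiB partition
mount "$part" "$mnt"
: > "$test_file"           # the file the verify step looks for
[[ -e $test_file ]]

# cleanup_nvme, as traced: unmount, then wipe partition and disk signatures.
rm "$test_file"
mountpoint -q "$mnt" && umount "$mnt"
wipefs --all "$part"
wipefs --all /dev/nvme0n1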
00:04:53.140 17:17:04 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:53.140 17:17:04 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:53.140 17:17:04 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:53.140 17:17:04 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:04:53.140 17:17:04 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:53.140 17:17:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.140 17:17:04 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:04:53.140 17:17:04 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:53.140 17:17:04 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:53.140 17:17:04 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:56.449 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:56.449 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.449 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:56.449 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.449 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:56.449 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.449 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:56.449 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.449 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:56.449 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.449 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:56.449 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.449 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:56.449 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.449 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:56.449 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.711 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:56.711 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:04:56.711 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:56.711 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.711 17:17:07 setup.sh.devices.nvme_mount -- 
setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:56.711 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.711 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:56.711 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.711 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:56.711 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.711 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:56.711 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.711 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:56.711 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.711 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:56.711 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.711 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:56.711 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.711 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:56.711 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.711 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:56.712 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:56.712 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:56.712 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:56.712 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:56.712 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:56.712 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:65:00.0 data@nvme0n1 '' '' 00:04:56.712 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:04:56.712 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:04:56.712 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:04:56.712 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:04:56.712 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:56.712 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:56.712 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:56.712 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 
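The verify loop that runs next (and already ran once above for mount@nvme0n1:nvme0n1p1) reads the PCI status report from setup.sh status and requires that the allowed device, 0000:65:00.0, is reported with the expected active mount rather than being bound to a userspace driver. A condensed sketch; the exact column layout of the status output is an assumption:

#!/usr/bin/env bash
# Sketch of the verify step traced above: confirm the allowed PCI device is
# held by the expected mount. Assumed status line: "<bdf> <..> <..> <text>".
setup_sh=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh

verify() {
    local dev=$1 mounts=$2 found=0
    local pci _ status
    while read -r pci _ _ status; do
        [[ $pci == "$dev" ]] || continue
        # e.g. "Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev"
        [[ $status == *"Active devices: "*"$mounts"* ]] && found=1
    done < <(PCI_ALLOWED="$dev" "$setup_sh" status)
    ((found == 1))
}

# verify 0000:65:00.0 nvme0n1:nvme0n1p1 && echo "device is held by the mount"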
00:04:56.712 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:04:56.712 17:17:07 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:56.712 17:17:07 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:56.712 17:17:07 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == 
\0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:00.917 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.918 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:00.918 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.918 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:00.918 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:00.918 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:05:00.918 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:05:00.918 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:00.918 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:00.918 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:00.918 17:17:11 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:00.918 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:00.918 00:05:00.918 real 0m14.068s 00:05:00.918 user 0m4.277s 00:05:00.918 sys 0m7.653s 00:05:00.918 17:17:11 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:00.918 17:17:11 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:05:00.918 ************************************ 00:05:00.918 END TEST nvme_mount 00:05:00.918 ************************************ 00:05:00.918 17:17:11 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:05:00.918 17:17:11 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:05:00.918 17:17:11 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:00.918 17:17:11 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:00.918 17:17:11 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:00.918 ************************************ 00:05:00.918 START TEST dm_mount 00:05:00.918 ************************************ 00:05:00.918 17:17:11 setup.sh.devices.dm_mount -- common/autotest_common.sh@1123 -- # dm_mount 00:05:00.918 17:17:11 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:05:00.918 17:17:11 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:05:00.918 17:17:11 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:05:00.918 17:17:11 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:05:00.918 17:17:11 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:00.918 17:17:11 
setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:05:00.918 17:17:11 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:00.918 17:17:11 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:00.918 17:17:11 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:05:00.918 17:17:11 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:05:00.918 17:17:11 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:00.918 17:17:11 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:00.918 17:17:11 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:00.918 17:17:11 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:00.918 17:17:11 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:00.918 17:17:11 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:00.918 17:17:11 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:00.918 17:17:11 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:00.918 17:17:11 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:00.918 17:17:11 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:00.918 17:17:11 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:05:01.861 Creating new GPT entries in memory. 00:05:01.861 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:01.861 other utilities. 00:05:01.861 17:17:12 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:01.861 17:17:12 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:01.861 17:17:12 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:01.861 17:17:12 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:01.861 17:17:12 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:02.802 Creating new GPT entries in memory. 00:05:02.802 The operation has completed successfully. 00:05:02.802 17:17:14 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:02.802 17:17:14 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:02.802 17:17:14 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:02.802 17:17:14 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:02.802 17:17:14 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:03.744 The operation has completed successfully. 
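The dm_mount prologue above zaps the GPT on nvme0n1 and carves two 1 GiB partitions with sgdisk under flock, waiting for the partition uevents before continuing. Below is a hedged sketch of the same sector arithmetic (1073741824 / 512 = 2097152 sectors per partition, matching --new=1:2048:2099199 and --new=2:2099200:4196351 in the trace). It is destructive and meant only for a disposable test disk; it is not the actual setup/common.sh.

#!/usr/bin/env bash
# Create two 1 GiB GPT partitions on a scratch NVMe disk, mirroring the sgdisk
# calls recorded in the log above.
set -e
disk=/dev/nvme0n1
size_sectors=$((1073741824 / 512))   # 1 GiB in 512-byte sectors = 2097152
part_start=0
part_end=0
sgdisk "$disk" --zap-all             # wipe any existing GPT/MBR metadata
for part in 1 2; do
    (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
    (( part_end   = part_start + size_sectors - 1 ))
    # flock serializes concurrent partitioners touching the same block device
    flock "$disk" sgdisk "$disk" --new=${part}:${part_start}:${part_end}
done
partprobe "$disk"                    # ask the kernel to re-read the table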
00:05:03.744 17:17:15 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:03.744 17:17:15 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:03.744 17:17:15 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 2676559 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:65:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 
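Above, dmsetup create builds /dev/mapper/nvme_dm_test, the script resolves it to dm-0, confirms both partitions list dm-0 as a holder, then formats and mounts it. The table handed to dmsetup is not visible in the trace, so the sketch below assumes a plain linear concatenation of the two test partitions and a throwaway mount point; it illustrates the pattern rather than reproducing the test script.

#!/usr/bin/env bash
# Build a linear device-mapper target over the two test partitions, verify the
# holders links the log checks for, then format and mount it.
set -e
name=nvme_dm_test
mnt=/tmp/dm_mount
p1=/dev/nvme0n1p1
p2=/dev/nvme0n1p2
s1=$(blockdev --getsz "$p1")      # partition sizes in 512-byte sectors
s2=$(blockdev --getsz "$p2")
# Each table line is "<start> <len> linear <dev> <offset>"
dmsetup create "$name" <<EOF
0 $s1 linear $p1 0
$s1 $s2 linear $p2 0
EOF
dm=$(basename "$(readlink -f "/dev/mapper/$name")")   # e.g. dm-0
# both partitions should now list the dm node as a holder
ls "/sys/class/block/$(basename "$p1")/holders/" | grep -q "$dm"
ls "/sys/class/block/$(basename "$p2")/holders/" | grep -q "$dm"
mkfs.ext4 -qF "/dev/mapper/$name"
mkdir -p "$mnt"
mount "/dev/mapper/$name" "$mnt"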
00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:04.006 17:17:15 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == 
\0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:08.211 17:17:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.211 17:17:19 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:08.211 17:17:19 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:08.211 17:17:19 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:08.211 17:17:19 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:08.211 17:17:19 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:08.211 17:17:19 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:08.211 17:17:19 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:65:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:08.211 17:17:19 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:05:08.211 17:17:19 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:08.211 17:17:19 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:08.211 17:17:19 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:05:08.211 17:17:19 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:05:08.211 17:17:19 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:08.211 17:17:19 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:08.211 17:17:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.211 17:17:19 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:05:08.211 17:17:19 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:08.211 17:17:19 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == 
output ]] 00:05:08.211 17:17:19 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:12.426 
17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:12.426 17:17:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.426 17:17:23 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:12.426 17:17:23 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:12.426 17:17:23 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:05:12.426 17:17:23 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:05:12.426 17:17:23 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:12.426 17:17:23 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:12.426 17:17:23 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:12.426 17:17:23 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:12.426 17:17:23 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:12.426 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:12.426 17:17:23 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:12.426 17:17:23 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:12.426 00:05:12.426 real 0m11.134s 00:05:12.426 user 0m2.921s 00:05:12.426 sys 0m5.287s 00:05:12.426 17:17:23 setup.sh.devices.dm_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:12.426 17:17:23 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:05:12.426 ************************************ 00:05:12.426 END TEST dm_mount 00:05:12.426 ************************************ 00:05:12.426 17:17:23 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:05:12.426 17:17:23 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:05:12.426 17:17:23 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:05:12.426 17:17:23 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:12.426 17:17:23 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:12.426 17:17:23 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:12.426 17:17:23 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:12.426 17:17:23 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:12.426 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:12.426 /dev/nvme0n1: 8 bytes were erased at offset 0x1d1c1115e00 (gpt): 45 46 49 20 50 41 52 54 00:05:12.426 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:12.426 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:12.426 17:17:23 
setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:05:12.426 17:17:23 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:12.426 17:17:23 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:12.426 17:17:23 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:12.426 17:17:23 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:12.426 17:17:23 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:12.426 17:17:23 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:12.426 00:05:12.426 real 0m30.185s 00:05:12.426 user 0m8.971s 00:05:12.426 sys 0m16.043s 00:05:12.426 17:17:23 setup.sh.devices -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:12.426 17:17:23 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:12.426 ************************************ 00:05:12.426 END TEST devices 00:05:12.426 ************************************ 00:05:12.426 17:17:23 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:12.426 00:05:12.426 real 1m46.175s 00:05:12.426 user 0m34.284s 00:05:12.426 sys 0m59.886s 00:05:12.426 17:17:23 setup.sh -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:12.426 17:17:23 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:12.426 ************************************ 00:05:12.426 END TEST setup.sh 00:05:12.426 ************************************ 00:05:12.426 17:17:23 -- common/autotest_common.sh@1142 -- # return 0 00:05:12.426 17:17:23 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:05:16.718 Hugepages 00:05:16.718 node hugesize free / total 00:05:16.718 node0 1048576kB 0 / 0 00:05:16.718 node0 2048kB 1024 / 1024 00:05:16.718 node1 1048576kB 0 / 0 00:05:16.718 node1 2048kB 1024 / 1024 00:05:16.718 00:05:16.718 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:16.718 I/OAT 0000:00:01.0 8086 0b00 0 ioatdma - - 00:05:16.718 I/OAT 0000:00:01.1 8086 0b00 0 ioatdma - - 00:05:16.718 I/OAT 0000:00:01.2 8086 0b00 0 ioatdma - - 00:05:16.718 I/OAT 0000:00:01.3 8086 0b00 0 ioatdma - - 00:05:16.718 I/OAT 0000:00:01.4 8086 0b00 0 ioatdma - - 00:05:16.718 I/OAT 0000:00:01.5 8086 0b00 0 ioatdma - - 00:05:16.718 I/OAT 0000:00:01.6 8086 0b00 0 ioatdma - - 00:05:16.718 I/OAT 0000:00:01.7 8086 0b00 0 ioatdma - - 00:05:16.718 NVMe 0000:65:00.0 8086 0a54 0 nvme nvme0 nvme0n1 00:05:16.718 I/OAT 0000:80:01.0 8086 0b00 1 ioatdma - - 00:05:16.718 I/OAT 0000:80:01.1 8086 0b00 1 ioatdma - - 00:05:16.718 I/OAT 0000:80:01.2 8086 0b00 1 ioatdma - - 00:05:16.718 I/OAT 0000:80:01.3 8086 0b00 1 ioatdma - - 00:05:16.718 I/OAT 0000:80:01.4 8086 0b00 1 ioatdma - - 00:05:16.718 I/OAT 0000:80:01.5 8086 0b00 1 ioatdma - - 00:05:16.718 I/OAT 0000:80:01.6 8086 0b00 1 ioatdma - - 00:05:16.718 I/OAT 0000:80:01.7 8086 0b00 1 ioatdma - - 00:05:16.718 17:17:27 -- spdk/autotest.sh@130 -- # uname -s 00:05:16.718 17:17:27 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 00:05:16.718 17:17:27 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:05:16.718 17:17:27 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:20.921 0000:80:01.6 (8086 0b00): ioatdma -> vfio-pci 00:05:20.921 0000:80:01.7 (8086 0b00): ioatdma -> vfio-pci 00:05:20.921 0000:80:01.4 (8086 0b00): ioatdma -> vfio-pci 00:05:20.921 0000:80:01.5 (8086 0b00): ioatdma -> vfio-pci 
00:05:20.921 0000:80:01.2 (8086 0b00): ioatdma -> vfio-pci 00:05:20.921 0000:80:01.3 (8086 0b00): ioatdma -> vfio-pci 00:05:20.921 0000:80:01.0 (8086 0b00): ioatdma -> vfio-pci 00:05:20.921 0000:80:01.1 (8086 0b00): ioatdma -> vfio-pci 00:05:20.921 0000:00:01.6 (8086 0b00): ioatdma -> vfio-pci 00:05:20.921 0000:00:01.7 (8086 0b00): ioatdma -> vfio-pci 00:05:20.921 0000:00:01.4 (8086 0b00): ioatdma -> vfio-pci 00:05:20.921 0000:00:01.5 (8086 0b00): ioatdma -> vfio-pci 00:05:20.921 0000:00:01.2 (8086 0b00): ioatdma -> vfio-pci 00:05:20.921 0000:00:01.3 (8086 0b00): ioatdma -> vfio-pci 00:05:20.921 0000:00:01.0 (8086 0b00): ioatdma -> vfio-pci 00:05:20.921 0000:00:01.1 (8086 0b00): ioatdma -> vfio-pci 00:05:22.304 0000:65:00.0 (8086 0a54): nvme -> vfio-pci 00:05:22.564 17:17:33 -- common/autotest_common.sh@1532 -- # sleep 1 00:05:23.505 17:17:34 -- common/autotest_common.sh@1533 -- # bdfs=() 00:05:23.505 17:17:34 -- common/autotest_common.sh@1533 -- # local bdfs 00:05:23.505 17:17:34 -- common/autotest_common.sh@1534 -- # bdfs=($(get_nvme_bdfs)) 00:05:23.505 17:17:34 -- common/autotest_common.sh@1534 -- # get_nvme_bdfs 00:05:23.505 17:17:34 -- common/autotest_common.sh@1513 -- # bdfs=() 00:05:23.505 17:17:34 -- common/autotest_common.sh@1513 -- # local bdfs 00:05:23.505 17:17:34 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:23.505 17:17:34 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:23.505 17:17:34 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:05:23.505 17:17:34 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:05:23.505 17:17:34 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:65:00.0 00:05:23.505 17:17:34 -- common/autotest_common.sh@1536 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:27.707 Waiting for block devices as requested 00:05:27.707 0000:80:01.6 (8086 0b00): vfio-pci -> ioatdma 00:05:27.707 0000:80:01.7 (8086 0b00): vfio-pci -> ioatdma 00:05:27.707 0000:80:01.4 (8086 0b00): vfio-pci -> ioatdma 00:05:27.707 0000:80:01.5 (8086 0b00): vfio-pci -> ioatdma 00:05:27.707 0000:80:01.2 (8086 0b00): vfio-pci -> ioatdma 00:05:27.967 0000:80:01.3 (8086 0b00): vfio-pci -> ioatdma 00:05:27.967 0000:80:01.0 (8086 0b00): vfio-pci -> ioatdma 00:05:27.967 0000:80:01.1 (8086 0b00): vfio-pci -> ioatdma 00:05:28.227 0000:65:00.0 (8086 0a54): vfio-pci -> nvme 00:05:28.227 0000:00:01.6 (8086 0b00): vfio-pci -> ioatdma 00:05:28.227 0000:00:01.7 (8086 0b00): vfio-pci -> ioatdma 00:05:28.488 0000:00:01.4 (8086 0b00): vfio-pci -> ioatdma 00:05:28.488 0000:00:01.5 (8086 0b00): vfio-pci -> ioatdma 00:05:28.488 0000:00:01.2 (8086 0b00): vfio-pci -> ioatdma 00:05:28.749 0000:00:01.3 (8086 0b00): vfio-pci -> ioatdma 00:05:28.749 0000:00:01.0 (8086 0b00): vfio-pci -> ioatdma 00:05:28.749 0000:00:01.1 (8086 0b00): vfio-pci -> ioatdma 00:05:28.749 17:17:40 -- common/autotest_common.sh@1538 -- # for bdf in "${bdfs[@]}" 00:05:28.749 17:17:40 -- common/autotest_common.sh@1539 -- # get_nvme_ctrlr_from_bdf 0000:65:00.0 00:05:28.749 17:17:40 -- common/autotest_common.sh@1502 -- # readlink -f /sys/class/nvme/nvme0 00:05:28.749 17:17:40 -- common/autotest_common.sh@1502 -- # grep 0000:65:00.0/nvme/nvme 00:05:28.749 17:17:40 -- common/autotest_common.sh@1502 -- # bdf_sysfs_path=/sys/devices/pci0000:64/0000:64:02.0/0000:65:00.0/nvme/nvme0 00:05:28.749 17:17:40 -- common/autotest_common.sh@1503 -- # [[ -z 
/sys/devices/pci0000:64/0000:64:02.0/0000:65:00.0/nvme/nvme0 ]] 00:05:29.010 17:17:40 -- common/autotest_common.sh@1507 -- # basename /sys/devices/pci0000:64/0000:64:02.0/0000:65:00.0/nvme/nvme0 00:05:29.010 17:17:40 -- common/autotest_common.sh@1507 -- # printf '%s\n' nvme0 00:05:29.010 17:17:40 -- common/autotest_common.sh@1539 -- # nvme_ctrlr=/dev/nvme0 00:05:29.010 17:17:40 -- common/autotest_common.sh@1540 -- # [[ -z /dev/nvme0 ]] 00:05:29.010 17:17:40 -- common/autotest_common.sh@1545 -- # nvme id-ctrl /dev/nvme0 00:05:29.010 17:17:40 -- common/autotest_common.sh@1545 -- # grep oacs 00:05:29.010 17:17:40 -- common/autotest_common.sh@1545 -- # cut -d: -f2 00:05:29.010 17:17:40 -- common/autotest_common.sh@1545 -- # oacs=' 0xe' 00:05:29.010 17:17:40 -- common/autotest_common.sh@1546 -- # oacs_ns_manage=8 00:05:29.010 17:17:40 -- common/autotest_common.sh@1548 -- # [[ 8 -ne 0 ]] 00:05:29.010 17:17:40 -- common/autotest_common.sh@1554 -- # nvme id-ctrl /dev/nvme0 00:05:29.010 17:17:40 -- common/autotest_common.sh@1554 -- # grep unvmcap 00:05:29.010 17:17:40 -- common/autotest_common.sh@1554 -- # cut -d: -f2 00:05:29.010 17:17:40 -- common/autotest_common.sh@1554 -- # unvmcap=' 0' 00:05:29.010 17:17:40 -- common/autotest_common.sh@1555 -- # [[ 0 -eq 0 ]] 00:05:29.010 17:17:40 -- common/autotest_common.sh@1557 -- # continue 00:05:29.010 17:17:40 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:05:29.010 17:17:40 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:29.010 17:17:40 -- common/autotest_common.sh@10 -- # set +x 00:05:29.010 17:17:40 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:05:29.010 17:17:40 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:29.010 17:17:40 -- common/autotest_common.sh@10 -- # set +x 00:05:29.010 17:17:40 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:33.225 0000:80:01.6 (8086 0b00): ioatdma -> vfio-pci 00:05:33.225 0000:80:01.7 (8086 0b00): ioatdma -> vfio-pci 00:05:33.225 0000:80:01.4 (8086 0b00): ioatdma -> vfio-pci 00:05:33.225 0000:80:01.5 (8086 0b00): ioatdma -> vfio-pci 00:05:33.225 0000:80:01.2 (8086 0b00): ioatdma -> vfio-pci 00:05:33.225 0000:80:01.3 (8086 0b00): ioatdma -> vfio-pci 00:05:33.225 0000:80:01.0 (8086 0b00): ioatdma -> vfio-pci 00:05:33.225 0000:80:01.1 (8086 0b00): ioatdma -> vfio-pci 00:05:33.225 0000:00:01.6 (8086 0b00): ioatdma -> vfio-pci 00:05:33.225 0000:00:01.7 (8086 0b00): ioatdma -> vfio-pci 00:05:33.225 0000:00:01.4 (8086 0b00): ioatdma -> vfio-pci 00:05:33.225 0000:00:01.5 (8086 0b00): ioatdma -> vfio-pci 00:05:33.225 0000:00:01.2 (8086 0b00): ioatdma -> vfio-pci 00:05:33.225 0000:00:01.3 (8086 0b00): ioatdma -> vfio-pci 00:05:33.225 0000:00:01.0 (8086 0b00): ioatdma -> vfio-pci 00:05:33.225 0000:00:01.1 (8086 0b00): ioatdma -> vfio-pci 00:05:34.608 0000:65:00.0 (8086 0a54): nvme -> vfio-pci 00:05:34.869 17:17:45 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:05:34.869 17:17:45 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:34.869 17:17:45 -- common/autotest_common.sh@10 -- # set +x 00:05:34.869 17:17:45 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:05:34.869 17:17:45 -- common/autotest_common.sh@1591 -- # mapfile -t bdfs 00:05:34.869 17:17:45 -- common/autotest_common.sh@1591 -- # get_nvme_bdfs_by_id 0x0a54 00:05:34.869 17:17:45 -- common/autotest_common.sh@1577 -- # bdfs=() 00:05:34.869 17:17:45 -- common/autotest_common.sh@1577 -- # local bdfs 00:05:34.869 17:17:45 -- common/autotest_common.sh@1579 -- # 
get_nvme_bdfs 00:05:34.869 17:17:45 -- common/autotest_common.sh@1513 -- # bdfs=() 00:05:34.869 17:17:45 -- common/autotest_common.sh@1513 -- # local bdfs 00:05:34.869 17:17:45 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:34.869 17:17:45 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:34.869 17:17:45 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:05:34.869 17:17:46 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:05:34.869 17:17:46 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:65:00.0 00:05:34.869 17:17:46 -- common/autotest_common.sh@1579 -- # for bdf in $(get_nvme_bdfs) 00:05:34.869 17:17:46 -- common/autotest_common.sh@1580 -- # cat /sys/bus/pci/devices/0000:65:00.0/device 00:05:34.869 17:17:46 -- common/autotest_common.sh@1580 -- # device=0x0a54 00:05:34.869 17:17:46 -- common/autotest_common.sh@1581 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:05:34.869 17:17:46 -- common/autotest_common.sh@1582 -- # bdfs+=($bdf) 00:05:34.869 17:17:46 -- common/autotest_common.sh@1586 -- # printf '%s\n' 0000:65:00.0 00:05:34.869 17:17:46 -- common/autotest_common.sh@1592 -- # [[ -z 0000:65:00.0 ]] 00:05:34.869 17:17:46 -- common/autotest_common.sh@1597 -- # spdk_tgt_pid=2687776 00:05:34.869 17:17:46 -- common/autotest_common.sh@1598 -- # waitforlisten 2687776 00:05:34.869 17:17:46 -- common/autotest_common.sh@1596 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:34.869 17:17:46 -- common/autotest_common.sh@829 -- # '[' -z 2687776 ']' 00:05:34.869 17:17:46 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:34.869 17:17:46 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:34.869 17:17:46 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:34.869 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:34.869 17:17:46 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:34.869 17:17:46 -- common/autotest_common.sh@10 -- # set +x 00:05:34.869 [2024-07-15 17:17:46.138639] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
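The get_nvme_bdfs / get_nvme_bdfs_by_id helpers traced above boil down to: enumerate NVMe controllers, resolve each to its PCI address, and keep the ones whose sysfs device ID is 0x0a54. A self-contained sketch of that filter is below; it reads /sys/class/nvme directly instead of going through gen_nvme.sh | jq, which is an assumption made so the example stands alone.

#!/usr/bin/env bash
# List NVMe controller BDFs whose PCI device ID matches 0x0a54, the same check
# the opal_revert_cleanup path above performs per BDF.
target_id=0x0a54
bdfs=()
for ctrl in /sys/class/nvme/nvme*; do
    [[ -e "$ctrl" ]] || continue
    bdf=$(basename "$(readlink -f "$ctrl/device")")      # e.g. 0000:65:00.0
    device=$(cat "/sys/bus/pci/devices/$bdf/device")     # e.g. 0x0a54
    if [[ "$device" == "$target_id" ]]; then
        bdfs+=("$bdf")
    fi
done
printf '%s\n' "${bdfs[@]}"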
00:05:34.869 [2024-07-15 17:17:46.138703] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2687776 ] 00:05:35.130 [2024-07-15 17:17:46.224507] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:35.130 [2024-07-15 17:17:46.319046] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:35.700 17:17:46 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:35.700 17:17:46 -- common/autotest_common.sh@862 -- # return 0 00:05:35.700 17:17:46 -- common/autotest_common.sh@1600 -- # bdf_id=0 00:05:35.700 17:17:46 -- common/autotest_common.sh@1601 -- # for bdf in "${bdfs[@]}" 00:05:35.700 17:17:46 -- common/autotest_common.sh@1602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:65:00.0 00:05:38.999 nvme0n1 00:05:38.999 17:17:49 -- common/autotest_common.sh@1604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:05:38.999 [2024-07-15 17:17:50.165280] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:05:38.999 request: 00:05:38.999 { 00:05:38.999 "nvme_ctrlr_name": "nvme0", 00:05:38.999 "password": "test", 00:05:38.999 "method": "bdev_nvme_opal_revert", 00:05:38.999 "req_id": 1 00:05:38.999 } 00:05:38.999 Got JSON-RPC error response 00:05:38.999 response: 00:05:38.999 { 00:05:38.999 "code": -32602, 00:05:38.999 "message": "Invalid parameters" 00:05:38.999 } 00:05:38.999 17:17:50 -- common/autotest_common.sh@1604 -- # true 00:05:38.999 17:17:50 -- common/autotest_common.sh@1605 -- # (( ++bdf_id )) 00:05:38.999 17:17:50 -- common/autotest_common.sh@1608 -- # killprocess 2687776 00:05:38.999 17:17:50 -- common/autotest_common.sh@948 -- # '[' -z 2687776 ']' 00:05:38.999 17:17:50 -- common/autotest_common.sh@952 -- # kill -0 2687776 00:05:38.999 17:17:50 -- common/autotest_common.sh@953 -- # uname 00:05:38.999 17:17:50 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:38.999 17:17:50 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2687776 00:05:38.999 17:17:50 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:38.999 17:17:50 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:38.999 17:17:50 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2687776' 00:05:38.999 killing process with pid 2687776 00:05:38.999 17:17:50 -- common/autotest_common.sh@967 -- # kill 2687776 00:05:38.999 17:17:50 -- common/autotest_common.sh@972 -- # wait 2687776 00:05:41.624 17:17:52 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:05:41.624 17:17:52 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:05:41.624 17:17:52 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:05:41.624 17:17:52 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:05:41.624 17:17:52 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:05:42.195 Restarting all devices. 
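The JSON-RPC exchange above is rpc.py calling bdev_nvme_opal_revert against a controller that has no Opal support, which the target answers with error -32602 ("Invalid parameters"). A small sketch of issuing the same call and tolerating that failure follows; it assumes a running spdk_tgt on the default /var/tmp/spdk.sock and an SPDK checkout at a placeholder path.

#!/usr/bin/env bash
# Reissue the Opal revert RPC seen in the trace. The checkout path is a
# placeholder, and the failure is treated as non-fatal because a controller
# without Opal support legitimately rejects the request.
SPDK_DIR=/path/to/spdk   # assumption: point this at the local SPDK checkout
if ! "$SPDK_DIR/scripts/rpc.py" bdev_nvme_opal_revert -b nvme0 -p test; then
    echo "nvme0 does not support Opal; skipping revert" >&2
fi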
00:05:45.489 lstat() error: No such file or directory 00:05:45.489 QAT Error: No GENERAL section found 00:05:45.489 Failed to configure qat_dev0 00:05:45.489 lstat() error: No such file or directory 00:05:45.489 QAT Error: No GENERAL section found 00:05:45.489 Failed to configure qat_dev1 00:05:45.489 lstat() error: No such file or directory 00:05:45.489 QAT Error: No GENERAL section found 00:05:45.489 Failed to configure qat_dev2 00:05:45.489 enable sriov 00:05:45.748 Checking status of all devices. 00:05:45.748 There is 3 QAT acceleration device(s) in the system: 00:05:45.749 qat_dev0 - type: c6xx, inst_id: 0, node_id: 1, bsf: 0000:cc:00.0, #accel: 5 #engines: 10 state: down 00:05:45.749 qat_dev1 - type: c6xx, inst_id: 1, node_id: 1, bsf: 0000:ce:00.0, #accel: 5 #engines: 10 state: down 00:05:45.749 qat_dev2 - type: c6xx, inst_id: 2, node_id: 1, bsf: 0000:d0:00.0, #accel: 5 #engines: 10 state: down 00:05:46.007 0000:cc:00.0 set to 16 VFs 00:05:46.575 0000:ce:00.0 set to 16 VFs 00:05:47.144 0000:d0:00.0 set to 16 VFs 00:05:47.403 Properly configured the qat device with driver uio_pci_generic. 00:05:47.403 17:17:58 -- spdk/autotest.sh@162 -- # timing_enter lib 00:05:47.403 17:17:58 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:47.403 17:17:58 -- common/autotest_common.sh@10 -- # set +x 00:05:47.403 17:17:58 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:05:47.403 17:17:58 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:05:47.403 17:17:58 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:47.403 17:17:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:47.403 17:17:58 -- common/autotest_common.sh@10 -- # set +x 00:05:47.403 ************************************ 00:05:47.403 START TEST env 00:05:47.403 ************************************ 00:05:47.403 17:17:58 env -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:05:47.403 * Looking for test storage... 
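Before the env tests begin, qat_setup.sh above reports each c6xx endpoint being "set to 16 VFs" and bound to uio_pci_generic. A sketch of the SR-IOV half of that step, using the standard sysfs sriov_numvfs attribute and the three BDFs from the adf_ctl status listing; this is an illustration of the mechanism, not the qat_setup.sh script itself.

#!/usr/bin/env bash
# Enable 16 virtual functions on each QAT physical function listed in the
# adf_ctl status output above.
for bdf in 0000:cc:00.0 0000:ce:00.0 0000:d0:00.0; do
    numvfs="/sys/bus/pci/devices/$bdf/sriov_numvfs"
    [[ -e "$numvfs" ]] || { echo "no SR-IOV support on $bdf" >&2; continue; }
    echo 0  > "$numvfs"     # the VF count may only be changed from zero
    echo 16 > "$numvfs"
    echo "$bdf set to $(cat "$numvfs") VFs"
done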
00:05:47.404 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:05:47.404 17:17:58 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:05:47.404 17:17:58 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:47.404 17:17:58 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:47.404 17:17:58 env -- common/autotest_common.sh@10 -- # set +x 00:05:47.404 ************************************ 00:05:47.404 START TEST env_memory 00:05:47.404 ************************************ 00:05:47.404 17:17:58 env.env_memory -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:05:47.404 00:05:47.404 00:05:47.404 CUnit - A unit testing framework for C - Version 2.1-3 00:05:47.404 http://cunit.sourceforge.net/ 00:05:47.404 00:05:47.404 00:05:47.404 Suite: memory 00:05:47.663 Test: alloc and free memory map ...[2024-07-15 17:17:58.724904] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:47.663 passed 00:05:47.663 Test: mem map translation ...[2024-07-15 17:17:58.748647] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:47.663 [2024-07-15 17:17:58.748677] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:47.663 [2024-07-15 17:17:58.748727] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:47.663 [2024-07-15 17:17:58.748735] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:47.663 passed 00:05:47.663 Test: mem map registration ...[2024-07-15 17:17:58.799753] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:47.663 [2024-07-15 17:17:58.799777] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:47.663 passed 00:05:47.663 Test: mem map adjacent registrations ...passed 00:05:47.663 00:05:47.663 Run Summary: Type Total Ran Passed Failed Inactive 00:05:47.663 suites 1 1 n/a 0 0 00:05:47.663 tests 4 4 4 0 0 00:05:47.663 asserts 152 152 152 0 n/a 00:05:47.663 00:05:47.663 Elapsed time = 0.185 seconds 00:05:47.663 00:05:47.663 real 0m0.198s 00:05:47.663 user 0m0.187s 00:05:47.663 sys 0m0.011s 00:05:47.663 17:17:58 env.env_memory -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:47.663 17:17:58 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:05:47.663 ************************************ 00:05:47.663 END TEST env_memory 00:05:47.663 ************************************ 00:05:47.663 17:17:58 env -- common/autotest_common.sh@1142 -- # return 0 00:05:47.663 17:17:58 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:47.663 17:17:58 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:47.663 17:17:58 env -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:05:47.663 17:17:58 env -- common/autotest_common.sh@10 -- # set +x 00:05:47.664 ************************************ 00:05:47.664 START TEST env_vtophys 00:05:47.664 ************************************ 00:05:47.664 17:17:58 env.env_vtophys -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:47.925 EAL: lib.eal log level changed from notice to debug 00:05:47.925 EAL: Detected lcore 0 as core 0 on socket 0 00:05:47.925 EAL: Detected lcore 1 as core 1 on socket 0 00:05:47.925 EAL: Detected lcore 2 as core 2 on socket 0 00:05:47.925 EAL: Detected lcore 3 as core 3 on socket 0 00:05:47.925 EAL: Detected lcore 4 as core 4 on socket 0 00:05:47.925 EAL: Detected lcore 5 as core 5 on socket 0 00:05:47.925 EAL: Detected lcore 6 as core 6 on socket 0 00:05:47.925 EAL: Detected lcore 7 as core 7 on socket 0 00:05:47.925 EAL: Detected lcore 8 as core 8 on socket 0 00:05:47.925 EAL: Detected lcore 9 as core 9 on socket 0 00:05:47.925 EAL: Detected lcore 10 as core 10 on socket 0 00:05:47.925 EAL: Detected lcore 11 as core 11 on socket 0 00:05:47.925 EAL: Detected lcore 12 as core 12 on socket 0 00:05:47.925 EAL: Detected lcore 13 as core 13 on socket 0 00:05:47.925 EAL: Detected lcore 14 as core 14 on socket 0 00:05:47.925 EAL: Detected lcore 15 as core 15 on socket 0 00:05:47.925 EAL: Detected lcore 16 as core 16 on socket 0 00:05:47.925 EAL: Detected lcore 17 as core 17 on socket 0 00:05:47.925 EAL: Detected lcore 18 as core 18 on socket 0 00:05:47.925 EAL: Detected lcore 19 as core 19 on socket 0 00:05:47.925 EAL: Detected lcore 20 as core 20 on socket 0 00:05:47.925 EAL: Detected lcore 21 as core 21 on socket 0 00:05:47.925 EAL: Detected lcore 22 as core 22 on socket 0 00:05:47.925 EAL: Detected lcore 23 as core 23 on socket 0 00:05:47.925 EAL: Detected lcore 24 as core 24 on socket 0 00:05:47.925 EAL: Detected lcore 25 as core 25 on socket 0 00:05:47.925 EAL: Detected lcore 26 as core 26 on socket 0 00:05:47.925 EAL: Detected lcore 27 as core 27 on socket 0 00:05:47.925 EAL: Detected lcore 28 as core 28 on socket 0 00:05:47.925 EAL: Detected lcore 29 as core 29 on socket 0 00:05:47.925 EAL: Detected lcore 30 as core 30 on socket 0 00:05:47.925 EAL: Detected lcore 31 as core 31 on socket 0 00:05:47.925 EAL: Detected lcore 32 as core 0 on socket 1 00:05:47.925 EAL: Detected lcore 33 as core 1 on socket 1 00:05:47.925 EAL: Detected lcore 34 as core 2 on socket 1 00:05:47.925 EAL: Detected lcore 35 as core 3 on socket 1 00:05:47.925 EAL: Detected lcore 36 as core 4 on socket 1 00:05:47.925 EAL: Detected lcore 37 as core 5 on socket 1 00:05:47.925 EAL: Detected lcore 38 as core 6 on socket 1 00:05:47.925 EAL: Detected lcore 39 as core 7 on socket 1 00:05:47.925 EAL: Detected lcore 40 as core 8 on socket 1 00:05:47.925 EAL: Detected lcore 41 as core 9 on socket 1 00:05:47.925 EAL: Detected lcore 42 as core 10 on socket 1 00:05:47.925 EAL: Detected lcore 43 as core 11 on socket 1 00:05:47.925 EAL: Detected lcore 44 as core 12 on socket 1 00:05:47.925 EAL: Detected lcore 45 as core 13 on socket 1 00:05:47.925 EAL: Detected lcore 46 as core 14 on socket 1 00:05:47.925 EAL: Detected lcore 47 as core 15 on socket 1 00:05:47.925 EAL: Detected lcore 48 as core 16 on socket 1 00:05:47.925 EAL: Detected lcore 49 as core 17 on socket 1 00:05:47.925 EAL: Detected lcore 50 as core 18 on socket 1 00:05:47.925 EAL: Detected lcore 51 as core 19 on socket 1 00:05:47.925 EAL: Detected lcore 52 as core 
20 on socket 1 00:05:47.925 EAL: Detected lcore 53 as core 21 on socket 1 00:05:47.925 EAL: Detected lcore 54 as core 22 on socket 1 00:05:47.925 EAL: Detected lcore 55 as core 23 on socket 1 00:05:47.925 EAL: Detected lcore 56 as core 24 on socket 1 00:05:47.925 EAL: Detected lcore 57 as core 25 on socket 1 00:05:47.925 EAL: Detected lcore 58 as core 26 on socket 1 00:05:47.925 EAL: Detected lcore 59 as core 27 on socket 1 00:05:47.925 EAL: Detected lcore 60 as core 28 on socket 1 00:05:47.925 EAL: Detected lcore 61 as core 29 on socket 1 00:05:47.925 EAL: Detected lcore 62 as core 30 on socket 1 00:05:47.925 EAL: Detected lcore 63 as core 31 on socket 1 00:05:47.925 EAL: Detected lcore 64 as core 0 on socket 0 00:05:47.925 EAL: Detected lcore 65 as core 1 on socket 0 00:05:47.925 EAL: Detected lcore 66 as core 2 on socket 0 00:05:47.925 EAL: Detected lcore 67 as core 3 on socket 0 00:05:47.925 EAL: Detected lcore 68 as core 4 on socket 0 00:05:47.925 EAL: Detected lcore 69 as core 5 on socket 0 00:05:47.925 EAL: Detected lcore 70 as core 6 on socket 0 00:05:47.925 EAL: Detected lcore 71 as core 7 on socket 0 00:05:47.925 EAL: Detected lcore 72 as core 8 on socket 0 00:05:47.925 EAL: Detected lcore 73 as core 9 on socket 0 00:05:47.925 EAL: Detected lcore 74 as core 10 on socket 0 00:05:47.926 EAL: Detected lcore 75 as core 11 on socket 0 00:05:47.926 EAL: Detected lcore 76 as core 12 on socket 0 00:05:47.926 EAL: Detected lcore 77 as core 13 on socket 0 00:05:47.926 EAL: Detected lcore 78 as core 14 on socket 0 00:05:47.926 EAL: Detected lcore 79 as core 15 on socket 0 00:05:47.926 EAL: Detected lcore 80 as core 16 on socket 0 00:05:47.926 EAL: Detected lcore 81 as core 17 on socket 0 00:05:47.926 EAL: Detected lcore 82 as core 18 on socket 0 00:05:47.926 EAL: Detected lcore 83 as core 19 on socket 0 00:05:47.926 EAL: Detected lcore 84 as core 20 on socket 0 00:05:47.926 EAL: Detected lcore 85 as core 21 on socket 0 00:05:47.926 EAL: Detected lcore 86 as core 22 on socket 0 00:05:47.926 EAL: Detected lcore 87 as core 23 on socket 0 00:05:47.926 EAL: Detected lcore 88 as core 24 on socket 0 00:05:47.926 EAL: Detected lcore 89 as core 25 on socket 0 00:05:47.926 EAL: Detected lcore 90 as core 26 on socket 0 00:05:47.926 EAL: Detected lcore 91 as core 27 on socket 0 00:05:47.926 EAL: Detected lcore 92 as core 28 on socket 0 00:05:47.926 EAL: Detected lcore 93 as core 29 on socket 0 00:05:47.926 EAL: Detected lcore 94 as core 30 on socket 0 00:05:47.926 EAL: Detected lcore 95 as core 31 on socket 0 00:05:47.926 EAL: Detected lcore 96 as core 0 on socket 1 00:05:47.926 EAL: Detected lcore 97 as core 1 on socket 1 00:05:47.926 EAL: Detected lcore 98 as core 2 on socket 1 00:05:47.926 EAL: Detected lcore 99 as core 3 on socket 1 00:05:47.926 EAL: Detected lcore 100 as core 4 on socket 1 00:05:47.926 EAL: Detected lcore 101 as core 5 on socket 1 00:05:47.926 EAL: Detected lcore 102 as core 6 on socket 1 00:05:47.926 EAL: Detected lcore 103 as core 7 on socket 1 00:05:47.926 EAL: Detected lcore 104 as core 8 on socket 1 00:05:47.926 EAL: Detected lcore 105 as core 9 on socket 1 00:05:47.926 EAL: Detected lcore 106 as core 10 on socket 1 00:05:47.926 EAL: Detected lcore 107 as core 11 on socket 1 00:05:47.926 EAL: Detected lcore 108 as core 12 on socket 1 00:05:47.926 EAL: Detected lcore 109 as core 13 on socket 1 00:05:47.926 EAL: Detected lcore 110 as core 14 on socket 1 00:05:47.926 EAL: Detected lcore 111 as core 15 on socket 1 00:05:47.926 EAL: Detected lcore 112 as core 16 on socket 1 
00:05:47.926 EAL: Detected lcore 113 as core 17 on socket 1 00:05:47.926 EAL: Detected lcore 114 as core 18 on socket 1 00:05:47.926 EAL: Detected lcore 115 as core 19 on socket 1 00:05:47.926 EAL: Detected lcore 116 as core 20 on socket 1 00:05:47.926 EAL: Detected lcore 117 as core 21 on socket 1 00:05:47.926 EAL: Detected lcore 118 as core 22 on socket 1 00:05:47.926 EAL: Detected lcore 119 as core 23 on socket 1 00:05:47.926 EAL: Detected lcore 120 as core 24 on socket 1 00:05:47.926 EAL: Detected lcore 121 as core 25 on socket 1 00:05:47.926 EAL: Detected lcore 122 as core 26 on socket 1 00:05:47.926 EAL: Detected lcore 123 as core 27 on socket 1 00:05:47.926 EAL: Detected lcore 124 as core 28 on socket 1 00:05:47.926 EAL: Detected lcore 125 as core 29 on socket 1 00:05:47.926 EAL: Detected lcore 126 as core 30 on socket 1 00:05:47.926 EAL: Detected lcore 127 as core 31 on socket 1 00:05:47.926 EAL: Maximum logical cores by configuration: 128 00:05:47.926 EAL: Detected CPU lcores: 128 00:05:47.926 EAL: Detected NUMA nodes: 2 00:05:47.926 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:05:47.926 EAL: Detected shared linkage of DPDK 00:05:47.926 EAL: No shared files mode enabled, IPC will be disabled 00:05:47.926 EAL: No shared files mode enabled, IPC is disabled 00:05:47.926 EAL: PCI driver qat for device 0000:cc:01.0 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:cc:01.1 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:cc:01.2 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:cc:01.3 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:cc:01.4 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:cc:01.5 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:cc:01.6 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:cc:01.7 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:cc:02.0 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:cc:02.1 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:cc:02.2 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:cc:02.3 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:cc:02.4 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:cc:02.5 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:cc:02.6 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:cc:02.7 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:ce:01.0 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:ce:01.1 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:ce:01.2 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:ce:01.3 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:ce:01.4 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:ce:01.5 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:ce:01.6 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:ce:01.7 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:ce:02.0 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:ce:02.1 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:ce:02.2 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:ce:02.3 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:ce:02.4 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:ce:02.5 
wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:ce:02.6 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:ce:02.7 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:d0:01.0 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:d0:01.1 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:d0:01.2 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:d0:01.3 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:d0:01.4 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:d0:01.5 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:d0:01.6 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:d0:01.7 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:d0:02.0 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:d0:02.1 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:d0:02.2 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:d0:02.3 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:d0:02.4 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:d0:02.5 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:d0:02.6 wants IOVA as 'PA' 00:05:47.926 EAL: PCI driver qat for device 0000:d0:02.7 wants IOVA as 'PA' 00:05:47.926 EAL: Bus pci wants IOVA as 'PA' 00:05:47.926 EAL: Bus auxiliary wants IOVA as 'DC' 00:05:47.926 EAL: Bus vdev wants IOVA as 'DC' 00:05:47.926 EAL: Selected IOVA mode 'PA' 00:05:47.926 EAL: Probing VFIO support... 00:05:47.926 EAL: IOMMU type 1 (Type 1) is supported 00:05:47.926 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:47.926 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:47.926 EAL: VFIO support initialized 00:05:47.926 EAL: Ask a virtual area of 0x2e000 bytes 00:05:47.926 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:47.926 EAL: Setting up physically contiguous memory... 
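The virtual-area reservations that follow are sized from the memseg-list parameters printed above: each list holds 8192 segments of 2 MiB hugepages, i.e. 0x400000000 bytes, with a small 0x61000-byte area reserved alongside it for the list itself. A quick arithmetic check (plain shell, not DPDK code):

# 8192 hugepage-sized segments of 2 MiB each per memseg list
echo $(( 8192 * 2 * 1024 * 1024 ))              # 17179869184 bytes
printf '0x%x\n' $(( 8192 * 2 * 1024 * 1024 ))   # 0x400000000, matching each reservation below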
00:05:47.926 EAL: Setting maximum number of open files to 524288 00:05:47.926 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:47.926 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:47.926 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:47.926 EAL: Ask a virtual area of 0x61000 bytes 00:05:47.926 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:47.926 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:47.926 EAL: Ask a virtual area of 0x400000000 bytes 00:05:47.926 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:47.926 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:47.926 EAL: Ask a virtual area of 0x61000 bytes 00:05:47.926 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:47.926 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:47.926 EAL: Ask a virtual area of 0x400000000 bytes 00:05:47.926 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:47.926 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:47.926 EAL: Ask a virtual area of 0x61000 bytes 00:05:47.926 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:47.926 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:47.926 EAL: Ask a virtual area of 0x400000000 bytes 00:05:47.926 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:47.926 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:47.926 EAL: Ask a virtual area of 0x61000 bytes 00:05:47.926 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:47.926 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:47.926 EAL: Ask a virtual area of 0x400000000 bytes 00:05:47.926 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:47.926 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:47.926 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:47.926 EAL: Ask a virtual area of 0x61000 bytes 00:05:47.926 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:47.926 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:47.926 EAL: Ask a virtual area of 0x400000000 bytes 00:05:47.926 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:47.926 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:47.926 EAL: Ask a virtual area of 0x61000 bytes 00:05:47.926 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:47.926 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:47.926 EAL: Ask a virtual area of 0x400000000 bytes 00:05:47.926 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:47.926 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:47.926 EAL: Ask a virtual area of 0x61000 bytes 00:05:47.926 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:47.926 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:47.926 EAL: Ask a virtual area of 0x400000000 bytes 00:05:47.926 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:05:47.926 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:47.926 EAL: Ask a virtual area of 0x61000 bytes 00:05:47.926 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:47.926 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:47.926 EAL: Ask a virtual area of 0x400000000 bytes 00:05:47.926 EAL: Virtual area found 
at 0x201c01000000 (size = 0x400000000) 00:05:47.926 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:47.926 EAL: Hugepages will be freed exactly as allocated. 00:05:47.926 EAL: No shared files mode enabled, IPC is disabled 00:05:47.926 EAL: No shared files mode enabled, IPC is disabled 00:05:47.926 EAL: TSC frequency is ~2600000 KHz 00:05:47.926 EAL: Main lcore 0 is ready (tid=7f5387307b00;cpuset=[0]) 00:05:47.926 EAL: Trying to obtain current memory policy. 00:05:47.926 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:47.926 EAL: Restoring previous memory policy: 0 00:05:47.926 EAL: request: mp_malloc_sync 00:05:47.926 EAL: No shared files mode enabled, IPC is disabled 00:05:47.926 EAL: Heap on socket 0 was expanded by 2MB 00:05:47.926 EAL: PCI device 0000:cc:01.0 on NUMA socket 1 00:05:47.926 EAL: probe driver: 8086:37c9 qat 00:05:47.926 EAL: PCI memory mapped at 0x202001000000 00:05:47.927 EAL: PCI memory mapped at 0x202001001000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.0 (socket 1) 00:05:47.927 EAL: Trying to obtain current memory policy. 00:05:47.927 EAL: Setting policy MPOL_PREFERRED for socket 1 00:05:47.927 EAL: Restoring previous memory policy: 4 00:05:47.927 EAL: request: mp_malloc_sync 00:05:47.927 EAL: No shared files mode enabled, IPC is disabled 00:05:47.927 EAL: Heap on socket 1 was expanded by 2MB 00:05:47.927 EAL: PCI device 0000:cc:01.1 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x202001002000 00:05:47.927 EAL: PCI memory mapped at 0x202001003000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.1 (socket 1) 00:05:47.927 EAL: PCI device 0000:cc:01.2 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x202001004000 00:05:47.927 EAL: PCI memory mapped at 0x202001005000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.2 (socket 1) 00:05:47.927 EAL: PCI device 0000:cc:01.3 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x202001006000 00:05:47.927 EAL: PCI memory mapped at 0x202001007000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.3 (socket 1) 00:05:47.927 EAL: PCI device 0000:cc:01.4 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x202001008000 00:05:47.927 EAL: PCI memory mapped at 0x202001009000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.4 (socket 1) 00:05:47.927 EAL: PCI device 0000:cc:01.5 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x20200100a000 00:05:47.927 EAL: PCI memory mapped at 0x20200100b000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.5 (socket 1) 00:05:47.927 EAL: PCI device 0000:cc:01.6 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x20200100c000 00:05:47.927 EAL: PCI memory mapped at 0x20200100d000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.6 (socket 1) 00:05:47.927 EAL: PCI device 0000:cc:01.7 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x20200100e000 00:05:47.927 EAL: PCI memory mapped at 0x20200100f000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.7 (socket 1) 00:05:47.927 EAL: PCI device 0000:cc:02.0 on NUMA socket 1 00:05:47.927 EAL: 
probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x202001010000 00:05:47.927 EAL: PCI memory mapped at 0x202001011000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.0 (socket 1) 00:05:47.927 EAL: PCI device 0000:cc:02.1 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x202001012000 00:05:47.927 EAL: PCI memory mapped at 0x202001013000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.1 (socket 1) 00:05:47.927 EAL: PCI device 0000:cc:02.2 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x202001014000 00:05:47.927 EAL: PCI memory mapped at 0x202001015000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.2 (socket 1) 00:05:47.927 EAL: PCI device 0000:cc:02.3 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x202001016000 00:05:47.927 EAL: PCI memory mapped at 0x202001017000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.3 (socket 1) 00:05:47.927 EAL: PCI device 0000:cc:02.4 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x202001018000 00:05:47.927 EAL: PCI memory mapped at 0x202001019000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.4 (socket 1) 00:05:47.927 EAL: PCI device 0000:cc:02.5 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x20200101a000 00:05:47.927 EAL: PCI memory mapped at 0x20200101b000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.5 (socket 1) 00:05:47.927 EAL: PCI device 0000:cc:02.6 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x20200101c000 00:05:47.927 EAL: PCI memory mapped at 0x20200101d000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.6 (socket 1) 00:05:47.927 EAL: PCI device 0000:cc:02.7 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x20200101e000 00:05:47.927 EAL: PCI memory mapped at 0x20200101f000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.7 (socket 1) 00:05:47.927 EAL: PCI device 0000:ce:01.0 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x202001020000 00:05:47.927 EAL: PCI memory mapped at 0x202001021000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.0 (socket 1) 00:05:47.927 EAL: PCI device 0000:ce:01.1 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x202001022000 00:05:47.927 EAL: PCI memory mapped at 0x202001023000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.1 (socket 1) 00:05:47.927 EAL: PCI device 0000:ce:01.2 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x202001024000 00:05:47.927 EAL: PCI memory mapped at 0x202001025000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.2 (socket 1) 00:05:47.927 EAL: PCI device 0000:ce:01.3 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x202001026000 00:05:47.927 EAL: PCI memory mapped at 0x202001027000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.3 (socket 1) 00:05:47.927 EAL: PCI device 0000:ce:01.4 on NUMA socket 1 
00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x202001028000 00:05:47.927 EAL: PCI memory mapped at 0x202001029000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.4 (socket 1) 00:05:47.927 EAL: PCI device 0000:ce:01.5 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x20200102a000 00:05:47.927 EAL: PCI memory mapped at 0x20200102b000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.5 (socket 1) 00:05:47.927 EAL: PCI device 0000:ce:01.6 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x20200102c000 00:05:47.927 EAL: PCI memory mapped at 0x20200102d000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.6 (socket 1) 00:05:47.927 EAL: PCI device 0000:ce:01.7 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x20200102e000 00:05:47.927 EAL: PCI memory mapped at 0x20200102f000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.7 (socket 1) 00:05:47.927 EAL: PCI device 0000:ce:02.0 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x202001030000 00:05:47.927 EAL: PCI memory mapped at 0x202001031000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.0 (socket 1) 00:05:47.927 EAL: PCI device 0000:ce:02.1 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x202001032000 00:05:47.927 EAL: PCI memory mapped at 0x202001033000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.1 (socket 1) 00:05:47.927 EAL: PCI device 0000:ce:02.2 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x202001034000 00:05:47.927 EAL: PCI memory mapped at 0x202001035000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.2 (socket 1) 00:05:47.927 EAL: PCI device 0000:ce:02.3 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x202001036000 00:05:47.927 EAL: PCI memory mapped at 0x202001037000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.3 (socket 1) 00:05:47.927 EAL: PCI device 0000:ce:02.4 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x202001038000 00:05:47.927 EAL: PCI memory mapped at 0x202001039000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.4 (socket 1) 00:05:47.927 EAL: PCI device 0000:ce:02.5 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x20200103a000 00:05:47.927 EAL: PCI memory mapped at 0x20200103b000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.5 (socket 1) 00:05:47.927 EAL: PCI device 0000:ce:02.6 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x20200103c000 00:05:47.927 EAL: PCI memory mapped at 0x20200103d000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.6 (socket 1) 00:05:47.927 EAL: PCI device 0000:ce:02.7 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x20200103e000 00:05:47.927 EAL: PCI memory mapped at 0x20200103f000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.7 (socket 1) 00:05:47.927 EAL: PCI device 0000:d0:01.0 on NUMA 
socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x202001040000 00:05:47.927 EAL: PCI memory mapped at 0x202001041000 00:05:47.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.0 (socket 1) 00:05:47.927 EAL: PCI device 0000:d0:01.1 on NUMA socket 1 00:05:47.927 EAL: probe driver: 8086:37c9 qat 00:05:47.927 EAL: PCI memory mapped at 0x202001042000 00:05:47.927 EAL: PCI memory mapped at 0x202001043000 00:05:47.928 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.1 (socket 1) 00:05:47.928 EAL: PCI device 0000:d0:01.2 on NUMA socket 1 00:05:47.928 EAL: probe driver: 8086:37c9 qat 00:05:47.928 EAL: PCI memory mapped at 0x202001044000 00:05:47.928 EAL: PCI memory mapped at 0x202001045000 00:05:47.928 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.2 (socket 1) 00:05:47.928 EAL: PCI device 0000:d0:01.3 on NUMA socket 1 00:05:47.928 EAL: probe driver: 8086:37c9 qat 00:05:47.928 EAL: PCI memory mapped at 0x202001046000 00:05:47.928 EAL: PCI memory mapped at 0x202001047000 00:05:47.928 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.3 (socket 1) 00:05:47.928 EAL: PCI device 0000:d0:01.4 on NUMA socket 1 00:05:47.928 EAL: probe driver: 8086:37c9 qat 00:05:47.928 EAL: PCI memory mapped at 0x202001048000 00:05:47.928 EAL: PCI memory mapped at 0x202001049000 00:05:47.928 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.4 (socket 1) 00:05:47.928 EAL: PCI device 0000:d0:01.5 on NUMA socket 1 00:05:47.928 EAL: probe driver: 8086:37c9 qat 00:05:47.928 EAL: PCI memory mapped at 0x20200104a000 00:05:47.928 EAL: PCI memory mapped at 0x20200104b000 00:05:47.928 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.5 (socket 1) 00:05:47.928 EAL: PCI device 0000:d0:01.6 on NUMA socket 1 00:05:47.928 EAL: probe driver: 8086:37c9 qat 00:05:47.928 EAL: PCI memory mapped at 0x20200104c000 00:05:47.928 EAL: PCI memory mapped at 0x20200104d000 00:05:47.928 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.6 (socket 1) 00:05:47.928 EAL: PCI device 0000:d0:01.7 on NUMA socket 1 00:05:47.928 EAL: probe driver: 8086:37c9 qat 00:05:47.928 EAL: PCI memory mapped at 0x20200104e000 00:05:47.928 EAL: PCI memory mapped at 0x20200104f000 00:05:47.928 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.7 (socket 1) 00:05:47.928 EAL: PCI device 0000:d0:02.0 on NUMA socket 1 00:05:47.928 EAL: probe driver: 8086:37c9 qat 00:05:47.928 EAL: PCI memory mapped at 0x202001050000 00:05:47.928 EAL: PCI memory mapped at 0x202001051000 00:05:47.928 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.0 (socket 1) 00:05:47.928 EAL: PCI device 0000:d0:02.1 on NUMA socket 1 00:05:47.928 EAL: probe driver: 8086:37c9 qat 00:05:47.928 EAL: PCI memory mapped at 0x202001052000 00:05:47.928 EAL: PCI memory mapped at 0x202001053000 00:05:47.928 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.1 (socket 1) 00:05:47.928 EAL: PCI device 0000:d0:02.2 on NUMA socket 1 00:05:47.928 EAL: probe driver: 8086:37c9 qat 00:05:47.928 EAL: PCI memory mapped at 0x202001054000 00:05:47.928 EAL: PCI memory mapped at 0x202001055000 00:05:47.928 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.2 (socket 1) 00:05:47.928 EAL: PCI device 0000:d0:02.3 on NUMA socket 1 00:05:47.928 EAL: probe driver: 8086:37c9 qat 00:05:47.928 EAL: PCI memory mapped at 0x202001056000 00:05:47.928 EAL: PCI memory mapped at 0x202001057000 00:05:47.928 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.3 (socket 1) 00:05:47.928 EAL: PCI device 
0000:d0:02.4 on NUMA socket 1 00:05:47.928 EAL: probe driver: 8086:37c9 qat 00:05:47.928 EAL: PCI memory mapped at 0x202001058000 00:05:47.928 EAL: PCI memory mapped at 0x202001059000 00:05:47.928 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.4 (socket 1) 00:05:47.928 EAL: PCI device 0000:d0:02.5 on NUMA socket 1 00:05:47.928 EAL: probe driver: 8086:37c9 qat 00:05:47.928 EAL: PCI memory mapped at 0x20200105a000 00:05:47.928 EAL: PCI memory mapped at 0x20200105b000 00:05:47.928 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.5 (socket 1) 00:05:47.928 EAL: PCI device 0000:d0:02.6 on NUMA socket 1 00:05:47.928 EAL: probe driver: 8086:37c9 qat 00:05:47.928 EAL: PCI memory mapped at 0x20200105c000 00:05:47.928 EAL: PCI memory mapped at 0x20200105d000 00:05:47.928 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.6 (socket 1) 00:05:47.928 EAL: PCI device 0000:d0:02.7 on NUMA socket 1 00:05:47.928 EAL: probe driver: 8086:37c9 qat 00:05:47.928 EAL: PCI memory mapped at 0x20200105e000 00:05:47.928 EAL: PCI memory mapped at 0x20200105f000 00:05:47.928 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.7 (socket 1) 00:05:47.928 EAL: No shared files mode enabled, IPC is disabled 00:05:47.928 EAL: No shared files mode enabled, IPC is disabled 00:05:47.928 EAL: No PCI address specified using 'addr=' in: bus=pci 00:05:47.928 EAL: Mem event callback 'spdk:(nil)' registered 00:05:47.928 00:05:47.928 00:05:47.928 CUnit - A unit testing framework for C - Version 2.1-3 00:05:47.928 http://cunit.sourceforge.net/ 00:05:47.928 00:05:47.928 00:05:47.928 Suite: components_suite 00:05:47.928 Test: vtophys_malloc_test ...passed 00:05:47.928 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:47.928 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:47.928 EAL: Restoring previous memory policy: 4 00:05:47.928 EAL: Calling mem event callback 'spdk:(nil)' 00:05:47.928 EAL: request: mp_malloc_sync 00:05:47.928 EAL: No shared files mode enabled, IPC is disabled 00:05:47.928 EAL: Heap on socket 0 was expanded by 4MB 00:05:47.928 EAL: Calling mem event callback 'spdk:(nil)' 00:05:47.928 EAL: request: mp_malloc_sync 00:05:47.928 EAL: No shared files mode enabled, IPC is disabled 00:05:47.928 EAL: Heap on socket 0 was shrunk by 4MB 00:05:47.928 EAL: Trying to obtain current memory policy. 00:05:47.928 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:47.928 EAL: Restoring previous memory policy: 4 00:05:47.928 EAL: Calling mem event callback 'spdk:(nil)' 00:05:47.928 EAL: request: mp_malloc_sync 00:05:47.928 EAL: No shared files mode enabled, IPC is disabled 00:05:47.928 EAL: Heap on socket 0 was expanded by 6MB 00:05:47.928 EAL: Calling mem event callback 'spdk:(nil)' 00:05:47.928 EAL: request: mp_malloc_sync 00:05:47.928 EAL: No shared files mode enabled, IPC is disabled 00:05:47.928 EAL: Heap on socket 0 was shrunk by 6MB 00:05:47.928 EAL: Trying to obtain current memory policy. 
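The alternating "Heap on socket 0 was expanded/shrunk by N MB" and "Calling mem event callback 'spdk:(nil)'" lines come from DPDK's dynamic memory subsystem: whenever hugepages are added to or released from a heap, EAL invokes every registered memory-event callback (SPDK registers one named 'spdk'). A minimal sketch of such a listener using the public DPDK API follows; the callback body and the allocation size are illustrative, not SPDK's actual handler.

#include <stdio.h>
#include <rte_memory.h>
#include <rte_malloc.h>

/* Called by EAL every time memory is hot-added to or removed from a heap. */
static void mem_event_cb(enum rte_mem_event event, const void *addr,
                         size_t len, void *arg)
{
    (void)arg;
    printf("%s: addr=%p len=%zu\n",
           event == RTE_MEM_EVENT_ALLOC ? "alloc" : "free", addr, len);
}

static void exercise_heap(void)
{
    /* Registration mirrors the 'spdk:(nil)' name seen in the log. */
    rte_mem_event_callback_register("example", mem_event_cb, NULL);

    /* A large allocation forces EAL to grow the heap (ALLOC event);
     * freeing it typically triggers the matching FREE event. */
    void *p = rte_malloc(NULL, 64 * 1024 * 1024, 0);
    rte_free(p);

    rte_mem_event_callback_unregister("example", NULL);
}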
00:05:47.928 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:47.928 EAL: Restoring previous memory policy: 4 00:05:47.928 EAL: Calling mem event callback 'spdk:(nil)' 00:05:47.928 EAL: request: mp_malloc_sync 00:05:47.928 EAL: No shared files mode enabled, IPC is disabled 00:05:47.928 EAL: Heap on socket 0 was expanded by 10MB 00:05:47.928 EAL: Calling mem event callback 'spdk:(nil)' 00:05:47.928 EAL: request: mp_malloc_sync 00:05:47.928 EAL: No shared files mode enabled, IPC is disabled 00:05:47.928 EAL: Heap on socket 0 was shrunk by 10MB 00:05:47.928 EAL: Trying to obtain current memory policy. 00:05:47.928 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:47.928 EAL: Restoring previous memory policy: 4 00:05:47.928 EAL: Calling mem event callback 'spdk:(nil)' 00:05:47.928 EAL: request: mp_malloc_sync 00:05:47.928 EAL: No shared files mode enabled, IPC is disabled 00:05:47.928 EAL: Heap on socket 0 was expanded by 18MB 00:05:47.928 EAL: Calling mem event callback 'spdk:(nil)' 00:05:47.928 EAL: request: mp_malloc_sync 00:05:47.928 EAL: No shared files mode enabled, IPC is disabled 00:05:47.928 EAL: Heap on socket 0 was shrunk by 18MB 00:05:47.928 EAL: Trying to obtain current memory policy. 00:05:47.928 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:47.928 EAL: Restoring previous memory policy: 4 00:05:47.928 EAL: Calling mem event callback 'spdk:(nil)' 00:05:47.928 EAL: request: mp_malloc_sync 00:05:47.928 EAL: No shared files mode enabled, IPC is disabled 00:05:47.928 EAL: Heap on socket 0 was expanded by 34MB 00:05:47.928 EAL: Calling mem event callback 'spdk:(nil)' 00:05:47.928 EAL: request: mp_malloc_sync 00:05:47.928 EAL: No shared files mode enabled, IPC is disabled 00:05:47.928 EAL: Heap on socket 0 was shrunk by 34MB 00:05:47.928 EAL: Trying to obtain current memory policy. 00:05:47.928 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:47.928 EAL: Restoring previous memory policy: 4 00:05:47.928 EAL: Calling mem event callback 'spdk:(nil)' 00:05:47.928 EAL: request: mp_malloc_sync 00:05:47.928 EAL: No shared files mode enabled, IPC is disabled 00:05:47.928 EAL: Heap on socket 0 was expanded by 66MB 00:05:47.928 EAL: Calling mem event callback 'spdk:(nil)' 00:05:47.928 EAL: request: mp_malloc_sync 00:05:47.928 EAL: No shared files mode enabled, IPC is disabled 00:05:47.928 EAL: Heap on socket 0 was shrunk by 66MB 00:05:47.928 EAL: Trying to obtain current memory policy. 00:05:47.928 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:47.928 EAL: Restoring previous memory policy: 4 00:05:47.928 EAL: Calling mem event callback 'spdk:(nil)' 00:05:47.928 EAL: request: mp_malloc_sync 00:05:47.928 EAL: No shared files mode enabled, IPC is disabled 00:05:47.928 EAL: Heap on socket 0 was expanded by 130MB 00:05:47.928 EAL: Calling mem event callback 'spdk:(nil)' 00:05:47.928 EAL: request: mp_malloc_sync 00:05:47.928 EAL: No shared files mode enabled, IPC is disabled 00:05:47.928 EAL: Heap on socket 0 was shrunk by 130MB 00:05:47.928 EAL: Trying to obtain current memory policy. 
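The "components_suite" header above and the Run Summary that follows are standard CUnit (2.1-3) output. Roughly, the test binary registers a suite and its tests and then runs them through the basic interface, along these lines; the test bodies below are placeholders, not the real vtophys tests.

#include <CUnit/CUnit.h>
#include <CUnit/Basic.h>

static void vtophys_malloc_test(void)      { CU_ASSERT(1); }  /* placeholder */
static void vtophys_spdk_malloc_test(void) { CU_ASSERT(1); }  /* placeholder */

int main(void)
{
    if (CU_initialize_registry() != CUE_SUCCESS)
        return CU_get_error();

    CU_pSuite suite = CU_add_suite("components_suite", NULL, NULL);
    if (!suite)
        return CU_get_error();
    CU_add_test(suite, "vtophys_malloc_test", vtophys_malloc_test);
    CU_add_test(suite, "vtophys_spdk_malloc_test", vtophys_spdk_malloc_test);

    CU_basic_set_mode(CU_BRM_VERBOSE);
    CU_basic_run_tests();   /* prints the Run Summary table seen in the log */
    unsigned failures = CU_get_number_of_failures();
    CU_cleanup_registry();
    return failures == 0 ? 0 : 1;
}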
00:05:47.928 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:47.928 EAL: Restoring previous memory policy: 4 00:05:47.928 EAL: Calling mem event callback 'spdk:(nil)' 00:05:47.928 EAL: request: mp_malloc_sync 00:05:47.928 EAL: No shared files mode enabled, IPC is disabled 00:05:47.928 EAL: Heap on socket 0 was expanded by 258MB 00:05:47.928 EAL: Calling mem event callback 'spdk:(nil)' 00:05:48.188 EAL: request: mp_malloc_sync 00:05:48.188 EAL: No shared files mode enabled, IPC is disabled 00:05:48.188 EAL: Heap on socket 0 was shrunk by 258MB 00:05:48.188 EAL: Trying to obtain current memory policy. 00:05:48.188 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:48.188 EAL: Restoring previous memory policy: 4 00:05:48.188 EAL: Calling mem event callback 'spdk:(nil)' 00:05:48.188 EAL: request: mp_malloc_sync 00:05:48.188 EAL: No shared files mode enabled, IPC is disabled 00:05:48.188 EAL: Heap on socket 0 was expanded by 514MB 00:05:48.188 EAL: Calling mem event callback 'spdk:(nil)' 00:05:48.188 EAL: request: mp_malloc_sync 00:05:48.188 EAL: No shared files mode enabled, IPC is disabled 00:05:48.188 EAL: Heap on socket 0 was shrunk by 514MB 00:05:48.188 EAL: Trying to obtain current memory policy. 00:05:48.188 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:48.447 EAL: Restoring previous memory policy: 4 00:05:48.447 EAL: Calling mem event callback 'spdk:(nil)' 00:05:48.447 EAL: request: mp_malloc_sync 00:05:48.447 EAL: No shared files mode enabled, IPC is disabled 00:05:48.447 EAL: Heap on socket 0 was expanded by 1026MB 00:05:48.447 EAL: Calling mem event callback 'spdk:(nil)' 00:05:48.708 EAL: request: mp_malloc_sync 00:05:48.708 EAL: No shared files mode enabled, IPC is disabled 00:05:48.708 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:48.708 passed 00:05:48.708 00:05:48.708 Run Summary: Type Total Ran Passed Failed Inactive 00:05:48.708 suites 1 1 n/a 0 0 00:05:48.708 tests 2 2 2 0 0 00:05:48.708 asserts 6443 6443 6443 0 n/a 00:05:48.708 00:05:48.708 Elapsed time = 0.676 seconds 00:05:48.708 EAL: No shared files mode enabled, IPC is disabled 00:05:48.708 EAL: No shared files mode enabled, IPC is disabled 00:05:48.708 EAL: No shared files mode enabled, IPC is disabled 00:05:48.708 00:05:48.708 real 0m0.818s 00:05:48.708 user 0m0.427s 00:05:48.708 sys 0m0.359s 00:05:48.708 17:17:59 env.env_vtophys -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:48.708 17:17:59 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:05:48.708 ************************************ 00:05:48.708 END TEST env_vtophys 00:05:48.708 ************************************ 00:05:48.708 17:17:59 env -- common/autotest_common.sh@1142 -- # return 0 00:05:48.708 17:17:59 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:05:48.708 17:17:59 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:48.708 17:17:59 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:48.708 17:17:59 env -- common/autotest_common.sh@10 -- # set +x 00:05:48.708 ************************************ 00:05:48.708 START TEST env_pci 00:05:48.708 ************************************ 00:05:48.708 17:17:59 env.env_pci -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:05:48.708 00:05:48.708 00:05:48.708 CUnit - A unit testing framework for C - Version 2.1-3 00:05:48.708 http://cunit.sourceforge.net/ 00:05:48.708 00:05:48.708 00:05:48.708 Suite: pci 00:05:48.708 Test: 
pci_hook ...[2024-07-15 17:17:59.870818] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 2690407 has claimed it 00:05:48.708 EAL: Cannot find device (10000:00:01.0) 00:05:48.708 EAL: Failed to attach device on primary process 00:05:48.708 passed 00:05:48.708 00:05:48.708 Run Summary: Type Total Ran Passed Failed Inactive 00:05:48.708 suites 1 1 n/a 0 0 00:05:48.708 tests 1 1 1 0 0 00:05:48.708 asserts 25 25 25 0 n/a 00:05:48.708 00:05:48.708 Elapsed time = 0.040 seconds 00:05:48.708 00:05:48.708 real 0m0.066s 00:05:48.708 user 0m0.027s 00:05:48.708 sys 0m0.039s 00:05:48.708 17:17:59 env.env_pci -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:48.708 17:17:59 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:05:48.708 ************************************ 00:05:48.708 END TEST env_pci 00:05:48.708 ************************************ 00:05:48.708 17:17:59 env -- common/autotest_common.sh@1142 -- # return 0 00:05:48.708 17:17:59 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:48.708 17:17:59 env -- env/env.sh@15 -- # uname 00:05:48.708 17:17:59 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:05:48.708 17:17:59 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:48.708 17:17:59 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:48.708 17:17:59 env -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:05:48.708 17:17:59 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:48.708 17:17:59 env -- common/autotest_common.sh@10 -- # set +x 00:05:48.708 ************************************ 00:05:48.708 START TEST env_dpdk_post_init 00:05:48.708 ************************************ 00:05:48.708 17:17:59 env.env_dpdk_post_init -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:48.988 EAL: Detected CPU lcores: 128 00:05:48.988 EAL: Detected NUMA nodes: 2 00:05:48.988 EAL: Detected shared linkage of DPDK 00:05:48.988 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:48.988 EAL: Selected IOVA mode 'PA' 00:05:48.988 EAL: VFIO support initialized 00:05:48.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.0 (socket 1) 00:05:48.988 CRYPTODEV: Creating cryptodev 0000:cc:01.0_qat_asym 00:05:48.988 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.988 CRYPTODEV: Creating cryptodev 0000:cc:01.0_qat_sym 00:05:48.988 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.1 (socket 1) 00:05:48.988 CRYPTODEV: Creating cryptodev 0000:cc:01.1_qat_asym 00:05:48.988 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.988 CRYPTODEV: Creating cryptodev 0000:cc:01.1_qat_sym 00:05:48.988 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.2 (socket 1) 00:05:48.988 CRYPTODEV: Creating cryptodev 0000:cc:01.2_qat_asym 00:05:48.988 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.2_qat_asym,socket id: 1, 
max queue pairs: 0 00:05:48.988 CRYPTODEV: Creating cryptodev 0000:cc:01.2_qat_sym 00:05:48.988 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.3 (socket 1) 00:05:48.988 CRYPTODEV: Creating cryptodev 0000:cc:01.3_qat_asym 00:05:48.988 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.988 CRYPTODEV: Creating cryptodev 0000:cc:01.3_qat_sym 00:05:48.988 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.4 (socket 1) 00:05:48.988 CRYPTODEV: Creating cryptodev 0000:cc:01.4_qat_asym 00:05:48.988 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.988 CRYPTODEV: Creating cryptodev 0000:cc:01.4_qat_sym 00:05:48.988 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.5 (socket 1) 00:05:48.988 CRYPTODEV: Creating cryptodev 0000:cc:01.5_qat_asym 00:05:48.988 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.988 CRYPTODEV: Creating cryptodev 0000:cc:01.5_qat_sym 00:05:48.988 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.6 (socket 1) 00:05:48.988 CRYPTODEV: Creating cryptodev 0000:cc:01.6_qat_asym 00:05:48.988 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.988 CRYPTODEV: Creating cryptodev 0000:cc:01.6_qat_sym 00:05:48.988 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.7 (socket 1) 00:05:48.988 CRYPTODEV: Creating cryptodev 0000:cc:01.7_qat_asym 00:05:48.988 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.988 CRYPTODEV: Creating cryptodev 0000:cc:01.7_qat_sym 00:05:48.988 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.0 (socket 1) 00:05:48.988 CRYPTODEV: Creating cryptodev 0000:cc:02.0_qat_asym 00:05:48.988 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.988 CRYPTODEV: Creating cryptodev 0000:cc:02.0_qat_sym 00:05:48.988 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.1 (socket 1) 00:05:48.988 CRYPTODEV: Creating cryptodev 0000:cc:02.1_qat_asym 00:05:48.988 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.988 CRYPTODEV: Creating cryptodev 0000:cc:02.1_qat_sym 00:05:48.988 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.2 (socket 1) 00:05:48.988 CRYPTODEV: Creating cryptodev 0000:cc:02.2_qat_asym 00:05:48.988 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.988 CRYPTODEV: 
Creating cryptodev 0000:cc:02.2_qat_sym 00:05:48.988 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.3 (socket 1) 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:cc:02.3_qat_asym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:cc:02.3_qat_sym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.989 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.4 (socket 1) 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:cc:02.4_qat_asym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:cc:02.4_qat_sym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.989 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.5 (socket 1) 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:cc:02.5_qat_asym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:cc:02.5_qat_sym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.989 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.6 (socket 1) 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:cc:02.6_qat_asym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:cc:02.6_qat_sym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.989 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.7 (socket 1) 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:cc:02.7_qat_asym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:cc:02.7_qat_sym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.989 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.0 (socket 1) 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:01.0_qat_asym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:01.0_qat_sym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.989 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.1 (socket 1) 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:01.1_qat_asym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:01.1_qat_sym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.989 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.2 (socket 1) 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:01.2_qat_asym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:01.2_qat_sym 
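env_dpdk_post_init is launched with '-c 0x1 --base-virtaddr=0x200000000000', which corresponds roughly to the following use of SPDK's public env API. This is a schematic sketch, not the test's actual source; the field names assume the opts structure from spdk/env.h.

#include <stdio.h>
#include "spdk/env.h"

int main(void)
{
    struct spdk_env_opts opts;

    spdk_env_opts_init(&opts);
    opts.name = "env_dpdk_post_init";        /* process name (assumed here) */
    opts.core_mask = "0x1";                  /* -c 0x1 */
    opts.base_virtaddr = 0x200000000000ULL;  /* --base-virtaddr=... */

    if (spdk_env_init(&opts) < 0) {
        fprintf(stderr, "spdk_env_init failed\n");
        return 1;
    }
    /* At this point EAL has probed the PCI bus, which is what produces the
     * qat/ioat/nvme probe lines in the surrounding log. */
    spdk_env_fini();
    return 0;
}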
00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.989 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.3 (socket 1) 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:01.3_qat_asym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:01.3_qat_sym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.989 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.4 (socket 1) 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:01.4_qat_asym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:01.4_qat_sym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.989 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.5 (socket 1) 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:01.5_qat_asym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:01.5_qat_sym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.989 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.6 (socket 1) 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:01.6_qat_asym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:01.6_qat_sym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.989 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.7 (socket 1) 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:01.7_qat_asym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:01.7_qat_sym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.989 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.0 (socket 1) 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:02.0_qat_asym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:02.0_qat_sym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.989 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.1 (socket 1) 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:02.1_qat_asym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:02.1_qat_sym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.989 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.2 (socket 1) 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:02.2_qat_asym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:02.2_qat_sym 00:05:48.989 CRYPTODEV: Initialisation parameters 
- name: 0000:ce:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.989 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.3 (socket 1) 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:02.3_qat_asym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:02.3_qat_sym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.989 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.4 (socket 1) 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:02.4_qat_asym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:02.4_qat_sym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.989 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.5 (socket 1) 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:02.5_qat_asym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:02.5_qat_sym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.989 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.6 (socket 1) 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:02.6_qat_asym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:02.6_qat_sym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.989 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.7 (socket 1) 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:02.7_qat_asym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:ce:02.7_qat_sym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.989 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.0 (socket 1) 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:d0:01.0_qat_asym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:d0:01.0_qat_sym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.989 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.1 (socket 1) 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:d0:01.1_qat_asym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:d0:01.1_qat_sym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.989 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.2 (socket 1) 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:d0:01.2_qat_asym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:d0:01.2_qat_sym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.2_qat_sym,socket id: 1, max 
queue pairs: 0 00:05:48.989 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.3 (socket 1) 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:d0:01.3_qat_asym 00:05:48.989 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.989 CRYPTODEV: Creating cryptodev 0000:d0:01.3_qat_sym 00:05:48.990 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.990 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.4 (socket 1) 00:05:48.990 CRYPTODEV: Creating cryptodev 0000:d0:01.4_qat_asym 00:05:48.990 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.990 CRYPTODEV: Creating cryptodev 0000:d0:01.4_qat_sym 00:05:48.990 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.990 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.5 (socket 1) 00:05:48.990 CRYPTODEV: Creating cryptodev 0000:d0:01.5_qat_asym 00:05:48.990 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.990 CRYPTODEV: Creating cryptodev 0000:d0:01.5_qat_sym 00:05:48.990 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.990 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.6 (socket 1) 00:05:48.990 CRYPTODEV: Creating cryptodev 0000:d0:01.6_qat_asym 00:05:48.990 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.990 CRYPTODEV: Creating cryptodev 0000:d0:01.6_qat_sym 00:05:48.990 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.990 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.7 (socket 1) 00:05:48.990 CRYPTODEV: Creating cryptodev 0000:d0:01.7_qat_asym 00:05:48.990 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.990 CRYPTODEV: Creating cryptodev 0000:d0:01.7_qat_sym 00:05:48.990 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.990 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.0 (socket 1) 00:05:48.990 CRYPTODEV: Creating cryptodev 0000:d0:02.0_qat_asym 00:05:48.990 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.990 CRYPTODEV: Creating cryptodev 0000:d0:02.0_qat_sym 00:05:48.990 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.990 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.1 (socket 1) 00:05:48.990 CRYPTODEV: Creating cryptodev 0000:d0:02.1_qat_asym 00:05:48.990 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.990 CRYPTODEV: Creating cryptodev 0000:d0:02.1_qat_sym 00:05:48.990 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.990 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.2 (socket 1) 00:05:48.990 CRYPTODEV: Creating cryptodev 0000:d0:02.2_qat_asym 00:05:48.990 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.990 CRYPTODEV: Creating cryptodev 0000:d0:02.2_qat_sym 00:05:48.990 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.990 EAL: Probe PCI driver: 
qat (8086:37c9) device: 0000:d0:02.3 (socket 1) 00:05:48.990 CRYPTODEV: Creating cryptodev 0000:d0:02.3_qat_asym 00:05:48.990 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.990 CRYPTODEV: Creating cryptodev 0000:d0:02.3_qat_sym 00:05:48.990 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.990 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.4 (socket 1) 00:05:48.990 CRYPTODEV: Creating cryptodev 0000:d0:02.4_qat_asym 00:05:48.990 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.990 CRYPTODEV: Creating cryptodev 0000:d0:02.4_qat_sym 00:05:48.990 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.990 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.5 (socket 1) 00:05:48.990 CRYPTODEV: Creating cryptodev 0000:d0:02.5_qat_asym 00:05:48.990 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.990 CRYPTODEV: Creating cryptodev 0000:d0:02.5_qat_sym 00:05:48.990 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.990 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.6 (socket 1) 00:05:48.990 CRYPTODEV: Creating cryptodev 0000:d0:02.6_qat_asym 00:05:48.990 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.990 CRYPTODEV: Creating cryptodev 0000:d0:02.6_qat_sym 00:05:48.990 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.990 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.7 (socket 1) 00:05:48.990 CRYPTODEV: Creating cryptodev 0000:d0:02.7_qat_asym 00:05:48.990 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.990 CRYPTODEV: Creating cryptodev 0000:d0:02.7_qat_sym 00:05:48.990 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.990 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:48.990 EAL: Using IOMMU type 1 (Type 1) 00:05:49.251 EAL: Ignore mapping IO port bar(1) 00:05:49.251 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.0 (socket 0) 00:05:49.251 EAL: Ignore mapping IO port bar(1) 00:05:49.512 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.1 (socket 0) 00:05:49.512 EAL: Ignore mapping IO port bar(1) 00:05:49.773 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.2 (socket 0) 00:05:49.773 EAL: Ignore mapping IO port bar(1) 00:05:49.773 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.3 (socket 0) 00:05:50.033 EAL: Ignore mapping IO port bar(1) 00:05:50.033 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.4 (socket 0) 00:05:50.293 EAL: Ignore mapping IO port bar(1) 00:05:50.293 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.5 (socket 0) 00:05:50.553 EAL: Ignore mapping IO port bar(1) 00:05:50.553 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.6 (socket 0) 00:05:50.814 EAL: Ignore mapping IO port bar(1) 00:05:50.814 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.7 (socket 0) 00:05:51.755 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:65:00.0 (socket 0) 00:05:51.755 EAL: Ignore mapping IO port bar(1) 00:05:51.755 EAL: Probe PCI driver: 
spdk_ioat (8086:0b00) device: 0000:80:01.0 (socket 1) 00:05:51.755 EAL: Ignore mapping IO port bar(1) 00:05:52.018 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.1 (socket 1) 00:05:52.018 EAL: Ignore mapping IO port bar(1) 00:05:52.279 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.2 (socket 1) 00:05:52.279 EAL: Ignore mapping IO port bar(1) 00:05:52.539 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.3 (socket 1) 00:05:52.539 EAL: Ignore mapping IO port bar(1) 00:05:52.539 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.4 (socket 1) 00:05:52.800 EAL: Ignore mapping IO port bar(1) 00:05:52.800 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.5 (socket 1) 00:05:53.060 EAL: Ignore mapping IO port bar(1) 00:05:53.060 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.6 (socket 1) 00:05:53.321 EAL: Ignore mapping IO port bar(1) 00:05:53.321 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.7 (socket 1) 00:05:57.530 EAL: Releasing PCI mapped resource for 0000:65:00.0 00:05:57.530 EAL: Calling pci_unmap_resource for 0000:65:00.0 at 0x202001080000 00:05:57.530 Starting DPDK initialization... 00:05:57.530 Starting SPDK post initialization... 00:05:57.530 SPDK NVMe probe 00:05:57.530 Attaching to 0000:65:00.0 00:05:57.530 Attached to 0000:65:00.0 00:05:57.530 Cleaning up... 00:05:59.486 00:05:59.486 real 0m10.420s 00:05:59.486 user 0m4.260s 00:05:59.486 sys 0m0.178s 00:05:59.486 17:18:10 env.env_dpdk_post_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:59.486 17:18:10 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:05:59.486 ************************************ 00:05:59.486 END TEST env_dpdk_post_init 00:05:59.486 ************************************ 00:05:59.486 17:18:10 env -- common/autotest_common.sh@1142 -- # return 0 00:05:59.486 17:18:10 env -- env/env.sh@26 -- # uname 00:05:59.486 17:18:10 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:59.486 17:18:10 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:59.486 17:18:10 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:59.486 17:18:10 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:59.486 17:18:10 env -- common/autotest_common.sh@10 -- # set +x 00:05:59.486 ************************************ 00:05:59.486 START TEST env_mem_callbacks 00:05:59.486 ************************************ 00:05:59.486 17:18:10 env.env_mem_callbacks -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:59.486 EAL: Detected CPU lcores: 128 00:05:59.486 EAL: Detected NUMA nodes: 2 00:05:59.486 EAL: Detected shared linkage of DPDK 00:05:59.486 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:59.486 EAL: Selected IOVA mode 'PA' 00:05:59.486 EAL: VFIO support initialized 00:05:59.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.0 (socket 1) 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:01.0_qat_asym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:01.0_qat_sym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.1 (socket 1) 00:05:59.486 CRYPTODEV: 
Creating cryptodev 0000:cc:01.1_qat_asym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:01.1_qat_sym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.2 (socket 1) 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:01.2_qat_asym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:01.2_qat_sym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.3 (socket 1) 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:01.3_qat_asym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:01.3_qat_sym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.4 (socket 1) 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:01.4_qat_asym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:01.4_qat_sym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.5 (socket 1) 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:01.5_qat_asym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:01.5_qat_sym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.6 (socket 1) 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:01.6_qat_asym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:01.6_qat_sym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.7 (socket 1) 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:01.7_qat_asym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:01.7_qat_sym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.0 (socket 1) 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:02.0_qat_asym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:02.0_qat_sym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.1 (socket 1) 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:02.1_qat_asym 
00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:02.1_qat_sym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.2 (socket 1) 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:02.2_qat_asym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:02.2_qat_sym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.3 (socket 1) 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:02.3_qat_asym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:02.3_qat_sym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.4 (socket 1) 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:02.4_qat_asym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:02.4_qat_sym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.5 (socket 1) 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:02.5_qat_asym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:02.5_qat_sym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.6 (socket 1) 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:02.6_qat_asym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:02.6_qat_sym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.7 (socket 1) 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:02.7_qat_asym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:cc:02.7_qat_sym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.0 (socket 1) 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:ce:01.0_qat_asym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:ce:01.0_qat_sym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.1 (socket 1) 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:ce:01.1_qat_asym 00:05:59.486 CRYPTODEV: Initialisation parameters 
- name: 0000:ce:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:ce:01.1_qat_sym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.2 (socket 1) 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:ce:01.2_qat_asym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:ce:01.2_qat_sym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.3 (socket 1) 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:ce:01.3_qat_asym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:ce:01.3_qat_sym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.4 (socket 1) 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:ce:01.4_qat_asym 00:05:59.486 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.486 CRYPTODEV: Creating cryptodev 0000:ce:01.4_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.5 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:ce:01.5_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:ce:01.5_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.6 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:ce:01.6_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:ce:01.6_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.7 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:ce:01.7_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:ce:01.7_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.0 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:ce:02.0_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:ce:02.0_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.1 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:ce:02.1_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.1_qat_asym,socket id: 1, max 
queue pairs: 0 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:ce:02.1_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.2 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:ce:02.2_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:ce:02.2_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.3 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:ce:02.3_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:ce:02.3_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.4 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:ce:02.4_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:ce:02.4_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.5 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:ce:02.5_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:ce:02.5_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.6 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:ce:02.6_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:ce:02.6_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.7 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:ce:02.7_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:ce:02.7_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.0 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:01.0_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:01.0_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.1 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:01.1_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.487 CRYPTODEV: Creating 
cryptodev 0000:d0:01.1_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.2 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:01.2_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:01.2_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.3 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:01.3_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:01.3_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.4 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:01.4_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:01.4_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.5 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:01.5_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:01.5_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.6 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:01.6_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:01.6_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.7 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:01.7_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:01.7_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.0 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:02.0_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:02.0_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.1 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:02.1_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:02.1_qat_sym 00:05:59.487 
CRYPTODEV: Initialisation parameters - name: 0000:d0:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.2 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:02.2_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:02.2_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.3 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:02.3_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:02.3_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.4 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:02.4_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:02.4_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.5 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:02.5_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:02.5_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.6 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:02.6_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:02.6_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.7 (socket 1) 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:02.7_qat_asym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:59.487 CRYPTODEV: Creating cryptodev 0000:d0:02.7_qat_sym 00:05:59.487 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:59.487 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:59.487 00:05:59.487 00:05:59.487 CUnit - A unit testing framework for C - Version 2.1-3 00:05:59.487 http://cunit.sourceforge.net/ 00:05:59.487 00:05:59.487 00:05:59.487 Suite: memory 00:05:59.487 Test: test ... 
00:05:59.487 register 0x200000200000 2097152 00:05:59.487 register 0x201000a00000 2097152 00:05:59.487 malloc 3145728 00:05:59.487 register 0x200000400000 4194304 00:05:59.487 buf 0x200000500000 len 3145728 PASSED 00:05:59.487 malloc 64 00:05:59.487 buf 0x2000004fff40 len 64 PASSED 00:05:59.487 malloc 4194304 00:05:59.487 register 0x200000800000 6291456 00:05:59.487 buf 0x200000a00000 len 4194304 PASSED 00:05:59.487 free 0x200000500000 3145728 00:05:59.487 free 0x2000004fff40 64 00:05:59.487 unregister 0x200000400000 4194304 PASSED 00:05:59.487 free 0x200000a00000 4194304 00:05:59.487 unregister 0x200000800000 6291456 PASSED 00:05:59.487 malloc 8388608 00:05:59.488 register 0x200000400000 10485760 00:05:59.488 buf 0x200000600000 len 8388608 PASSED 00:05:59.488 free 0x200000600000 8388608 00:05:59.488 unregister 0x200000400000 10485760 PASSED 00:05:59.488 passed 00:05:59.488 00:05:59.488 Run Summary: Type Total Ran Passed Failed Inactive 00:05:59.488 suites 1 1 n/a 0 0 00:05:59.488 tests 1 1 1 0 0 00:05:59.488 asserts 16 16 16 0 n/a 00:05:59.488 00:05:59.488 Elapsed time = 0.006 seconds 00:05:59.488 00:05:59.488 real 0m0.084s 00:05:59.488 user 0m0.033s 00:05:59.488 sys 0m0.050s 00:05:59.488 17:18:10 env.env_mem_callbacks -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:59.488 17:18:10 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:05:59.488 ************************************ 00:05:59.488 END TEST env_mem_callbacks 00:05:59.488 ************************************ 00:05:59.488 17:18:10 env -- common/autotest_common.sh@1142 -- # return 0 00:05:59.488 00:05:59.488 real 0m12.094s 00:05:59.488 user 0m5.120s 00:05:59.488 sys 0m0.988s 00:05:59.488 17:18:10 env -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:59.488 17:18:10 env -- common/autotest_common.sh@10 -- # set +x 00:05:59.488 ************************************ 00:05:59.488 END TEST env 00:05:59.488 ************************************ 00:05:59.488 17:18:10 -- common/autotest_common.sh@1142 -- # return 0 00:05:59.488 17:18:10 -- spdk/autotest.sh@169 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:05:59.488 17:18:10 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:59.488 17:18:10 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:59.488 17:18:10 -- common/autotest_common.sh@10 -- # set +x 00:05:59.488 ************************************ 00:05:59.488 START TEST rpc 00:05:59.488 ************************************ 00:05:59.488 17:18:10 rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:05:59.749 * Looking for test storage... 
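A brief gloss on the env_mem_callbacks suite above: the register/unregister lines appear to be emitted by the test's memory-notify hook as DMA buffers are allocated and freed, with registrations rounded up to 2 MiB hugepage granularity (hence the 3145728-byte malloc producing a 4194304-byte register, while the 64-byte malloc reuses an already-registered region). A minimal sketch for rerunning that unit by hand, assuming the same built tree and hugepage setup as this run:
  sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks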
00:05:59.749 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:59.749 17:18:10 rpc -- rpc/rpc.sh@65 -- # spdk_pid=2692807 00:05:59.749 17:18:10 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:59.749 17:18:10 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:59.749 17:18:10 rpc -- rpc/rpc.sh@67 -- # waitforlisten 2692807 00:05:59.749 17:18:10 rpc -- common/autotest_common.sh@829 -- # '[' -z 2692807 ']' 00:05:59.749 17:18:10 rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:59.749 17:18:10 rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:59.749 17:18:10 rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:59.749 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:59.749 17:18:10 rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:59.749 17:18:10 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:59.749 [2024-07-15 17:18:10.879841] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:05:59.749 [2024-07-15 17:18:10.879900] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2692807 ] 00:05:59.749 [2024-07-15 17:18:10.972448] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:59.749 [2024-07-15 17:18:11.039821] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:59.749 [2024-07-15 17:18:11.039864] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 2692807' to capture a snapshot of events at runtime. 00:05:59.749 [2024-07-15 17:18:11.039871] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:05:59.749 [2024-07-15 17:18:11.039877] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:05:59.749 [2024-07-15 17:18:11.039883] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid2692807 for offline analysis/debug. 
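A minimal sketch of the two capture paths suggested by the app_setup_trace NOTICE lines above, assuming the spdk_trace tool was built alongside spdk_tgt under build/bin of this checkout:
  # snapshot the bdev tracepoint group (enabled with '-e bdev') while spdk_tgt pid 2692807 is still running
  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_trace -s spdk_tgt -p 2692807 > bdev_trace.txt
  # or keep the shared-memory trace file for offline analysis after the target exits
  cp /dev/shm/spdk_tgt_trace.pid2692807 /tmp/spdk_tgt_trace.pid2692807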
00:05:59.749 [2024-07-15 17:18:11.039909] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:00.692 17:18:11 rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:00.692 17:18:11 rpc -- common/autotest_common.sh@862 -- # return 0 00:06:00.692 17:18:11 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:00.692 17:18:11 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:00.692 17:18:11 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:06:00.692 17:18:11 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:06:00.692 17:18:11 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:00.692 17:18:11 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:00.692 17:18:11 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:00.692 ************************************ 00:06:00.692 START TEST rpc_integrity 00:06:00.692 ************************************ 00:06:00.692 17:18:11 rpc.rpc_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:06:00.692 17:18:11 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:00.692 17:18:11 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:00.692 17:18:11 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:00.692 17:18:11 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:00.692 17:18:11 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:00.692 17:18:11 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:00.692 17:18:11 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:00.692 17:18:11 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:00.692 17:18:11 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:00.692 17:18:11 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:00.692 17:18:11 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:00.692 17:18:11 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:06:00.692 17:18:11 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:00.692 17:18:11 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:00.692 17:18:11 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:00.692 17:18:11 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:00.692 17:18:11 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:00.692 { 00:06:00.692 "name": "Malloc0", 00:06:00.692 "aliases": [ 00:06:00.692 "5c1dcaad-cd09-4131-974c-982e5a0a04c4" 00:06:00.692 ], 00:06:00.692 "product_name": "Malloc disk", 00:06:00.692 "block_size": 512, 00:06:00.692 "num_blocks": 16384, 00:06:00.692 "uuid": "5c1dcaad-cd09-4131-974c-982e5a0a04c4", 00:06:00.692 "assigned_rate_limits": { 00:06:00.692 "rw_ios_per_sec": 0, 00:06:00.692 "rw_mbytes_per_sec": 0, 00:06:00.692 "r_mbytes_per_sec": 0, 00:06:00.692 "w_mbytes_per_sec": 0 00:06:00.692 }, 00:06:00.692 
"claimed": false, 00:06:00.692 "zoned": false, 00:06:00.692 "supported_io_types": { 00:06:00.692 "read": true, 00:06:00.692 "write": true, 00:06:00.692 "unmap": true, 00:06:00.692 "flush": true, 00:06:00.692 "reset": true, 00:06:00.692 "nvme_admin": false, 00:06:00.692 "nvme_io": false, 00:06:00.692 "nvme_io_md": false, 00:06:00.692 "write_zeroes": true, 00:06:00.692 "zcopy": true, 00:06:00.692 "get_zone_info": false, 00:06:00.692 "zone_management": false, 00:06:00.692 "zone_append": false, 00:06:00.692 "compare": false, 00:06:00.692 "compare_and_write": false, 00:06:00.692 "abort": true, 00:06:00.692 "seek_hole": false, 00:06:00.692 "seek_data": false, 00:06:00.692 "copy": true, 00:06:00.692 "nvme_iov_md": false 00:06:00.692 }, 00:06:00.692 "memory_domains": [ 00:06:00.692 { 00:06:00.692 "dma_device_id": "system", 00:06:00.692 "dma_device_type": 1 00:06:00.692 }, 00:06:00.692 { 00:06:00.692 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:00.692 "dma_device_type": 2 00:06:00.692 } 00:06:00.692 ], 00:06:00.692 "driver_specific": {} 00:06:00.692 } 00:06:00.692 ]' 00:06:00.692 17:18:11 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:00.692 17:18:11 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:00.692 17:18:11 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:06:00.692 17:18:11 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:00.692 17:18:11 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:00.692 [2024-07-15 17:18:11.896972] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:06:00.692 [2024-07-15 17:18:11.897002] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:00.692 [2024-07-15 17:18:11.897013] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc53e00 00:06:00.692 [2024-07-15 17:18:11.897023] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:00.692 [2024-07-15 17:18:11.898297] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:00.692 [2024-07-15 17:18:11.898317] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:00.692 Passthru0 00:06:00.692 17:18:11 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:00.692 17:18:11 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:00.692 17:18:11 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:00.692 17:18:11 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:00.692 17:18:11 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:00.692 17:18:11 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:00.692 { 00:06:00.692 "name": "Malloc0", 00:06:00.692 "aliases": [ 00:06:00.692 "5c1dcaad-cd09-4131-974c-982e5a0a04c4" 00:06:00.692 ], 00:06:00.692 "product_name": "Malloc disk", 00:06:00.692 "block_size": 512, 00:06:00.692 "num_blocks": 16384, 00:06:00.692 "uuid": "5c1dcaad-cd09-4131-974c-982e5a0a04c4", 00:06:00.692 "assigned_rate_limits": { 00:06:00.692 "rw_ios_per_sec": 0, 00:06:00.692 "rw_mbytes_per_sec": 0, 00:06:00.692 "r_mbytes_per_sec": 0, 00:06:00.692 "w_mbytes_per_sec": 0 00:06:00.692 }, 00:06:00.692 "claimed": true, 00:06:00.692 "claim_type": "exclusive_write", 00:06:00.692 "zoned": false, 00:06:00.692 "supported_io_types": { 00:06:00.692 "read": true, 00:06:00.692 "write": true, 00:06:00.692 "unmap": true, 00:06:00.692 "flush": true, 
00:06:00.692 "reset": true, 00:06:00.692 "nvme_admin": false, 00:06:00.692 "nvme_io": false, 00:06:00.692 "nvme_io_md": false, 00:06:00.692 "write_zeroes": true, 00:06:00.692 "zcopy": true, 00:06:00.692 "get_zone_info": false, 00:06:00.692 "zone_management": false, 00:06:00.692 "zone_append": false, 00:06:00.692 "compare": false, 00:06:00.692 "compare_and_write": false, 00:06:00.692 "abort": true, 00:06:00.692 "seek_hole": false, 00:06:00.692 "seek_data": false, 00:06:00.692 "copy": true, 00:06:00.692 "nvme_iov_md": false 00:06:00.692 }, 00:06:00.692 "memory_domains": [ 00:06:00.692 { 00:06:00.692 "dma_device_id": "system", 00:06:00.692 "dma_device_type": 1 00:06:00.692 }, 00:06:00.692 { 00:06:00.692 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:00.692 "dma_device_type": 2 00:06:00.692 } 00:06:00.692 ], 00:06:00.692 "driver_specific": {} 00:06:00.692 }, 00:06:00.692 { 00:06:00.692 "name": "Passthru0", 00:06:00.692 "aliases": [ 00:06:00.692 "dce81e44-9455-54b7-9f20-d72c9416cdf2" 00:06:00.692 ], 00:06:00.692 "product_name": "passthru", 00:06:00.692 "block_size": 512, 00:06:00.692 "num_blocks": 16384, 00:06:00.692 "uuid": "dce81e44-9455-54b7-9f20-d72c9416cdf2", 00:06:00.692 "assigned_rate_limits": { 00:06:00.693 "rw_ios_per_sec": 0, 00:06:00.693 "rw_mbytes_per_sec": 0, 00:06:00.693 "r_mbytes_per_sec": 0, 00:06:00.693 "w_mbytes_per_sec": 0 00:06:00.693 }, 00:06:00.693 "claimed": false, 00:06:00.693 "zoned": false, 00:06:00.693 "supported_io_types": { 00:06:00.693 "read": true, 00:06:00.693 "write": true, 00:06:00.693 "unmap": true, 00:06:00.693 "flush": true, 00:06:00.693 "reset": true, 00:06:00.693 "nvme_admin": false, 00:06:00.693 "nvme_io": false, 00:06:00.693 "nvme_io_md": false, 00:06:00.693 "write_zeroes": true, 00:06:00.693 "zcopy": true, 00:06:00.693 "get_zone_info": false, 00:06:00.693 "zone_management": false, 00:06:00.693 "zone_append": false, 00:06:00.693 "compare": false, 00:06:00.693 "compare_and_write": false, 00:06:00.693 "abort": true, 00:06:00.693 "seek_hole": false, 00:06:00.693 "seek_data": false, 00:06:00.693 "copy": true, 00:06:00.693 "nvme_iov_md": false 00:06:00.693 }, 00:06:00.693 "memory_domains": [ 00:06:00.693 { 00:06:00.693 "dma_device_id": "system", 00:06:00.693 "dma_device_type": 1 00:06:00.693 }, 00:06:00.693 { 00:06:00.693 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:00.693 "dma_device_type": 2 00:06:00.693 } 00:06:00.693 ], 00:06:00.693 "driver_specific": { 00:06:00.693 "passthru": { 00:06:00.693 "name": "Passthru0", 00:06:00.693 "base_bdev_name": "Malloc0" 00:06:00.693 } 00:06:00.693 } 00:06:00.693 } 00:06:00.693 ]' 00:06:00.693 17:18:11 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:00.693 17:18:11 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:00.693 17:18:11 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:00.693 17:18:11 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:00.693 17:18:11 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:00.693 17:18:11 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:00.693 17:18:11 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:06:00.693 17:18:11 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:00.693 17:18:11 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:00.693 17:18:11 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:00.693 17:18:11 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd 
bdev_get_bdevs 00:06:00.693 17:18:11 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:00.693 17:18:11 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:00.954 17:18:11 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:00.954 17:18:11 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:00.954 17:18:11 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:00.954 17:18:12 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:00.954 00:06:00.954 real 0m0.293s 00:06:00.954 user 0m0.194s 00:06:00.954 sys 0m0.037s 00:06:00.954 17:18:12 rpc.rpc_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:00.954 17:18:12 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:00.954 ************************************ 00:06:00.954 END TEST rpc_integrity 00:06:00.954 ************************************ 00:06:00.954 17:18:12 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:00.954 17:18:12 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:06:00.954 17:18:12 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:00.954 17:18:12 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:00.954 17:18:12 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:00.954 ************************************ 00:06:00.954 START TEST rpc_plugins 00:06:00.954 ************************************ 00:06:00.954 17:18:12 rpc.rpc_plugins -- common/autotest_common.sh@1123 -- # rpc_plugins 00:06:00.954 17:18:12 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:06:00.954 17:18:12 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:00.954 17:18:12 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:00.954 17:18:12 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:00.954 17:18:12 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:06:00.954 17:18:12 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:06:00.954 17:18:12 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:00.954 17:18:12 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:00.954 17:18:12 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:00.954 17:18:12 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:06:00.954 { 00:06:00.954 "name": "Malloc1", 00:06:00.954 "aliases": [ 00:06:00.954 "8e6275df-ee74-466d-8d53-be5c6a2aa448" 00:06:00.954 ], 00:06:00.954 "product_name": "Malloc disk", 00:06:00.954 "block_size": 4096, 00:06:00.954 "num_blocks": 256, 00:06:00.954 "uuid": "8e6275df-ee74-466d-8d53-be5c6a2aa448", 00:06:00.954 "assigned_rate_limits": { 00:06:00.954 "rw_ios_per_sec": 0, 00:06:00.954 "rw_mbytes_per_sec": 0, 00:06:00.954 "r_mbytes_per_sec": 0, 00:06:00.954 "w_mbytes_per_sec": 0 00:06:00.954 }, 00:06:00.954 "claimed": false, 00:06:00.954 "zoned": false, 00:06:00.954 "supported_io_types": { 00:06:00.954 "read": true, 00:06:00.954 "write": true, 00:06:00.954 "unmap": true, 00:06:00.954 "flush": true, 00:06:00.954 "reset": true, 00:06:00.954 "nvme_admin": false, 00:06:00.954 "nvme_io": false, 00:06:00.954 "nvme_io_md": false, 00:06:00.954 "write_zeroes": true, 00:06:00.954 "zcopy": true, 00:06:00.954 "get_zone_info": false, 00:06:00.954 "zone_management": false, 00:06:00.954 "zone_append": false, 00:06:00.954 "compare": false, 00:06:00.954 "compare_and_write": false, 00:06:00.954 "abort": true, 00:06:00.954 "seek_hole": false, 00:06:00.954 "seek_data": 
false, 00:06:00.954 "copy": true, 00:06:00.954 "nvme_iov_md": false 00:06:00.954 }, 00:06:00.954 "memory_domains": [ 00:06:00.954 { 00:06:00.954 "dma_device_id": "system", 00:06:00.954 "dma_device_type": 1 00:06:00.954 }, 00:06:00.954 { 00:06:00.954 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:00.954 "dma_device_type": 2 00:06:00.954 } 00:06:00.954 ], 00:06:00.954 "driver_specific": {} 00:06:00.954 } 00:06:00.954 ]' 00:06:00.954 17:18:12 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:06:00.954 17:18:12 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:06:00.954 17:18:12 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:06:00.954 17:18:12 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:00.954 17:18:12 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:00.954 17:18:12 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:00.954 17:18:12 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:06:00.954 17:18:12 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:00.954 17:18:12 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:00.954 17:18:12 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:00.954 17:18:12 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:06:00.954 17:18:12 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:06:01.215 17:18:12 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:06:01.215 00:06:01.215 real 0m0.155s 00:06:01.215 user 0m0.106s 00:06:01.215 sys 0m0.017s 00:06:01.215 17:18:12 rpc.rpc_plugins -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:01.215 17:18:12 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:01.215 ************************************ 00:06:01.215 END TEST rpc_plugins 00:06:01.215 ************************************ 00:06:01.215 17:18:12 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:01.215 17:18:12 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:06:01.215 17:18:12 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:01.215 17:18:12 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:01.215 17:18:12 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:01.215 ************************************ 00:06:01.215 START TEST rpc_trace_cmd_test 00:06:01.215 ************************************ 00:06:01.215 17:18:12 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1123 -- # rpc_trace_cmd_test 00:06:01.215 17:18:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:06:01.215 17:18:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:06:01.215 17:18:12 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:01.215 17:18:12 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:01.215 17:18:12 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:01.215 17:18:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:06:01.215 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid2692807", 00:06:01.215 "tpoint_group_mask": "0x8", 00:06:01.215 "iscsi_conn": { 00:06:01.215 "mask": "0x2", 00:06:01.215 "tpoint_mask": "0x0" 00:06:01.215 }, 00:06:01.215 "scsi": { 00:06:01.215 "mask": "0x4", 00:06:01.215 "tpoint_mask": "0x0" 00:06:01.215 }, 00:06:01.215 "bdev": { 00:06:01.215 "mask": "0x8", 00:06:01.215 "tpoint_mask": "0xffffffffffffffff" 00:06:01.215 }, 00:06:01.215 "nvmf_rdma": { 00:06:01.215 
"mask": "0x10", 00:06:01.215 "tpoint_mask": "0x0" 00:06:01.215 }, 00:06:01.215 "nvmf_tcp": { 00:06:01.215 "mask": "0x20", 00:06:01.215 "tpoint_mask": "0x0" 00:06:01.215 }, 00:06:01.215 "ftl": { 00:06:01.215 "mask": "0x40", 00:06:01.215 "tpoint_mask": "0x0" 00:06:01.215 }, 00:06:01.215 "blobfs": { 00:06:01.215 "mask": "0x80", 00:06:01.215 "tpoint_mask": "0x0" 00:06:01.215 }, 00:06:01.215 "dsa": { 00:06:01.215 "mask": "0x200", 00:06:01.215 "tpoint_mask": "0x0" 00:06:01.215 }, 00:06:01.215 "thread": { 00:06:01.215 "mask": "0x400", 00:06:01.215 "tpoint_mask": "0x0" 00:06:01.215 }, 00:06:01.215 "nvme_pcie": { 00:06:01.215 "mask": "0x800", 00:06:01.215 "tpoint_mask": "0x0" 00:06:01.215 }, 00:06:01.215 "iaa": { 00:06:01.215 "mask": "0x1000", 00:06:01.215 "tpoint_mask": "0x0" 00:06:01.215 }, 00:06:01.215 "nvme_tcp": { 00:06:01.215 "mask": "0x2000", 00:06:01.215 "tpoint_mask": "0x0" 00:06:01.215 }, 00:06:01.215 "bdev_nvme": { 00:06:01.215 "mask": "0x4000", 00:06:01.215 "tpoint_mask": "0x0" 00:06:01.215 }, 00:06:01.215 "sock": { 00:06:01.215 "mask": "0x8000", 00:06:01.215 "tpoint_mask": "0x0" 00:06:01.215 } 00:06:01.215 }' 00:06:01.215 17:18:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:06:01.215 17:18:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:06:01.215 17:18:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:06:01.215 17:18:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:06:01.215 17:18:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:06:01.476 17:18:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:06:01.476 17:18:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:06:01.476 17:18:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:06:01.476 17:18:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:06:01.476 17:18:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:06:01.476 00:06:01.476 real 0m0.287s 00:06:01.476 user 0m0.256s 00:06:01.476 sys 0m0.021s 00:06:01.476 17:18:12 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:01.476 17:18:12 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:01.476 ************************************ 00:06:01.476 END TEST rpc_trace_cmd_test 00:06:01.476 ************************************ 00:06:01.476 17:18:12 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:01.476 17:18:12 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:06:01.476 17:18:12 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:06:01.476 17:18:12 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:06:01.476 17:18:12 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:01.476 17:18:12 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:01.476 17:18:12 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:01.476 ************************************ 00:06:01.476 START TEST rpc_daemon_integrity 00:06:01.476 ************************************ 00:06:01.476 17:18:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:06:01.476 17:18:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:01.476 17:18:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:01.476 17:18:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:01.476 17:18:12 rpc.rpc_daemon_integrity -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:01.476 17:18:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:01.476 17:18:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:01.476 17:18:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:01.476 17:18:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:01.476 17:18:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:01.476 17:18:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:01.476 17:18:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:01.476 17:18:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:06:01.476 17:18:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:01.476 17:18:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:01.476 17:18:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:01.749 17:18:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:01.749 17:18:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:01.749 { 00:06:01.749 "name": "Malloc2", 00:06:01.749 "aliases": [ 00:06:01.749 "97eb42e1-8a78-433a-80f7-8e0a45bb077f" 00:06:01.749 ], 00:06:01.749 "product_name": "Malloc disk", 00:06:01.749 "block_size": 512, 00:06:01.749 "num_blocks": 16384, 00:06:01.749 "uuid": "97eb42e1-8a78-433a-80f7-8e0a45bb077f", 00:06:01.749 "assigned_rate_limits": { 00:06:01.749 "rw_ios_per_sec": 0, 00:06:01.749 "rw_mbytes_per_sec": 0, 00:06:01.749 "r_mbytes_per_sec": 0, 00:06:01.749 "w_mbytes_per_sec": 0 00:06:01.749 }, 00:06:01.750 "claimed": false, 00:06:01.750 "zoned": false, 00:06:01.750 "supported_io_types": { 00:06:01.750 "read": true, 00:06:01.750 "write": true, 00:06:01.750 "unmap": true, 00:06:01.750 "flush": true, 00:06:01.750 "reset": true, 00:06:01.750 "nvme_admin": false, 00:06:01.750 "nvme_io": false, 00:06:01.750 "nvme_io_md": false, 00:06:01.750 "write_zeroes": true, 00:06:01.750 "zcopy": true, 00:06:01.750 "get_zone_info": false, 00:06:01.750 "zone_management": false, 00:06:01.750 "zone_append": false, 00:06:01.750 "compare": false, 00:06:01.750 "compare_and_write": false, 00:06:01.750 "abort": true, 00:06:01.750 "seek_hole": false, 00:06:01.750 "seek_data": false, 00:06:01.750 "copy": true, 00:06:01.750 "nvme_iov_md": false 00:06:01.750 }, 00:06:01.750 "memory_domains": [ 00:06:01.750 { 00:06:01.750 "dma_device_id": "system", 00:06:01.750 "dma_device_type": 1 00:06:01.750 }, 00:06:01.750 { 00:06:01.750 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:01.750 "dma_device_type": 2 00:06:01.750 } 00:06:01.750 ], 00:06:01.750 "driver_specific": {} 00:06:01.750 } 00:06:01.750 ]' 00:06:01.750 17:18:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:01.750 17:18:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:01.750 17:18:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:06:01.750 17:18:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:01.750 17:18:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:01.751 [2024-07-15 17:18:12.835506] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:06:01.751 [2024-07-15 17:18:12.835532] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:01.751 
[2024-07-15 17:18:12.835544] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdf7680 00:06:01.751 [2024-07-15 17:18:12.835550] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:01.751 [2024-07-15 17:18:12.836692] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:01.751 [2024-07-15 17:18:12.836716] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:01.751 Passthru0 00:06:01.751 17:18:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:01.751 17:18:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:01.751 17:18:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:01.751 17:18:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:01.751 17:18:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:01.751 17:18:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:01.751 { 00:06:01.751 "name": "Malloc2", 00:06:01.751 "aliases": [ 00:06:01.751 "97eb42e1-8a78-433a-80f7-8e0a45bb077f" 00:06:01.751 ], 00:06:01.751 "product_name": "Malloc disk", 00:06:01.751 "block_size": 512, 00:06:01.751 "num_blocks": 16384, 00:06:01.751 "uuid": "97eb42e1-8a78-433a-80f7-8e0a45bb077f", 00:06:01.751 "assigned_rate_limits": { 00:06:01.751 "rw_ios_per_sec": 0, 00:06:01.751 "rw_mbytes_per_sec": 0, 00:06:01.751 "r_mbytes_per_sec": 0, 00:06:01.751 "w_mbytes_per_sec": 0 00:06:01.751 }, 00:06:01.751 "claimed": true, 00:06:01.752 "claim_type": "exclusive_write", 00:06:01.752 "zoned": false, 00:06:01.752 "supported_io_types": { 00:06:01.752 "read": true, 00:06:01.752 "write": true, 00:06:01.752 "unmap": true, 00:06:01.752 "flush": true, 00:06:01.752 "reset": true, 00:06:01.752 "nvme_admin": false, 00:06:01.752 "nvme_io": false, 00:06:01.752 "nvme_io_md": false, 00:06:01.752 "write_zeroes": true, 00:06:01.752 "zcopy": true, 00:06:01.752 "get_zone_info": false, 00:06:01.752 "zone_management": false, 00:06:01.752 "zone_append": false, 00:06:01.752 "compare": false, 00:06:01.752 "compare_and_write": false, 00:06:01.752 "abort": true, 00:06:01.752 "seek_hole": false, 00:06:01.752 "seek_data": false, 00:06:01.752 "copy": true, 00:06:01.752 "nvme_iov_md": false 00:06:01.752 }, 00:06:01.752 "memory_domains": [ 00:06:01.752 { 00:06:01.752 "dma_device_id": "system", 00:06:01.752 "dma_device_type": 1 00:06:01.752 }, 00:06:01.752 { 00:06:01.753 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:01.753 "dma_device_type": 2 00:06:01.753 } 00:06:01.753 ], 00:06:01.753 "driver_specific": {} 00:06:01.753 }, 00:06:01.753 { 00:06:01.753 "name": "Passthru0", 00:06:01.753 "aliases": [ 00:06:01.753 "91396e95-00b3-5e41-a727-cd2e1c7d7546" 00:06:01.753 ], 00:06:01.753 "product_name": "passthru", 00:06:01.753 "block_size": 512, 00:06:01.753 "num_blocks": 16384, 00:06:01.753 "uuid": "91396e95-00b3-5e41-a727-cd2e1c7d7546", 00:06:01.753 "assigned_rate_limits": { 00:06:01.753 "rw_ios_per_sec": 0, 00:06:01.753 "rw_mbytes_per_sec": 0, 00:06:01.753 "r_mbytes_per_sec": 0, 00:06:01.753 "w_mbytes_per_sec": 0 00:06:01.753 }, 00:06:01.753 "claimed": false, 00:06:01.753 "zoned": false, 00:06:01.753 "supported_io_types": { 00:06:01.753 "read": true, 00:06:01.753 "write": true, 00:06:01.753 "unmap": true, 00:06:01.753 "flush": true, 00:06:01.753 "reset": true, 00:06:01.753 "nvme_admin": false, 00:06:01.753 "nvme_io": false, 00:06:01.753 "nvme_io_md": false, 00:06:01.753 
"write_zeroes": true, 00:06:01.753 "zcopy": true, 00:06:01.753 "get_zone_info": false, 00:06:01.753 "zone_management": false, 00:06:01.753 "zone_append": false, 00:06:01.753 "compare": false, 00:06:01.753 "compare_and_write": false, 00:06:01.753 "abort": true, 00:06:01.753 "seek_hole": false, 00:06:01.753 "seek_data": false, 00:06:01.753 "copy": true, 00:06:01.753 "nvme_iov_md": false 00:06:01.753 }, 00:06:01.753 "memory_domains": [ 00:06:01.753 { 00:06:01.753 "dma_device_id": "system", 00:06:01.754 "dma_device_type": 1 00:06:01.754 }, 00:06:01.754 { 00:06:01.754 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:01.754 "dma_device_type": 2 00:06:01.754 } 00:06:01.754 ], 00:06:01.754 "driver_specific": { 00:06:01.754 "passthru": { 00:06:01.754 "name": "Passthru0", 00:06:01.754 "base_bdev_name": "Malloc2" 00:06:01.754 } 00:06:01.754 } 00:06:01.754 } 00:06:01.754 ]' 00:06:01.754 17:18:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:01.754 17:18:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:01.754 17:18:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:01.754 17:18:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:01.754 17:18:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:01.754 17:18:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:01.754 17:18:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:06:01.754 17:18:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:01.754 17:18:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:01.754 17:18:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:01.754 17:18:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:01.755 17:18:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:01.755 17:18:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:01.755 17:18:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:01.755 17:18:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:01.755 17:18:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:01.755 17:18:13 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:01.755 00:06:01.755 real 0m0.302s 00:06:01.755 user 0m0.192s 00:06:01.755 sys 0m0.044s 00:06:01.755 17:18:13 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:01.755 17:18:13 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:01.755 ************************************ 00:06:01.755 END TEST rpc_daemon_integrity 00:06:01.755 ************************************ 00:06:01.755 17:18:13 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:01.755 17:18:13 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:06:01.755 17:18:13 rpc -- rpc/rpc.sh@84 -- # killprocess 2692807 00:06:01.755 17:18:13 rpc -- common/autotest_common.sh@948 -- # '[' -z 2692807 ']' 00:06:01.755 17:18:13 rpc -- common/autotest_common.sh@952 -- # kill -0 2692807 00:06:02.023 17:18:13 rpc -- common/autotest_common.sh@953 -- # uname 00:06:02.023 17:18:13 rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:02.023 17:18:13 rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2692807 00:06:02.023 17:18:13 rpc -- 
common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:02.023 17:18:13 rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:02.023 17:18:13 rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2692807' 00:06:02.023 killing process with pid 2692807 00:06:02.023 17:18:13 rpc -- common/autotest_common.sh@967 -- # kill 2692807 00:06:02.023 17:18:13 rpc -- common/autotest_common.sh@972 -- # wait 2692807 00:06:02.023 00:06:02.023 real 0m2.583s 00:06:02.023 user 0m3.456s 00:06:02.023 sys 0m0.721s 00:06:02.023 17:18:13 rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:02.023 17:18:13 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:02.023 ************************************ 00:06:02.023 END TEST rpc 00:06:02.023 ************************************ 00:06:02.283 17:18:13 -- common/autotest_common.sh@1142 -- # return 0 00:06:02.283 17:18:13 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:02.283 17:18:13 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:02.283 17:18:13 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:02.283 17:18:13 -- common/autotest_common.sh@10 -- # set +x 00:06:02.283 ************************************ 00:06:02.283 START TEST skip_rpc 00:06:02.283 ************************************ 00:06:02.283 17:18:13 skip_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:02.283 * Looking for test storage... 00:06:02.283 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:02.283 17:18:13 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:02.283 17:18:13 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:02.283 17:18:13 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:06:02.283 17:18:13 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:02.283 17:18:13 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:02.283 17:18:13 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:02.283 ************************************ 00:06:02.283 START TEST skip_rpc 00:06:02.283 ************************************ 00:06:02.283 17:18:13 skip_rpc.skip_rpc -- common/autotest_common.sh@1123 -- # test_skip_rpc 00:06:02.283 17:18:13 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=2693429 00:06:02.283 17:18:13 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:02.283 17:18:13 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:06:02.283 17:18:13 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:06:02.543 [2024-07-15 17:18:13.582277] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:06:02.543 [2024-07-15 17:18:13.582331] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2693429 ] 00:06:02.543 [2024-07-15 17:18:13.672433] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:02.543 [2024-07-15 17:18:13.751835] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:07.823 17:18:18 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:06:07.823 17:18:18 skip_rpc.skip_rpc -- common/autotest_common.sh@648 -- # local es=0 00:06:07.823 17:18:18 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # valid_exec_arg rpc_cmd spdk_get_version 00:06:07.823 17:18:18 skip_rpc.skip_rpc -- common/autotest_common.sh@636 -- # local arg=rpc_cmd 00:06:07.823 17:18:18 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:07.823 17:18:18 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # type -t rpc_cmd 00:06:07.823 17:18:18 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:07.823 17:18:18 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # rpc_cmd spdk_get_version 00:06:07.823 17:18:18 skip_rpc.skip_rpc -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:07.823 17:18:18 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:07.823 17:18:18 skip_rpc.skip_rpc -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:06:07.824 17:18:18 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # es=1 00:06:07.824 17:18:18 skip_rpc.skip_rpc -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:07.824 17:18:18 skip_rpc.skip_rpc -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:07.824 17:18:18 skip_rpc.skip_rpc -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:07.824 17:18:18 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:06:07.824 17:18:18 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 2693429 00:06:07.824 17:18:18 skip_rpc.skip_rpc -- common/autotest_common.sh@948 -- # '[' -z 2693429 ']' 00:06:07.824 17:18:18 skip_rpc.skip_rpc -- common/autotest_common.sh@952 -- # kill -0 2693429 00:06:07.824 17:18:18 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # uname 00:06:07.824 17:18:18 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:07.824 17:18:18 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2693429 00:06:07.824 17:18:18 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:07.824 17:18:18 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:07.824 17:18:18 skip_rpc.skip_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2693429' 00:06:07.824 killing process with pid 2693429 00:06:07.824 17:18:18 skip_rpc.skip_rpc -- common/autotest_common.sh@967 -- # kill 2693429 00:06:07.824 17:18:18 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # wait 2693429 00:06:07.824 00:06:07.824 real 0m5.269s 00:06:07.824 user 0m5.027s 00:06:07.824 sys 0m0.260s 00:06:07.824 17:18:18 skip_rpc.skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:07.824 17:18:18 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:07.824 ************************************ 00:06:07.824 END TEST skip_rpc 00:06:07.824 
************************************ 00:06:07.824 17:18:18 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:07.824 17:18:18 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:06:07.824 17:18:18 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:07.824 17:18:18 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:07.824 17:18:18 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:07.824 ************************************ 00:06:07.824 START TEST skip_rpc_with_json 00:06:07.824 ************************************ 00:06:07.824 17:18:18 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_json 00:06:07.824 17:18:18 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:06:07.824 17:18:18 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=2694373 00:06:07.824 17:18:18 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:07.824 17:18:18 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 2694373 00:06:07.824 17:18:18 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:07.824 17:18:18 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@829 -- # '[' -z 2694373 ']' 00:06:07.824 17:18:18 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:07.824 17:18:18 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:07.824 17:18:18 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:07.824 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:07.824 17:18:18 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:07.824 17:18:18 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:07.824 [2024-07-15 17:18:18.914966] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
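The skip_rpc case that finished just above is essentially a negative test: the target is started with its RPC server disabled and a version query over the default socket has to fail. A rough sketch of what test_skip_rpc in test/rpc/skip_rpc.sh exercises, with paths relative to the spdk checkout used in this run:

  ./build/bin/spdk_tgt --no-rpc-server -m 0x1 &
  sleep 5                                   # the test sleeps rather than waiting on a socket
  if ./scripts/rpc.py spdk_get_version; then
      echo 'FAIL: RPC answered although --no-rpc-server was given'
  fi
  kill $!                                   # killprocess equivalent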
00:06:07.824 [2024-07-15 17:18:18.915015] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2694373 ] 00:06:07.824 [2024-07-15 17:18:19.004470] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:07.824 [2024-07-15 17:18:19.072331] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:08.765 17:18:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:08.765 17:18:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@862 -- # return 0 00:06:08.765 17:18:19 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:06:08.765 17:18:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:08.765 17:18:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:08.765 [2024-07-15 17:18:19.748819] nvmf_rpc.c:2569:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:06:08.765 request: 00:06:08.765 { 00:06:08.765 "trtype": "tcp", 00:06:08.765 "method": "nvmf_get_transports", 00:06:08.765 "req_id": 1 00:06:08.765 } 00:06:08.765 Got JSON-RPC error response 00:06:08.765 response: 00:06:08.765 { 00:06:08.765 "code": -19, 00:06:08.765 "message": "No such device" 00:06:08.765 } 00:06:08.765 17:18:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:06:08.765 17:18:19 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:06:08.765 17:18:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:08.765 17:18:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:08.765 [2024-07-15 17:18:19.756924] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:08.765 17:18:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:08.765 17:18:19 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:06:08.765 17:18:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:08.765 17:18:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:08.765 17:18:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:08.765 17:18:19 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:08.765 { 00:06:08.765 "subsystems": [ 00:06:08.765 { 00:06:08.765 "subsystem": "keyring", 00:06:08.765 "config": [] 00:06:08.765 }, 00:06:08.765 { 00:06:08.765 "subsystem": "iobuf", 00:06:08.765 "config": [ 00:06:08.765 { 00:06:08.765 "method": "iobuf_set_options", 00:06:08.765 "params": { 00:06:08.765 "small_pool_count": 8192, 00:06:08.765 "large_pool_count": 1024, 00:06:08.765 "small_bufsize": 8192, 00:06:08.765 "large_bufsize": 135168 00:06:08.765 } 00:06:08.765 } 00:06:08.765 ] 00:06:08.765 }, 00:06:08.765 { 00:06:08.765 "subsystem": "sock", 00:06:08.765 "config": [ 00:06:08.765 { 00:06:08.765 "method": "sock_set_default_impl", 00:06:08.765 "params": { 00:06:08.765 "impl_name": "posix" 00:06:08.765 } 00:06:08.765 }, 00:06:08.765 { 00:06:08.765 "method": "sock_impl_set_options", 00:06:08.765 "params": { 00:06:08.765 "impl_name": "ssl", 00:06:08.765 "recv_buf_size": 4096, 00:06:08.765 "send_buf_size": 4096, 
00:06:08.765 "enable_recv_pipe": true, 00:06:08.765 "enable_quickack": false, 00:06:08.765 "enable_placement_id": 0, 00:06:08.765 "enable_zerocopy_send_server": true, 00:06:08.765 "enable_zerocopy_send_client": false, 00:06:08.765 "zerocopy_threshold": 0, 00:06:08.765 "tls_version": 0, 00:06:08.765 "enable_ktls": false 00:06:08.765 } 00:06:08.765 }, 00:06:08.765 { 00:06:08.765 "method": "sock_impl_set_options", 00:06:08.765 "params": { 00:06:08.765 "impl_name": "posix", 00:06:08.765 "recv_buf_size": 2097152, 00:06:08.765 "send_buf_size": 2097152, 00:06:08.765 "enable_recv_pipe": true, 00:06:08.765 "enable_quickack": false, 00:06:08.765 "enable_placement_id": 0, 00:06:08.765 "enable_zerocopy_send_server": true, 00:06:08.765 "enable_zerocopy_send_client": false, 00:06:08.765 "zerocopy_threshold": 0, 00:06:08.765 "tls_version": 0, 00:06:08.765 "enable_ktls": false 00:06:08.765 } 00:06:08.765 } 00:06:08.765 ] 00:06:08.765 }, 00:06:08.765 { 00:06:08.765 "subsystem": "vmd", 00:06:08.765 "config": [] 00:06:08.765 }, 00:06:08.765 { 00:06:08.765 "subsystem": "accel", 00:06:08.765 "config": [ 00:06:08.765 { 00:06:08.765 "method": "accel_set_options", 00:06:08.765 "params": { 00:06:08.765 "small_cache_size": 128, 00:06:08.765 "large_cache_size": 16, 00:06:08.765 "task_count": 2048, 00:06:08.765 "sequence_count": 2048, 00:06:08.765 "buf_count": 2048 00:06:08.765 } 00:06:08.765 } 00:06:08.765 ] 00:06:08.765 }, 00:06:08.765 { 00:06:08.765 "subsystem": "bdev", 00:06:08.765 "config": [ 00:06:08.765 { 00:06:08.765 "method": "bdev_set_options", 00:06:08.765 "params": { 00:06:08.765 "bdev_io_pool_size": 65535, 00:06:08.765 "bdev_io_cache_size": 256, 00:06:08.765 "bdev_auto_examine": true, 00:06:08.765 "iobuf_small_cache_size": 128, 00:06:08.765 "iobuf_large_cache_size": 16 00:06:08.765 } 00:06:08.765 }, 00:06:08.765 { 00:06:08.765 "method": "bdev_raid_set_options", 00:06:08.765 "params": { 00:06:08.765 "process_window_size_kb": 1024 00:06:08.765 } 00:06:08.765 }, 00:06:08.765 { 00:06:08.765 "method": "bdev_iscsi_set_options", 00:06:08.765 "params": { 00:06:08.765 "timeout_sec": 30 00:06:08.765 } 00:06:08.765 }, 00:06:08.765 { 00:06:08.765 "method": "bdev_nvme_set_options", 00:06:08.765 "params": { 00:06:08.766 "action_on_timeout": "none", 00:06:08.766 "timeout_us": 0, 00:06:08.766 "timeout_admin_us": 0, 00:06:08.766 "keep_alive_timeout_ms": 10000, 00:06:08.766 "arbitration_burst": 0, 00:06:08.766 "low_priority_weight": 0, 00:06:08.766 "medium_priority_weight": 0, 00:06:08.766 "high_priority_weight": 0, 00:06:08.766 "nvme_adminq_poll_period_us": 10000, 00:06:08.766 "nvme_ioq_poll_period_us": 0, 00:06:08.766 "io_queue_requests": 0, 00:06:08.766 "delay_cmd_submit": true, 00:06:08.766 "transport_retry_count": 4, 00:06:08.766 "bdev_retry_count": 3, 00:06:08.766 "transport_ack_timeout": 0, 00:06:08.766 "ctrlr_loss_timeout_sec": 0, 00:06:08.766 "reconnect_delay_sec": 0, 00:06:08.766 "fast_io_fail_timeout_sec": 0, 00:06:08.766 "disable_auto_failback": false, 00:06:08.766 "generate_uuids": false, 00:06:08.766 "transport_tos": 0, 00:06:08.766 "nvme_error_stat": false, 00:06:08.766 "rdma_srq_size": 0, 00:06:08.766 "io_path_stat": false, 00:06:08.766 "allow_accel_sequence": false, 00:06:08.766 "rdma_max_cq_size": 0, 00:06:08.766 "rdma_cm_event_timeout_ms": 0, 00:06:08.766 "dhchap_digests": [ 00:06:08.766 "sha256", 00:06:08.766 "sha384", 00:06:08.766 "sha512" 00:06:08.766 ], 00:06:08.766 "dhchap_dhgroups": [ 00:06:08.766 "null", 00:06:08.766 "ffdhe2048", 00:06:08.766 "ffdhe3072", 00:06:08.766 "ffdhe4096", 00:06:08.766 
"ffdhe6144", 00:06:08.766 "ffdhe8192" 00:06:08.766 ] 00:06:08.766 } 00:06:08.766 }, 00:06:08.766 { 00:06:08.766 "method": "bdev_nvme_set_hotplug", 00:06:08.766 "params": { 00:06:08.766 "period_us": 100000, 00:06:08.766 "enable": false 00:06:08.766 } 00:06:08.766 }, 00:06:08.766 { 00:06:08.766 "method": "bdev_wait_for_examine" 00:06:08.766 } 00:06:08.766 ] 00:06:08.766 }, 00:06:08.766 { 00:06:08.766 "subsystem": "scsi", 00:06:08.766 "config": null 00:06:08.766 }, 00:06:08.766 { 00:06:08.766 "subsystem": "scheduler", 00:06:08.766 "config": [ 00:06:08.766 { 00:06:08.766 "method": "framework_set_scheduler", 00:06:08.766 "params": { 00:06:08.766 "name": "static" 00:06:08.766 } 00:06:08.766 } 00:06:08.766 ] 00:06:08.766 }, 00:06:08.766 { 00:06:08.766 "subsystem": "vhost_scsi", 00:06:08.766 "config": [] 00:06:08.766 }, 00:06:08.766 { 00:06:08.766 "subsystem": "vhost_blk", 00:06:08.766 "config": [] 00:06:08.766 }, 00:06:08.766 { 00:06:08.766 "subsystem": "ublk", 00:06:08.766 "config": [] 00:06:08.766 }, 00:06:08.766 { 00:06:08.766 "subsystem": "nbd", 00:06:08.766 "config": [] 00:06:08.766 }, 00:06:08.766 { 00:06:08.766 "subsystem": "nvmf", 00:06:08.766 "config": [ 00:06:08.766 { 00:06:08.766 "method": "nvmf_set_config", 00:06:08.766 "params": { 00:06:08.766 "discovery_filter": "match_any", 00:06:08.766 "admin_cmd_passthru": { 00:06:08.766 "identify_ctrlr": false 00:06:08.766 } 00:06:08.766 } 00:06:08.766 }, 00:06:08.766 { 00:06:08.766 "method": "nvmf_set_max_subsystems", 00:06:08.766 "params": { 00:06:08.766 "max_subsystems": 1024 00:06:08.766 } 00:06:08.766 }, 00:06:08.766 { 00:06:08.766 "method": "nvmf_set_crdt", 00:06:08.766 "params": { 00:06:08.766 "crdt1": 0, 00:06:08.766 "crdt2": 0, 00:06:08.766 "crdt3": 0 00:06:08.766 } 00:06:08.766 }, 00:06:08.766 { 00:06:08.766 "method": "nvmf_create_transport", 00:06:08.766 "params": { 00:06:08.766 "trtype": "TCP", 00:06:08.766 "max_queue_depth": 128, 00:06:08.766 "max_io_qpairs_per_ctrlr": 127, 00:06:08.766 "in_capsule_data_size": 4096, 00:06:08.766 "max_io_size": 131072, 00:06:08.766 "io_unit_size": 131072, 00:06:08.766 "max_aq_depth": 128, 00:06:08.766 "num_shared_buffers": 511, 00:06:08.766 "buf_cache_size": 4294967295, 00:06:08.766 "dif_insert_or_strip": false, 00:06:08.766 "zcopy": false, 00:06:08.766 "c2h_success": true, 00:06:08.766 "sock_priority": 0, 00:06:08.766 "abort_timeout_sec": 1, 00:06:08.766 "ack_timeout": 0, 00:06:08.766 "data_wr_pool_size": 0 00:06:08.766 } 00:06:08.766 } 00:06:08.766 ] 00:06:08.766 }, 00:06:08.766 { 00:06:08.766 "subsystem": "iscsi", 00:06:08.766 "config": [ 00:06:08.766 { 00:06:08.766 "method": "iscsi_set_options", 00:06:08.766 "params": { 00:06:08.766 "node_base": "iqn.2016-06.io.spdk", 00:06:08.766 "max_sessions": 128, 00:06:08.766 "max_connections_per_session": 2, 00:06:08.766 "max_queue_depth": 64, 00:06:08.766 "default_time2wait": 2, 00:06:08.766 "default_time2retain": 20, 00:06:08.766 "first_burst_length": 8192, 00:06:08.766 "immediate_data": true, 00:06:08.766 "allow_duplicated_isid": false, 00:06:08.766 "error_recovery_level": 0, 00:06:08.766 "nop_timeout": 60, 00:06:08.766 "nop_in_interval": 30, 00:06:08.766 "disable_chap": false, 00:06:08.766 "require_chap": false, 00:06:08.766 "mutual_chap": false, 00:06:08.766 "chap_group": 0, 00:06:08.766 "max_large_datain_per_connection": 64, 00:06:08.766 "max_r2t_per_connection": 4, 00:06:08.766 "pdu_pool_size": 36864, 00:06:08.766 "immediate_data_pool_size": 16384, 00:06:08.766 "data_out_pool_size": 2048 00:06:08.766 } 00:06:08.766 } 00:06:08.766 ] 00:06:08.766 } 
00:06:08.766 ] 00:06:08.766 } 00:06:08.766 17:18:19 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:06:08.766 17:18:19 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 2694373 00:06:08.766 17:18:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 2694373 ']' 00:06:08.766 17:18:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 2694373 00:06:08.766 17:18:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:06:08.766 17:18:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:08.766 17:18:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2694373 00:06:08.766 17:18:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:08.766 17:18:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:08.766 17:18:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2694373' 00:06:08.766 killing process with pid 2694373 00:06:08.766 17:18:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 2694373 00:06:08.766 17:18:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 2694373 00:06:09.027 17:18:20 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=2694664 00:06:09.027 17:18:20 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:06:09.027 17:18:20 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 2694664 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 2694664 ']' 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 2694664 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2694664 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2694664' 00:06:14.314 killing process with pid 2694664 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 2694664 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 2694664 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:14.314 00:06:14.314 real 0m6.584s 00:06:14.314 user 0m6.481s 00:06:14.314 sys 0m0.558s 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:14.314 
17:18:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:14.314 ************************************ 00:06:14.314 END TEST skip_rpc_with_json 00:06:14.314 ************************************ 00:06:14.314 17:18:25 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:14.314 17:18:25 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:06:14.314 17:18:25 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:14.314 17:18:25 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:14.314 17:18:25 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:14.314 ************************************ 00:06:14.314 START TEST skip_rpc_with_delay 00:06:14.314 ************************************ 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_delay 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@648 -- # local es=0 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:14.314 [2024-07-15 17:18:25.588315] app.c: 832:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
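The spdk_app_start error above is the whole point of the skip_rpc_with_delay case: combining --no-rpc-server with --wait-for-rpc must be rejected before the app comes up. A sketch of the check the test wraps in NOT, using the same binary and flags as the trace:

  if ./build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc; then
      echo 'FAIL: --wait-for-rpc accepted although no RPC server will be started'
  fi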
00:06:14.314 [2024-07-15 17:18:25.588389] app.c: 711:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # es=1 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:14.314 00:06:14.314 real 0m0.089s 00:06:14.314 user 0m0.061s 00:06:14.314 sys 0m0.027s 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:14.314 17:18:25 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:06:14.314 ************************************ 00:06:14.314 END TEST skip_rpc_with_delay 00:06:14.314 ************************************ 00:06:14.574 17:18:25 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:14.574 17:18:25 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:06:14.574 17:18:25 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:06:14.574 17:18:25 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:06:14.574 17:18:25 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:14.574 17:18:25 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:14.574 17:18:25 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:14.574 ************************************ 00:06:14.574 START TEST exit_on_failed_rpc_init 00:06:14.574 ************************************ 00:06:14.574 17:18:25 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1123 -- # test_exit_on_failed_rpc_init 00:06:14.574 17:18:25 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=2695636 00:06:14.574 17:18:25 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 2695636 00:06:14.574 17:18:25 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:14.574 17:18:25 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@829 -- # '[' -z 2695636 ']' 00:06:14.574 17:18:25 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:14.574 17:18:25 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:14.574 17:18:25 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:14.574 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:14.574 17:18:25 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:14.574 17:18:25 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:14.574 [2024-07-15 17:18:25.746039] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:06:14.574 [2024-07-15 17:18:25.746093] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2695636 ] 00:06:14.574 [2024-07-15 17:18:25.836572] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:14.834 [2024-07-15 17:18:25.903617] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:15.402 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:15.402 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@862 -- # return 0 00:06:15.402 17:18:26 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:15.402 17:18:26 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:15.402 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@648 -- # local es=0 00:06:15.402 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:15.402 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:15.402 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:15.402 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:15.402 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:15.402 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:15.402 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:15.402 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:15.402 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:15.402 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:15.402 [2024-07-15 17:18:26.645077] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:06:15.402 [2024-07-15 17:18:26.645124] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2695855 ] 00:06:15.661 [2024-07-15 17:18:26.718392] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:15.661 [2024-07-15 17:18:26.788339] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:15.661 [2024-07-15 17:18:26.788419] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
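exit_on_failed_rpc_init provokes exactly the failure above: a second target trying to bind the RPC socket the first one already owns. A condensed sketch of the two launches the trace shows, assuming the first instance is given time to bind /var/tmp/spdk.sock:

  ./build/bin/spdk_tgt -m 0x1 &                # first instance owns /var/tmp/spdk.sock
  # (the test waits for the socket with waitforlisten before continuing)
  if ./build/bin/spdk_tgt -m 0x2; then         # must fail: 'socket ... in use. Specify another.'
      echo 'FAIL: second target bound an RPC socket that is already taken'
  fi
  kill %1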
00:06:15.661 [2024-07-15 17:18:26.788434] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:06:15.661 [2024-07-15 17:18:26.788441] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:15.661 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # es=234 00:06:15.661 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:15.661 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # es=106 00:06:15.661 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # case "$es" in 00:06:15.661 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@668 -- # es=1 00:06:15.661 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:15.661 17:18:26 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:06:15.661 17:18:26 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 2695636 00:06:15.661 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@948 -- # '[' -z 2695636 ']' 00:06:15.661 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@952 -- # kill -0 2695636 00:06:15.661 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # uname 00:06:15.661 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:15.661 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2695636 00:06:15.661 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:15.661 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:15.661 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2695636' 00:06:15.661 killing process with pid 2695636 00:06:15.661 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@967 -- # kill 2695636 00:06:15.661 17:18:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # wait 2695636 00:06:15.922 00:06:15.922 real 0m1.425s 00:06:15.922 user 0m1.723s 00:06:15.922 sys 0m0.385s 00:06:15.922 17:18:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:15.922 17:18:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:15.922 ************************************ 00:06:15.922 END TEST exit_on_failed_rpc_init 00:06:15.922 ************************************ 00:06:15.922 17:18:27 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:15.922 17:18:27 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:15.922 00:06:15.922 real 0m13.781s 00:06:15.922 user 0m13.450s 00:06:15.922 sys 0m1.509s 00:06:15.922 17:18:27 skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:15.922 17:18:27 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:15.922 ************************************ 00:06:15.922 END TEST skip_rpc 00:06:15.922 ************************************ 00:06:15.922 17:18:27 -- common/autotest_common.sh@1142 -- # return 0 00:06:15.922 17:18:27 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:15.922 17:18:27 -- 
common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:15.922 17:18:27 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:15.922 17:18:27 -- common/autotest_common.sh@10 -- # set +x 00:06:16.182 ************************************ 00:06:16.182 START TEST rpc_client 00:06:16.182 ************************************ 00:06:16.182 17:18:27 rpc_client -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:16.182 * Looking for test storage... 00:06:16.182 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:06:16.182 17:18:27 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:06:16.182 OK 00:06:16.182 17:18:27 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:06:16.182 00:06:16.182 real 0m0.129s 00:06:16.182 user 0m0.069s 00:06:16.182 sys 0m0.068s 00:06:16.183 17:18:27 rpc_client -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:16.183 17:18:27 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:06:16.183 ************************************ 00:06:16.183 END TEST rpc_client 00:06:16.183 ************************************ 00:06:16.183 17:18:27 -- common/autotest_common.sh@1142 -- # return 0 00:06:16.183 17:18:27 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:06:16.183 17:18:27 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:16.183 17:18:27 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:16.183 17:18:27 -- common/autotest_common.sh@10 -- # set +x 00:06:16.183 ************************************ 00:06:16.183 START TEST json_config 00:06:16.183 ************************************ 00:06:16.183 17:18:27 json_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:06:16.444 17:18:27 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:06:16.444 17:18:27 json_config -- nvmf/common.sh@7 -- # uname -s 00:06:16.444 17:18:27 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:16.444 17:18:27 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:16.444 17:18:27 json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:16.444 17:18:27 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:16.444 17:18:27 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:16.444 17:18:27 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:16.444 17:18:27 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:16.444 17:18:27 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:16.444 17:18:27 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:16.444 17:18:27 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:16.444 17:18:27 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:06:16.444 17:18:27 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:06:16.444 17:18:27 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:16.444 17:18:27 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:16.444 17:18:27 json_config -- 
nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:16.444 17:18:27 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:16.444 17:18:27 json_config -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:06:16.444 17:18:27 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:16.444 17:18:27 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:16.444 17:18:27 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:16.444 17:18:27 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:16.444 17:18:27 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:16.444 17:18:27 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:16.444 17:18:27 json_config -- paths/export.sh@5 -- # export PATH 00:06:16.444 17:18:27 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:16.444 17:18:27 json_config -- nvmf/common.sh@47 -- # : 0 00:06:16.444 17:18:27 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:16.444 17:18:27 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:16.444 17:18:27 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:16.444 17:18:27 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:16.444 17:18:27 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:16.444 17:18:27 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:16.444 17:18:27 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:16.444 17:18:27 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:16.444 17:18:27 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:06:16.444 17:18:27 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:06:16.444 17:18:27 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:06:16.444 
17:18:27 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:06:16.444 17:18:27 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:06:16.444 17:18:27 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:06:16.444 17:18:27 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:06:16.444 17:18:27 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:06:16.444 17:18:27 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:06:16.444 17:18:27 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:06:16.444 17:18:27 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:06:16.444 17:18:27 json_config -- json_config/json_config.sh@34 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:06:16.444 17:18:27 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:06:16.444 17:18:27 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:06:16.444 17:18:27 json_config -- json_config/json_config.sh@355 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:16.444 17:18:27 json_config -- json_config/json_config.sh@356 -- # echo 'INFO: JSON configuration test init' 00:06:16.444 INFO: JSON configuration test init 00:06:16.444 17:18:27 json_config -- json_config/json_config.sh@357 -- # json_config_test_init 00:06:16.444 17:18:27 json_config -- json_config/json_config.sh@262 -- # timing_enter json_config_test_init 00:06:16.445 17:18:27 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:16.445 17:18:27 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:16.445 17:18:27 json_config -- json_config/json_config.sh@263 -- # timing_enter json_config_setup_target 00:06:16.445 17:18:27 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:16.445 17:18:27 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:16.445 17:18:27 json_config -- json_config/json_config.sh@265 -- # json_config_test_start_app target --wait-for-rpc 00:06:16.445 17:18:27 json_config -- json_config/common.sh@9 -- # local app=target 00:06:16.445 17:18:27 json_config -- json_config/common.sh@10 -- # shift 00:06:16.445 17:18:27 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:16.445 17:18:27 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:16.445 17:18:27 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:16.445 17:18:27 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:16.445 17:18:27 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:16.445 17:18:27 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=2696055 00:06:16.445 17:18:27 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:16.445 Waiting for target to run... 
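The json_config prologue traced below boils down to starting a target on a dedicated socket and steering the encrypt/decrypt opcodes to the DPDK cryptodev module before framework init. A condensed sketch of that launch and the first RPCs, with the flags and socket taken from the app_params/app_socket declarations above:

  ./build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc &
  # (the harness waits for the socket before issuing RPCs)
  ./scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module
  ./scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev
  ./scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev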
00:06:16.445 17:18:27 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:06:16.445 17:18:27 json_config -- json_config/common.sh@25 -- # waitforlisten 2696055 /var/tmp/spdk_tgt.sock 00:06:16.445 17:18:27 json_config -- common/autotest_common.sh@829 -- # '[' -z 2696055 ']' 00:06:16.445 17:18:27 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:16.445 17:18:27 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:16.445 17:18:27 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:16.445 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:16.445 17:18:27 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:16.445 17:18:27 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:16.445 [2024-07-15 17:18:27.598430] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:06:16.445 [2024-07-15 17:18:27.598503] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2696055 ] 00:06:16.704 [2024-07-15 17:18:27.914953] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:16.704 [2024-07-15 17:18:27.966528] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:17.281 17:18:28 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:17.281 17:18:28 json_config -- common/autotest_common.sh@862 -- # return 0 00:06:17.281 17:18:28 json_config -- json_config/common.sh@26 -- # echo '' 00:06:17.281 00:06:17.281 17:18:28 json_config -- json_config/json_config.sh@269 -- # create_accel_config 00:06:17.281 17:18:28 json_config -- json_config/json_config.sh@93 -- # timing_enter create_accel_config 00:06:17.281 17:18:28 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:17.281 17:18:28 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:17.281 17:18:28 json_config -- json_config/json_config.sh@95 -- # [[ 1 -eq 1 ]] 00:06:17.281 17:18:28 json_config -- json_config/json_config.sh@96 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:06:17.282 17:18:28 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:06:17.541 17:18:28 json_config -- json_config/json_config.sh@97 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:06:17.541 17:18:28 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:06:17.800 [2024-07-15 17:18:28.897097] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:06:17.800 17:18:28 json_config -- json_config/json_config.sh@98 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:06:17.800 17:18:28 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:06:18.371 [2024-07-15 17:18:29.422397] accel_rpc.c: 
167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:06:18.371 17:18:29 json_config -- json_config/json_config.sh@101 -- # timing_exit create_accel_config 00:06:18.371 17:18:29 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:18.371 17:18:29 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:18.371 17:18:29 json_config -- json_config/json_config.sh@273 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:06:18.371 17:18:29 json_config -- json_config/json_config.sh@274 -- # tgt_rpc load_config 00:06:18.371 17:18:29 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:06:18.631 [2024-07-15 17:18:29.690830] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:23.975 17:18:34 json_config -- json_config/json_config.sh@276 -- # tgt_check_notification_types 00:06:23.975 17:18:34 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:06:23.975 17:18:34 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:23.975 17:18:34 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:23.975 17:18:34 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:06:23.975 17:18:34 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:06:23.975 17:18:34 json_config -- json_config/json_config.sh@46 -- # local enabled_types 00:06:23.975 17:18:34 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:06:23.975 17:18:34 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:06:23.975 17:18:34 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:06:23.975 17:18:34 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:06:23.975 17:18:34 json_config -- json_config/json_config.sh@48 -- # local get_types 00:06:23.975 17:18:34 json_config -- json_config/json_config.sh@49 -- # [[ bdev_register bdev_unregister != \b\d\e\v\_\r\e\g\i\s\t\e\r\ \b\d\e\v\_\u\n\r\e\g\i\s\t\e\r ]] 00:06:23.975 17:18:34 json_config -- json_config/json_config.sh@54 -- # timing_exit tgt_check_notification_types 00:06:23.975 17:18:34 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:23.975 17:18:34 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:23.975 17:18:34 json_config -- json_config/json_config.sh@55 -- # return 0 00:06:23.975 17:18:34 json_config -- json_config/json_config.sh@278 -- # [[ 1 -eq 1 ]] 00:06:23.975 17:18:34 json_config -- json_config/json_config.sh@279 -- # create_bdev_subsystem_config 00:06:23.975 17:18:34 json_config -- json_config/json_config.sh@105 -- # timing_enter create_bdev_subsystem_config 00:06:23.975 17:18:34 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:23.975 17:18:34 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:23.975 17:18:34 json_config -- json_config/json_config.sh@107 -- # expected_notifications=() 00:06:23.975 17:18:34 json_config -- json_config/json_config.sh@107 -- # local expected_notifications 00:06:23.975 17:18:34 json_config -- json_config/json_config.sh@111 -- # expected_notifications+=($(get_notifications)) 00:06:23.975 17:18:34 json_config -- json_config/json_config.sh@111 -- # 
get_notifications 00:06:23.975 17:18:34 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:06:23.975 17:18:34 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:23.975 17:18:34 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:23.975 17:18:34 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:06:23.975 17:18:34 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:06:23.975 17:18:34 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:06:23.975 17:18:35 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:06:23.975 17:18:35 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:23.975 17:18:35 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:23.975 17:18:35 json_config -- json_config/json_config.sh@113 -- # [[ 1 -eq 1 ]] 00:06:23.975 17:18:35 json_config -- json_config/json_config.sh@114 -- # local lvol_store_base_bdev=Nvme0n1 00:06:23.975 17:18:35 json_config -- json_config/json_config.sh@116 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:06:23.975 17:18:35 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:06:24.235 Nvme0n1p0 Nvme0n1p1 00:06:24.235 17:18:35 json_config -- json_config/json_config.sh@117 -- # tgt_rpc bdev_split_create Malloc0 3 00:06:24.235 17:18:35 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:06:24.495 [2024-07-15 17:18:35.549465] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:24.495 [2024-07-15 17:18:35.549503] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:24.495 00:06:24.495 17:18:35 json_config -- json_config/json_config.sh@118 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:06:24.495 17:18:35 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:06:24.495 Malloc3 00:06:24.495 17:18:35 json_config -- json_config/json_config.sh@119 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:24.496 17:18:35 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:24.755 [2024-07-15 17:18:35.954563] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:24.755 [2024-07-15 17:18:35.954594] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:24.755 [2024-07-15 17:18:35.954607] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc49b20 00:06:24.755 [2024-07-15 17:18:35.954613] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:24.755 [2024-07-15 17:18:35.955828] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:24.755 [2024-07-15 17:18:35.955847] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:06:24.755 PTBdevFromMalloc3 00:06:24.755 17:18:35 json_config 
-- json_config/json_config.sh@121 -- # tgt_rpc bdev_null_create Null0 32 512 00:06:24.755 17:18:35 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:06:25.324 Null0 00:06:25.324 17:18:36 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:06:25.324 17:18:36 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:06:25.584 Malloc0 00:06:25.584 17:18:36 json_config -- json_config/json_config.sh@124 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:06:25.584 17:18:36 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:06:25.584 Malloc1 00:06:25.844 17:18:36 json_config -- json_config/json_config.sh@137 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:06:25.844 17:18:36 json_config -- json_config/json_config.sh@140 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:06:25.844 102400+0 records in 00:06:25.844 102400+0 records out 00:06:25.844 104857600 bytes (105 MB, 100 MiB) copied, 0.122495 s, 856 MB/s 00:06:25.844 17:18:37 json_config -- json_config/json_config.sh@141 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:06:25.845 17:18:37 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:06:26.105 aio_disk 00:06:26.105 17:18:37 json_config -- json_config/json_config.sh@142 -- # expected_notifications+=(bdev_register:aio_disk) 00:06:26.105 17:18:37 json_config -- json_config/json_config.sh@147 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:06:26.105 17:18:37 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:06:31.388 bd1ad06f-5f3a-41a7-bd6e-5dea4895cf33 00:06:31.388 17:18:41 json_config -- json_config/json_config.sh@154 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:06:31.388 17:18:41 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:06:31.388 17:18:41 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:06:31.388 17:18:42 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:06:31.388 17:18:42 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:06:31.648 17:18:42 json_config -- json_config/json_config.sh@154 -- # tgt_rpc 
bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:06:31.648 17:18:42 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:06:32.217 17:18:43 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:06:32.217 17:18:43 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:06:32.784 17:18:43 json_config -- json_config/json_config.sh@157 -- # [[ 1 -eq 1 ]] 00:06:32.784 17:18:43 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:06:32.784 17:18:43 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:06:33.352 MallocForCryptoBdev 00:06:33.352 17:18:44 json_config -- json_config/json_config.sh@159 -- # lspci -d:37c8 00:06:33.352 17:18:44 json_config -- json_config/json_config.sh@159 -- # wc -l 00:06:33.352 17:18:44 json_config -- json_config/json_config.sh@159 -- # [[ 3 -eq 0 ]] 00:06:33.352 17:18:44 json_config -- json_config/json_config.sh@162 -- # local crypto_driver=crypto_qat 00:06:33.352 17:18:44 json_config -- json_config/json_config.sh@165 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:06:33.352 17:18:44 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:06:33.920 [2024-07-15 17:18:44.946704] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:06:33.920 CryptoMallocBdev 00:06:33.920 17:18:44 json_config -- json_config/json_config.sh@169 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:06:33.920 17:18:44 json_config -- json_config/json_config.sh@172 -- # [[ 0 -eq 1 ]] 00:06:33.920 17:18:44 json_config -- json_config/json_config.sh@178 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:3b04feed-1018-42b5-9ec2-540e7043bc06 bdev_register:b269e489-fccb-4e49-9a6c-1817f00a6abc bdev_register:adee328c-746e-4aa0-a011-4b75197f3f0a bdev_register:242b1f70-a14a-48c0-ba92-7f3cf0322a02 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:06:33.920 17:18:44 json_config -- json_config/json_config.sh@67 -- # local events_to_check 00:06:33.920 17:18:44 json_config -- json_config/json_config.sh@68 -- # local recorded_events 00:06:33.920 17:18:44 json_config -- json_config/json_config.sh@71 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:06:33.920 17:18:44 json_config -- json_config/json_config.sh@71 -- # sort 00:06:33.920 17:18:44 json_config -- json_config/json_config.sh@71 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 
bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:3b04feed-1018-42b5-9ec2-540e7043bc06 bdev_register:b269e489-fccb-4e49-9a6c-1817f00a6abc bdev_register:adee328c-746e-4aa0-a011-4b75197f3f0a bdev_register:242b1f70-a14a-48c0-ba92-7f3cf0322a02 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:06:33.920 17:18:44 json_config -- json_config/json_config.sh@72 -- # recorded_events=($(get_notifications | sort)) 00:06:33.920 17:18:44 json_config -- json_config/json_config.sh@72 -- # get_notifications 00:06:33.920 17:18:44 json_config -- json_config/json_config.sh@72 -- # sort 00:06:33.920 17:18:44 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:06:33.920 17:18:44 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:33.920 17:18:44 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:33.920 17:18:44 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:06:33.920 17:18:44 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:06:33.920 17:18:44 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p1 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p0 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc3 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:PTBdevFromMalloc3 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Null0 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p2 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:33.920 17:18:45 json_config 
-- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p1 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p0 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc1 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:aio_disk 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:3b04feed-1018-42b5-9ec2-540e7043bc06 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:b269e489-fccb-4e49-9a6c-1817f00a6abc 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:adee328c-746e-4aa0-a011-4b75197f3f0a 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:242b1f70-a14a-48c0-ba92-7f3cf0322a02 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:MallocForCryptoBdev 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:CryptoMallocBdev 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@74 -- # [[ bdev_register:242b1f70-a14a-48c0-ba92-7f3cf0322a02 bdev_register:3b04feed-1018-42b5-9ec2-540e7043bc06 bdev_register:adee328c-746e-4aa0-a011-4b75197f3f0a bdev_register:aio_disk bdev_register:b269e489-fccb-4e49-9a6c-1817f00a6abc bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 
bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != \b\d\e\v\_\r\e\g\i\s\t\e\r\:\2\4\2\b\1\f\7\0\-\a\1\4\a\-\4\8\c\0\-\b\a\9\2\-\7\f\3\c\f\0\3\2\2\a\0\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\3\b\0\4\f\e\e\d\-\1\0\1\8\-\4\2\b\5\-\9\e\c\2\-\5\4\0\e\7\0\4\3\b\c\0\6\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\d\e\e\3\2\8\c\-\7\4\6\e\-\4\a\a\0\-\a\0\1\1\-\4\b\7\5\1\9\7\f\3\f\0\a\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\b\2\6\9\e\4\8\9\-\f\c\c\b\-\4\e\4\9\-\9\a\6\c\-\1\8\1\7\f\0\0\a\6\a\b\c\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@86 -- # cat 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@86 -- # printf ' %s\n' bdev_register:242b1f70-a14a-48c0-ba92-7f3cf0322a02 bdev_register:3b04feed-1018-42b5-9ec2-540e7043bc06 bdev_register:adee328c-746e-4aa0-a011-4b75197f3f0a bdev_register:aio_disk bdev_register:b269e489-fccb-4e49-9a6c-1817f00a6abc bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:06:33.920 Expected events matched: 00:06:33.920 bdev_register:242b1f70-a14a-48c0-ba92-7f3cf0322a02 00:06:33.920 bdev_register:3b04feed-1018-42b5-9ec2-540e7043bc06 00:06:33.920 bdev_register:adee328c-746e-4aa0-a011-4b75197f3f0a 00:06:33.920 bdev_register:aio_disk 00:06:33.920 bdev_register:b269e489-fccb-4e49-9a6c-1817f00a6abc 00:06:33.920 bdev_register:CryptoMallocBdev 00:06:33.920 bdev_register:Malloc0 00:06:33.920 bdev_register:Malloc0p0 00:06:33.920 bdev_register:Malloc0p1 00:06:33.920 bdev_register:Malloc0p2 00:06:33.920 bdev_register:Malloc1 00:06:33.920 bdev_register:Malloc3 00:06:33.920 bdev_register:MallocForCryptoBdev 00:06:33.920 bdev_register:Null0 00:06:33.920 bdev_register:Nvme0n1 00:06:33.920 bdev_register:Nvme0n1p0 00:06:33.920 bdev_register:Nvme0n1p1 00:06:33.920 bdev_register:PTBdevFromMalloc3 00:06:33.920 17:18:45 json_config -- json_config/json_config.sh@180 -- # timing_exit create_bdev_subsystem_config 00:06:33.920 17:18:45 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:33.920 17:18:45 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:34.180 17:18:45 json_config -- json_config/json_config.sh@282 -- # [[ 0 -eq 1 ]] 00:06:34.180 17:18:45 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:06:34.180 17:18:45 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:06:34.180 17:18:45 json_config -- json_config/json_config.sh@293 -- # timing_exit json_config_setup_target 00:06:34.180 17:18:45 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:34.180 17:18:45 json_config -- common/autotest_common.sh@10 
-- # set +x 00:06:34.180 17:18:45 json_config -- json_config/json_config.sh@295 -- # [[ 0 -eq 1 ]] 00:06:34.180 17:18:45 json_config -- json_config/json_config.sh@300 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:06:34.180 17:18:45 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:06:34.180 MallocBdevForConfigChangeCheck 00:06:34.180 17:18:45 json_config -- json_config/json_config.sh@302 -- # timing_exit json_config_test_init 00:06:34.180 17:18:45 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:34.180 17:18:45 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:34.438 17:18:45 json_config -- json_config/json_config.sh@359 -- # tgt_rpc save_config 00:06:34.438 17:18:45 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:34.696 17:18:45 json_config -- json_config/json_config.sh@361 -- # echo 'INFO: shutting down applications...' 00:06:34.696 INFO: shutting down applications... 00:06:34.696 17:18:45 json_config -- json_config/json_config.sh@362 -- # [[ 0 -eq 1 ]] 00:06:34.696 17:18:45 json_config -- json_config/json_config.sh@368 -- # json_config_clear target 00:06:34.696 17:18:45 json_config -- json_config/json_config.sh@332 -- # [[ -n 22 ]] 00:06:34.697 17:18:45 json_config -- json_config/json_config.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:06:34.697 [2024-07-15 17:18:45.985749] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:06:37.990 Calling clear_iscsi_subsystem 00:06:37.990 Calling clear_nvmf_subsystem 00:06:37.990 Calling clear_nbd_subsystem 00:06:37.990 Calling clear_ublk_subsystem 00:06:37.990 Calling clear_vhost_blk_subsystem 00:06:37.990 Calling clear_vhost_scsi_subsystem 00:06:37.990 Calling clear_bdev_subsystem 00:06:37.990 17:18:48 json_config -- json_config/json_config.sh@337 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:06:37.990 17:18:48 json_config -- json_config/json_config.sh@343 -- # count=100 00:06:37.990 17:18:48 json_config -- json_config/json_config.sh@344 -- # '[' 100 -gt 0 ']' 00:06:37.990 17:18:48 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:37.990 17:18:48 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:06:37.990 17:18:48 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:06:37.990 17:18:48 json_config -- json_config/json_config.sh@345 -- # break 00:06:37.990 17:18:48 json_config -- json_config/json_config.sh@350 -- # '[' 100 -eq 0 ']' 00:06:37.990 17:18:48 json_config -- json_config/json_config.sh@369 -- # json_config_test_shutdown_app target 00:06:37.990 17:18:48 json_config -- json_config/common.sh@31 -- # local app=target 00:06:37.990 17:18:48 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:37.990 17:18:48 json_config -- json_config/common.sh@35 -- # [[ -n 2696055 ]] 00:06:37.990 
17:18:48 json_config -- json_config/common.sh@38 -- # kill -SIGINT 2696055 00:06:37.990 17:18:48 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:37.990 17:18:48 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:37.990 17:18:48 json_config -- json_config/common.sh@41 -- # kill -0 2696055 00:06:37.990 17:18:48 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:06:38.250 17:18:49 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:06:38.250 17:18:49 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:38.250 17:18:49 json_config -- json_config/common.sh@41 -- # kill -0 2696055 00:06:38.250 17:18:49 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:38.250 17:18:49 json_config -- json_config/common.sh@43 -- # break 00:06:38.250 17:18:49 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:38.250 17:18:49 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:38.250 SPDK target shutdown done 00:06:38.250 17:18:49 json_config -- json_config/json_config.sh@371 -- # echo 'INFO: relaunching applications...' 00:06:38.250 INFO: relaunching applications... 00:06:38.250 17:18:49 json_config -- json_config/json_config.sh@372 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:38.250 17:18:49 json_config -- json_config/common.sh@9 -- # local app=target 00:06:38.250 17:18:49 json_config -- json_config/common.sh@10 -- # shift 00:06:38.250 17:18:49 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:38.250 17:18:49 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:38.250 17:18:49 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:38.250 17:18:49 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:38.250 17:18:49 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:38.250 17:18:49 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=2699883 00:06:38.250 17:18:49 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:38.250 Waiting for target to run... 00:06:38.250 17:18:49 json_config -- json_config/common.sh@25 -- # waitforlisten 2699883 /var/tmp/spdk_tgt.sock 00:06:38.250 17:18:49 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:38.250 17:18:49 json_config -- common/autotest_common.sh@829 -- # '[' -z 2699883 ']' 00:06:38.250 17:18:49 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:38.250 17:18:49 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:38.250 17:18:49 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:38.250 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:38.250 17:18:49 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:38.250 17:18:49 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:38.250 [2024-07-15 17:18:49.480359] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
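The relaunch traced above follows a small, reusable recipe: start spdk_tgt from the JSON configuration saved earlier and poll its RPC socket until it answers. A minimal sketch of that recipe in shell, reusing the paths and options visible in this run (the 60-iteration retry budget is an arbitrary choice, not taken from the harness):

  # Relaunch spdk_tgt from a saved JSON config and wait for its RPC socket (sketch).
  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  SOCK=/var/tmp/spdk_tgt.sock
  "$SPDK/build/bin/spdk_tgt" -m 0x1 -s 1024 -r "$SOCK" --json "$SPDK/spdk_tgt_config.json" &
  tgt_pid=$!
  # rpc_get_methods succeeds only once the target is up and listening on $SOCK.
  for _ in $(seq 1 60); do
      "$SPDK/scripts/rpc.py" -s "$SOCK" rpc_get_methods >/dev/null 2>&1 && break
      sleep 0.5
  done
  echo "spdk_tgt (pid $tgt_pid) is serving RPCs on $SOCK"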
00:06:38.251 [2024-07-15 17:18:49.480410] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2699883 ] 00:06:38.821 [2024-07-15 17:18:49.819883] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:38.821 [2024-07-15 17:18:49.882357] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:38.821 [2024-07-15 17:18:49.936217] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:06:38.821 [2024-07-15 17:18:49.944250] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:06:38.821 [2024-07-15 17:18:49.952267] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:06:38.821 [2024-07-15 17:18:50.032459] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:41.392 [2024-07-15 17:18:52.167988] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:41.392 [2024-07-15 17:18:52.168036] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:06:41.392 [2024-07-15 17:18:52.168044] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:41.392 [2024-07-15 17:18:52.176001] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:06:41.392 [2024-07-15 17:18:52.176019] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:06:41.392 [2024-07-15 17:18:52.184018] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:41.392 [2024-07-15 17:18:52.184034] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:41.392 [2024-07-15 17:18:52.192049] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:06:41.392 [2024-07-15 17:18:52.192067] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:06:41.392 [2024-07-15 17:18:52.192073] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:43.980 [2024-07-15 17:18:55.047368] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:43.980 [2024-07-15 17:18:55.047403] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:43.980 [2024-07-15 17:18:55.047413] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xca7b20 00:06:43.980 [2024-07-15 17:18:55.047419] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:43.980 [2024-07-15 17:18:55.047647] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:43.980 [2024-07-15 17:18:55.047658] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:06:44.240 17:18:55 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:44.240 17:18:55 json_config -- common/autotest_common.sh@862 -- # return 0 00:06:44.240 17:18:55 json_config -- json_config/common.sh@26 -- # echo '' 00:06:44.240 00:06:44.240 17:18:55 json_config -- json_config/json_config.sh@373 -- # [[ 0 -eq 1 ]] 00:06:44.240 17:18:55 json_config -- 
json_config/json_config.sh@377 -- # echo 'INFO: Checking if target configuration is the same...' 00:06:44.240 INFO: Checking if target configuration is the same... 00:06:44.240 17:18:55 json_config -- json_config/json_config.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:44.240 17:18:55 json_config -- json_config/json_config.sh@378 -- # tgt_rpc save_config 00:06:44.240 17:18:55 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:44.240 + '[' 2 -ne 2 ']' 00:06:44.240 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:06:44.240 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:06:44.240 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:06:44.240 +++ basename /dev/fd/62 00:06:44.240 ++ mktemp /tmp/62.XXX 00:06:44.240 + tmp_file_1=/tmp/62.T1t 00:06:44.240 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:44.240 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:06:44.240 + tmp_file_2=/tmp/spdk_tgt_config.json.Guy 00:06:44.240 + ret=0 00:06:44.240 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:44.501 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:44.501 + diff -u /tmp/62.T1t /tmp/spdk_tgt_config.json.Guy 00:06:44.762 + echo 'INFO: JSON config files are the same' 00:06:44.762 INFO: JSON config files are the same 00:06:44.762 + rm /tmp/62.T1t /tmp/spdk_tgt_config.json.Guy 00:06:44.762 + exit 0 00:06:44.762 17:18:55 json_config -- json_config/json_config.sh@379 -- # [[ 0 -eq 1 ]] 00:06:44.762 17:18:55 json_config -- json_config/json_config.sh@384 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:06:44.762 INFO: changing configuration and checking if this can be detected... 00:06:44.762 17:18:55 json_config -- json_config/json_config.sh@386 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:06:44.762 17:18:55 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:06:44.762 17:18:55 json_config -- json_config/json_config.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:44.762 17:18:55 json_config -- json_config/json_config.sh@387 -- # tgt_rpc save_config 00:06:44.762 17:18:55 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:44.762 + '[' 2 -ne 2 ']' 00:06:44.762 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:06:44.762 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
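The "JSON config files are the same" check above, and the "changing configuration" check that continues below, both reduce to the same three steps: dump the live configuration with save_config, normalize both JSON documents with config_filter.py -method sort, and diff the results. A rough stand-alone equivalent, assuming config_filter.py reads its JSON on stdin as the redirections in json_diff.sh suggest (the temp-file names here are placeholders, not the mktemp names from this run):

  # Compare the running target's config against the saved reference (sketch).
  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  SOCK=/var/tmp/spdk_tgt.sock
  "$SPDK/scripts/rpc.py" -s "$SOCK" save_config > /tmp/live_config.json
  "$SPDK/test/json_config/config_filter.py" -method sort < /tmp/live_config.json > /tmp/live_sorted.json
  "$SPDK/test/json_config/config_filter.py" -method sort < "$SPDK/spdk_tgt_config.json" > /tmp/ref_sorted.json
  diff -u /tmp/ref_sorted.json /tmp/live_sorted.json && echo 'INFO: JSON config files are the same'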
00:06:44.762 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:06:44.762 +++ basename /dev/fd/62 00:06:44.762 ++ mktemp /tmp/62.XXX 00:06:44.762 + tmp_file_1=/tmp/62.v8r 00:06:44.762 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:44.762 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:06:44.762 + tmp_file_2=/tmp/spdk_tgt_config.json.GXy 00:06:44.762 + ret=0 00:06:44.762 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:45.023 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:45.284 + diff -u /tmp/62.v8r /tmp/spdk_tgt_config.json.GXy 00:06:45.284 + ret=1 00:06:45.284 + echo '=== Start of file: /tmp/62.v8r ===' 00:06:45.284 + cat /tmp/62.v8r 00:06:45.284 + echo '=== End of file: /tmp/62.v8r ===' 00:06:45.284 + echo '' 00:06:45.284 + echo '=== Start of file: /tmp/spdk_tgt_config.json.GXy ===' 00:06:45.284 + cat /tmp/spdk_tgt_config.json.GXy 00:06:45.284 + echo '=== End of file: /tmp/spdk_tgt_config.json.GXy ===' 00:06:45.284 + echo '' 00:06:45.284 + rm /tmp/62.v8r /tmp/spdk_tgt_config.json.GXy 00:06:45.284 + exit 1 00:06:45.284 17:18:56 json_config -- json_config/json_config.sh@391 -- # echo 'INFO: configuration change detected.' 00:06:45.284 INFO: configuration change detected. 00:06:45.284 17:18:56 json_config -- json_config/json_config.sh@394 -- # json_config_test_fini 00:06:45.284 17:18:56 json_config -- json_config/json_config.sh@306 -- # timing_enter json_config_test_fini 00:06:45.284 17:18:56 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:45.284 17:18:56 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:45.284 17:18:56 json_config -- json_config/json_config.sh@307 -- # local ret=0 00:06:45.284 17:18:56 json_config -- json_config/json_config.sh@309 -- # [[ -n '' ]] 00:06:45.284 17:18:56 json_config -- json_config/json_config.sh@317 -- # [[ -n 2699883 ]] 00:06:45.284 17:18:56 json_config -- json_config/json_config.sh@320 -- # cleanup_bdev_subsystem_config 00:06:45.284 17:18:56 json_config -- json_config/json_config.sh@184 -- # timing_enter cleanup_bdev_subsystem_config 00:06:45.284 17:18:56 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:45.284 17:18:56 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:45.284 17:18:56 json_config -- json_config/json_config.sh@186 -- # [[ 1 -eq 1 ]] 00:06:45.284 17:18:56 json_config -- json_config/json_config.sh@187 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:06:45.284 17:18:56 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:06:45.284 17:18:56 json_config -- json_config/json_config.sh@188 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:06:45.284 17:18:56 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:06:45.545 17:18:56 json_config -- json_config/json_config.sh@189 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:06:45.545 17:18:56 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/snapshot0 00:06:45.805 17:18:56 json_config -- json_config/json_config.sh@190 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:06:45.805 17:18:56 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:06:46.065 17:18:57 json_config -- json_config/json_config.sh@193 -- # uname -s 00:06:46.065 17:18:57 json_config -- json_config/json_config.sh@193 -- # [[ Linux = Linux ]] 00:06:46.065 17:18:57 json_config -- json_config/json_config.sh@194 -- # rm -f /sample_aio 00:06:46.065 17:18:57 json_config -- json_config/json_config.sh@197 -- # [[ 0 -eq 1 ]] 00:06:46.065 17:18:57 json_config -- json_config/json_config.sh@201 -- # timing_exit cleanup_bdev_subsystem_config 00:06:46.065 17:18:57 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:46.065 17:18:57 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:46.065 17:18:57 json_config -- json_config/json_config.sh@323 -- # killprocess 2699883 00:06:46.065 17:18:57 json_config -- common/autotest_common.sh@948 -- # '[' -z 2699883 ']' 00:06:46.065 17:18:57 json_config -- common/autotest_common.sh@952 -- # kill -0 2699883 00:06:46.065 17:18:57 json_config -- common/autotest_common.sh@953 -- # uname 00:06:46.065 17:18:57 json_config -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:46.065 17:18:57 json_config -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2699883 00:06:46.065 17:18:57 json_config -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:46.065 17:18:57 json_config -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:46.065 17:18:57 json_config -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2699883' 00:06:46.065 killing process with pid 2699883 00:06:46.065 17:18:57 json_config -- common/autotest_common.sh@967 -- # kill 2699883 00:06:46.065 17:18:57 json_config -- common/autotest_common.sh@972 -- # wait 2699883 00:06:48.609 17:18:59 json_config -- json_config/json_config.sh@326 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:48.609 17:18:59 json_config -- json_config/json_config.sh@327 -- # timing_exit json_config_test_fini 00:06:48.609 17:18:59 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:48.609 17:18:59 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:48.609 17:18:59 json_config -- json_config/json_config.sh@328 -- # return 0 00:06:48.609 17:18:59 json_config -- json_config/json_config.sh@396 -- # echo 'INFO: Success' 00:06:48.610 INFO: Success 00:06:48.610 00:06:48.610 real 0m32.383s 00:06:48.610 user 0m39.447s 00:06:48.610 sys 0m2.981s 00:06:48.610 17:18:59 json_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:48.610 17:18:59 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:48.610 ************************************ 00:06:48.610 END TEST json_config 00:06:48.610 ************************************ 00:06:48.610 17:18:59 -- common/autotest_common.sh@1142 -- # return 0 00:06:48.610 17:18:59 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:48.610 17:18:59 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:48.610 17:18:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:48.610 17:18:59 -- common/autotest_common.sh@10 -- # set +x 00:06:48.610 ************************************ 00:06:48.610 START TEST json_config_extra_key 00:06:48.610 ************************************ 00:06:48.610 17:18:59 
json_config_extra_key -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:48.871 17:18:59 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:06:48.871 17:18:59 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:06:48.871 17:18:59 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:48.871 17:18:59 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:48.871 17:18:59 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:48.871 17:18:59 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:48.871 17:18:59 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:48.871 17:18:59 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:48.871 17:18:59 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:48.871 17:18:59 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:48.871 17:18:59 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:48.871 17:18:59 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:48.871 17:18:59 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:06:48.871 17:18:59 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:06:48.871 17:18:59 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:48.871 17:18:59 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:48.871 17:18:59 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:48.871 17:18:59 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:48.871 17:18:59 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:06:48.871 17:18:59 json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:48.871 17:18:59 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:48.871 17:18:59 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:48.871 17:18:59 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:48.871 17:18:59 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:48.871 17:18:59 json_config_extra_key -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:48.871 17:18:59 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:06:48.871 17:18:59 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:48.871 17:18:59 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:06:48.871 17:18:59 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:48.871 17:18:59 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:48.872 17:18:59 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:48.872 17:18:59 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:48.872 17:18:59 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:48.872 17:18:59 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:48.872 17:18:59 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:48.872 17:18:59 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:48.872 17:18:59 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:06:48.872 17:18:59 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:06:48.872 17:18:59 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:06:48.872 17:18:59 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:06:48.872 17:18:59 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:06:48.872 17:18:59 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:06:48.872 17:18:59 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:06:48.872 17:18:59 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:06:48.872 17:18:59 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:06:48.872 17:18:59 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:48.872 17:18:59 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:06:48.872 INFO: launching applications... 
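The declare -A lines above show how json_config/common.sh keeps all per-application state (PID, RPC socket, extra spdk_tgt parameters, config path) in associative arrays keyed by an app name such as "target". A stripped-down illustration of that bookkeeping pattern; the values are copied from this trace and the final echo stands in for the real launch:

  # Per-app bookkeeping keyed by app name, mirroring json_config/common.sh (illustrative only).
  declare -A app_socket=([target]='/var/tmp/spdk_tgt.sock')
  declare -A app_params=([target]='-m 0x1 -s 1024')
  declare -A configs_path=([target]='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json')
  app=target
  echo "would run: spdk_tgt ${app_params[$app]} -r ${app_socket[$app]} --json ${configs_path[$app]}"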
00:06:48.872 17:18:59 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:06:48.872 17:18:59 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:06:48.872 17:18:59 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:06:48.872 17:18:59 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:48.872 17:18:59 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:48.872 17:18:59 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:06:48.872 17:18:59 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:48.872 17:18:59 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:48.872 17:18:59 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=2701832 00:06:48.872 17:18:59 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:48.872 Waiting for target to run... 00:06:48.872 17:18:59 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 2701832 /var/tmp/spdk_tgt.sock 00:06:48.872 17:18:59 json_config_extra_key -- common/autotest_common.sh@829 -- # '[' -z 2701832 ']' 00:06:48.872 17:18:59 json_config_extra_key -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:48.872 17:18:59 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:06:48.872 17:18:59 json_config_extra_key -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:48.872 17:18:59 json_config_extra_key -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:48.872 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:48.872 17:18:59 json_config_extra_key -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:48.872 17:18:59 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:48.872 [2024-07-15 17:19:00.057922] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:06:48.872 [2024-07-15 17:19:00.057981] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2701832 ] 00:06:49.133 [2024-07-15 17:19:00.394478] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.393 [2024-07-15 17:19:00.448747] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.654 17:19:00 json_config_extra_key -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:49.654 17:19:00 json_config_extra_key -- common/autotest_common.sh@862 -- # return 0 00:06:49.654 17:19:00 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:06:49.654 00:06:49.654 17:19:00 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:06:49.654 INFO: shutting down applications... 
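The shutdown that follows (json_config/common.sh lines 31 through 53 in this trace) sends SIGINT to the recorded PID and then polls it with kill -0 for up to 30 half-second intervals before reporting "SPDK target shutdown done". The same pattern in isolation, with the PID from this run substituted as a placeholder:

  # Graceful-shutdown wait, mirroring json_config_test_shutdown_app (sketch).
  pid=2701832                      # PID recorded for this run; use your own
  kill -SIGINT "$pid"
  for _ in $(seq 1 30); do
      kill -0 "$pid" 2>/dev/null || break   # kill -0 only checks that the process still exists
      sleep 0.5
  done
  kill -0 "$pid" 2>/dev/null || echo 'SPDK target shutdown done'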
00:06:49.654 17:19:00 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:06:49.654 17:19:00 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:06:49.654 17:19:00 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:49.654 17:19:00 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 2701832 ]] 00:06:49.654 17:19:00 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 2701832 00:06:49.654 17:19:00 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:49.654 17:19:00 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:49.654 17:19:00 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 2701832 00:06:49.654 17:19:00 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:06:50.224 17:19:01 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:06:50.224 17:19:01 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:50.224 17:19:01 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 2701832 00:06:50.224 17:19:01 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:50.224 17:19:01 json_config_extra_key -- json_config/common.sh@43 -- # break 00:06:50.224 17:19:01 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:50.224 17:19:01 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:50.224 SPDK target shutdown done 00:06:50.224 17:19:01 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:06:50.224 Success 00:06:50.224 00:06:50.224 real 0m1.497s 00:06:50.224 user 0m1.039s 00:06:50.224 sys 0m0.444s 00:06:50.224 17:19:01 json_config_extra_key -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:50.224 17:19:01 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:50.224 ************************************ 00:06:50.224 END TEST json_config_extra_key 00:06:50.224 ************************************ 00:06:50.224 17:19:01 -- common/autotest_common.sh@1142 -- # return 0 00:06:50.225 17:19:01 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:50.225 17:19:01 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:50.225 17:19:01 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:50.225 17:19:01 -- common/autotest_common.sh@10 -- # set +x 00:06:50.225 ************************************ 00:06:50.225 START TEST alias_rpc 00:06:50.225 ************************************ 00:06:50.225 17:19:01 alias_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:50.485 * Looking for test storage... 
00:06:50.485 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:06:50.485 17:19:01 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:50.485 17:19:01 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=2702185 00:06:50.485 17:19:01 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 2702185 00:06:50.485 17:19:01 alias_rpc -- common/autotest_common.sh@829 -- # '[' -z 2702185 ']' 00:06:50.485 17:19:01 alias_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:50.485 17:19:01 alias_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:50.485 17:19:01 alias_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:50.485 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:50.485 17:19:01 alias_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:50.485 17:19:01 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:50.485 17:19:01 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:50.485 [2024-07-15 17:19:01.674220] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:06:50.485 [2024-07-15 17:19:01.674348] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2702185 ] 00:06:50.745 [2024-07-15 17:19:01.816796] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:50.745 [2024-07-15 17:19:01.893044] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:51.685 17:19:02 alias_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:51.685 17:19:02 alias_rpc -- common/autotest_common.sh@862 -- # return 0 00:06:51.685 17:19:02 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:06:52.253 17:19:03 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 2702185 00:06:52.253 17:19:03 alias_rpc -- common/autotest_common.sh@948 -- # '[' -z 2702185 ']' 00:06:52.253 17:19:03 alias_rpc -- common/autotest_common.sh@952 -- # kill -0 2702185 00:06:52.253 17:19:03 alias_rpc -- common/autotest_common.sh@953 -- # uname 00:06:52.253 17:19:03 alias_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:52.253 17:19:03 alias_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2702185 00:06:52.253 17:19:03 alias_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:52.253 17:19:03 alias_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:52.253 17:19:03 alias_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2702185' 00:06:52.253 killing process with pid 2702185 00:06:52.253 17:19:03 alias_rpc -- common/autotest_common.sh@967 -- # kill 2702185 00:06:52.253 17:19:03 alias_rpc -- common/autotest_common.sh@972 -- # wait 2702185 00:06:52.512 00:06:52.512 real 0m2.174s 00:06:52.512 user 0m2.955s 00:06:52.512 sys 0m0.525s 00:06:52.512 17:19:03 alias_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:52.512 17:19:03 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:52.512 ************************************ 00:06:52.512 END TEST alias_rpc 
00:06:52.512 ************************************ 00:06:52.512 17:19:03 -- common/autotest_common.sh@1142 -- # return 0 00:06:52.512 17:19:03 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:06:52.512 17:19:03 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:52.512 17:19:03 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:52.512 17:19:03 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:52.512 17:19:03 -- common/autotest_common.sh@10 -- # set +x 00:06:52.512 ************************************ 00:06:52.512 START TEST spdkcli_tcp 00:06:52.512 ************************************ 00:06:52.512 17:19:03 spdkcli_tcp -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:52.512 * Looking for test storage... 00:06:52.512 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:06:52.512 17:19:03 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:06:52.513 17:19:03 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:06:52.513 17:19:03 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:06:52.513 17:19:03 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:06:52.513 17:19:03 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:06:52.513 17:19:03 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:06:52.513 17:19:03 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:06:52.513 17:19:03 spdkcli_tcp -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:52.513 17:19:03 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:52.772 17:19:03 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=2702552 00:06:52.772 17:19:03 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 2702552 00:06:52.772 17:19:03 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:06:52.772 17:19:03 spdkcli_tcp -- common/autotest_common.sh@829 -- # '[' -z 2702552 ']' 00:06:52.772 17:19:03 spdkcli_tcp -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:52.772 17:19:03 spdkcli_tcp -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:52.772 17:19:03 spdkcli_tcp -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:52.772 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:52.772 17:19:03 spdkcli_tcp -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:52.772 17:19:03 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:52.772 [2024-07-15 17:19:03.875275] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
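The spdkcli_tcp test starting here exposes the target's UNIX-domain RPC socket over TCP: socat listens on 127.0.0.1:9998 and forwards to /var/tmp/spdk.sock, and rpc.py is then driven with an address and port instead of a socket path (both invocations appear a few lines below). A minimal version of that bridge, using the same endpoint as this run:

  # Bridge the UNIX RPC socket to TCP and query it over the bridge (sketch).
  socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock &
  socat_pid=$!
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods
  kill "$socat_pid"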
00:06:52.772 [2024-07-15 17:19:03.875343] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2702552 ] 00:06:52.772 [2024-07-15 17:19:03.967249] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:52.772 [2024-07-15 17:19:04.037071] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:52.772 [2024-07-15 17:19:04.037077] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.712 17:19:04 spdkcli_tcp -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:53.712 17:19:04 spdkcli_tcp -- common/autotest_common.sh@862 -- # return 0 00:06:53.712 17:19:04 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=2702719 00:06:53.712 17:19:04 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:06:53.712 17:19:04 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:06:53.712 [ 00:06:53.712 "bdev_malloc_delete", 00:06:53.712 "bdev_malloc_create", 00:06:53.712 "bdev_null_resize", 00:06:53.712 "bdev_null_delete", 00:06:53.712 "bdev_null_create", 00:06:53.712 "bdev_nvme_cuse_unregister", 00:06:53.712 "bdev_nvme_cuse_register", 00:06:53.712 "bdev_opal_new_user", 00:06:53.712 "bdev_opal_set_lock_state", 00:06:53.712 "bdev_opal_delete", 00:06:53.712 "bdev_opal_get_info", 00:06:53.712 "bdev_opal_create", 00:06:53.712 "bdev_nvme_opal_revert", 00:06:53.712 "bdev_nvme_opal_init", 00:06:53.712 "bdev_nvme_send_cmd", 00:06:53.712 "bdev_nvme_get_path_iostat", 00:06:53.712 "bdev_nvme_get_mdns_discovery_info", 00:06:53.712 "bdev_nvme_stop_mdns_discovery", 00:06:53.712 "bdev_nvme_start_mdns_discovery", 00:06:53.712 "bdev_nvme_set_multipath_policy", 00:06:53.712 "bdev_nvme_set_preferred_path", 00:06:53.712 "bdev_nvme_get_io_paths", 00:06:53.712 "bdev_nvme_remove_error_injection", 00:06:53.712 "bdev_nvme_add_error_injection", 00:06:53.712 "bdev_nvme_get_discovery_info", 00:06:53.712 "bdev_nvme_stop_discovery", 00:06:53.712 "bdev_nvme_start_discovery", 00:06:53.712 "bdev_nvme_get_controller_health_info", 00:06:53.712 "bdev_nvme_disable_controller", 00:06:53.712 "bdev_nvme_enable_controller", 00:06:53.712 "bdev_nvme_reset_controller", 00:06:53.712 "bdev_nvme_get_transport_statistics", 00:06:53.712 "bdev_nvme_apply_firmware", 00:06:53.712 "bdev_nvme_detach_controller", 00:06:53.712 "bdev_nvme_get_controllers", 00:06:53.712 "bdev_nvme_attach_controller", 00:06:53.712 "bdev_nvme_set_hotplug", 00:06:53.712 "bdev_nvme_set_options", 00:06:53.712 "bdev_passthru_delete", 00:06:53.712 "bdev_passthru_create", 00:06:53.712 "bdev_lvol_set_parent_bdev", 00:06:53.712 "bdev_lvol_set_parent", 00:06:53.712 "bdev_lvol_check_shallow_copy", 00:06:53.712 "bdev_lvol_start_shallow_copy", 00:06:53.712 "bdev_lvol_grow_lvstore", 00:06:53.712 "bdev_lvol_get_lvols", 00:06:53.712 "bdev_lvol_get_lvstores", 00:06:53.712 "bdev_lvol_delete", 00:06:53.712 "bdev_lvol_set_read_only", 00:06:53.712 "bdev_lvol_resize", 00:06:53.712 "bdev_lvol_decouple_parent", 00:06:53.712 "bdev_lvol_inflate", 00:06:53.712 "bdev_lvol_rename", 00:06:53.712 "bdev_lvol_clone_bdev", 00:06:53.712 "bdev_lvol_clone", 00:06:53.712 "bdev_lvol_snapshot", 00:06:53.712 "bdev_lvol_create", 00:06:53.712 "bdev_lvol_delete_lvstore", 00:06:53.712 "bdev_lvol_rename_lvstore", 00:06:53.712 "bdev_lvol_create_lvstore", 
00:06:53.712 "bdev_raid_set_options", 00:06:53.712 "bdev_raid_remove_base_bdev", 00:06:53.712 "bdev_raid_add_base_bdev", 00:06:53.712 "bdev_raid_delete", 00:06:53.712 "bdev_raid_create", 00:06:53.712 "bdev_raid_get_bdevs", 00:06:53.712 "bdev_error_inject_error", 00:06:53.712 "bdev_error_delete", 00:06:53.712 "bdev_error_create", 00:06:53.712 "bdev_split_delete", 00:06:53.712 "bdev_split_create", 00:06:53.712 "bdev_delay_delete", 00:06:53.712 "bdev_delay_create", 00:06:53.712 "bdev_delay_update_latency", 00:06:53.712 "bdev_zone_block_delete", 00:06:53.712 "bdev_zone_block_create", 00:06:53.712 "blobfs_create", 00:06:53.712 "blobfs_detect", 00:06:53.712 "blobfs_set_cache_size", 00:06:53.712 "bdev_crypto_delete", 00:06:53.712 "bdev_crypto_create", 00:06:53.712 "bdev_compress_delete", 00:06:53.712 "bdev_compress_create", 00:06:53.712 "bdev_compress_get_orphans", 00:06:53.712 "bdev_aio_delete", 00:06:53.712 "bdev_aio_rescan", 00:06:53.712 "bdev_aio_create", 00:06:53.712 "bdev_ftl_set_property", 00:06:53.712 "bdev_ftl_get_properties", 00:06:53.712 "bdev_ftl_get_stats", 00:06:53.712 "bdev_ftl_unmap", 00:06:53.712 "bdev_ftl_unload", 00:06:53.712 "bdev_ftl_delete", 00:06:53.712 "bdev_ftl_load", 00:06:53.712 "bdev_ftl_create", 00:06:53.712 "bdev_virtio_attach_controller", 00:06:53.712 "bdev_virtio_scsi_get_devices", 00:06:53.712 "bdev_virtio_detach_controller", 00:06:53.712 "bdev_virtio_blk_set_hotplug", 00:06:53.712 "bdev_iscsi_delete", 00:06:53.712 "bdev_iscsi_create", 00:06:53.712 "bdev_iscsi_set_options", 00:06:53.712 "accel_error_inject_error", 00:06:53.712 "ioat_scan_accel_module", 00:06:53.712 "dsa_scan_accel_module", 00:06:53.712 "iaa_scan_accel_module", 00:06:53.712 "dpdk_cryptodev_get_driver", 00:06:53.712 "dpdk_cryptodev_set_driver", 00:06:53.712 "dpdk_cryptodev_scan_accel_module", 00:06:53.712 "compressdev_scan_accel_module", 00:06:53.712 "keyring_file_remove_key", 00:06:53.712 "keyring_file_add_key", 00:06:53.712 "keyring_linux_set_options", 00:06:53.712 "iscsi_get_histogram", 00:06:53.712 "iscsi_enable_histogram", 00:06:53.713 "iscsi_set_options", 00:06:53.713 "iscsi_get_auth_groups", 00:06:53.713 "iscsi_auth_group_remove_secret", 00:06:53.713 "iscsi_auth_group_add_secret", 00:06:53.713 "iscsi_delete_auth_group", 00:06:53.713 "iscsi_create_auth_group", 00:06:53.713 "iscsi_set_discovery_auth", 00:06:53.713 "iscsi_get_options", 00:06:53.713 "iscsi_target_node_request_logout", 00:06:53.713 "iscsi_target_node_set_redirect", 00:06:53.713 "iscsi_target_node_set_auth", 00:06:53.713 "iscsi_target_node_add_lun", 00:06:53.713 "iscsi_get_stats", 00:06:53.713 "iscsi_get_connections", 00:06:53.713 "iscsi_portal_group_set_auth", 00:06:53.713 "iscsi_start_portal_group", 00:06:53.713 "iscsi_delete_portal_group", 00:06:53.713 "iscsi_create_portal_group", 00:06:53.713 "iscsi_get_portal_groups", 00:06:53.713 "iscsi_delete_target_node", 00:06:53.713 "iscsi_target_node_remove_pg_ig_maps", 00:06:53.713 "iscsi_target_node_add_pg_ig_maps", 00:06:53.713 "iscsi_create_target_node", 00:06:53.713 "iscsi_get_target_nodes", 00:06:53.713 "iscsi_delete_initiator_group", 00:06:53.713 "iscsi_initiator_group_remove_initiators", 00:06:53.713 "iscsi_initiator_group_add_initiators", 00:06:53.713 "iscsi_create_initiator_group", 00:06:53.713 "iscsi_get_initiator_groups", 00:06:53.713 "nvmf_set_crdt", 00:06:53.713 "nvmf_set_config", 00:06:53.713 "nvmf_set_max_subsystems", 00:06:53.713 "nvmf_stop_mdns_prr", 00:06:53.713 "nvmf_publish_mdns_prr", 00:06:53.713 "nvmf_subsystem_get_listeners", 00:06:53.713 
"nvmf_subsystem_get_qpairs", 00:06:53.713 "nvmf_subsystem_get_controllers", 00:06:53.713 "nvmf_get_stats", 00:06:53.713 "nvmf_get_transports", 00:06:53.713 "nvmf_create_transport", 00:06:53.713 "nvmf_get_targets", 00:06:53.713 "nvmf_delete_target", 00:06:53.713 "nvmf_create_target", 00:06:53.713 "nvmf_subsystem_allow_any_host", 00:06:53.713 "nvmf_subsystem_remove_host", 00:06:53.713 "nvmf_subsystem_add_host", 00:06:53.713 "nvmf_ns_remove_host", 00:06:53.713 "nvmf_ns_add_host", 00:06:53.713 "nvmf_subsystem_remove_ns", 00:06:53.713 "nvmf_subsystem_add_ns", 00:06:53.713 "nvmf_subsystem_listener_set_ana_state", 00:06:53.713 "nvmf_discovery_get_referrals", 00:06:53.713 "nvmf_discovery_remove_referral", 00:06:53.713 "nvmf_discovery_add_referral", 00:06:53.713 "nvmf_subsystem_remove_listener", 00:06:53.713 "nvmf_subsystem_add_listener", 00:06:53.713 "nvmf_delete_subsystem", 00:06:53.713 "nvmf_create_subsystem", 00:06:53.713 "nvmf_get_subsystems", 00:06:53.713 "env_dpdk_get_mem_stats", 00:06:53.713 "nbd_get_disks", 00:06:53.713 "nbd_stop_disk", 00:06:53.713 "nbd_start_disk", 00:06:53.713 "ublk_recover_disk", 00:06:53.713 "ublk_get_disks", 00:06:53.713 "ublk_stop_disk", 00:06:53.713 "ublk_start_disk", 00:06:53.713 "ublk_destroy_target", 00:06:53.713 "ublk_create_target", 00:06:53.713 "virtio_blk_create_transport", 00:06:53.713 "virtio_blk_get_transports", 00:06:53.713 "vhost_controller_set_coalescing", 00:06:53.713 "vhost_get_controllers", 00:06:53.713 "vhost_delete_controller", 00:06:53.713 "vhost_create_blk_controller", 00:06:53.713 "vhost_scsi_controller_remove_target", 00:06:53.713 "vhost_scsi_controller_add_target", 00:06:53.713 "vhost_start_scsi_controller", 00:06:53.713 "vhost_create_scsi_controller", 00:06:53.713 "thread_set_cpumask", 00:06:53.713 "framework_get_governor", 00:06:53.713 "framework_get_scheduler", 00:06:53.713 "framework_set_scheduler", 00:06:53.713 "framework_get_reactors", 00:06:53.713 "thread_get_io_channels", 00:06:53.713 "thread_get_pollers", 00:06:53.713 "thread_get_stats", 00:06:53.713 "framework_monitor_context_switch", 00:06:53.713 "spdk_kill_instance", 00:06:53.713 "log_enable_timestamps", 00:06:53.713 "log_get_flags", 00:06:53.713 "log_clear_flag", 00:06:53.713 "log_set_flag", 00:06:53.713 "log_get_level", 00:06:53.713 "log_set_level", 00:06:53.713 "log_get_print_level", 00:06:53.713 "log_set_print_level", 00:06:53.713 "framework_enable_cpumask_locks", 00:06:53.713 "framework_disable_cpumask_locks", 00:06:53.713 "framework_wait_init", 00:06:53.713 "framework_start_init", 00:06:53.713 "scsi_get_devices", 00:06:53.713 "bdev_get_histogram", 00:06:53.713 "bdev_enable_histogram", 00:06:53.713 "bdev_set_qos_limit", 00:06:53.713 "bdev_set_qd_sampling_period", 00:06:53.713 "bdev_get_bdevs", 00:06:53.713 "bdev_reset_iostat", 00:06:53.713 "bdev_get_iostat", 00:06:53.713 "bdev_examine", 00:06:53.713 "bdev_wait_for_examine", 00:06:53.713 "bdev_set_options", 00:06:53.713 "notify_get_notifications", 00:06:53.713 "notify_get_types", 00:06:53.713 "accel_get_stats", 00:06:53.713 "accel_set_options", 00:06:53.713 "accel_set_driver", 00:06:53.713 "accel_crypto_key_destroy", 00:06:53.713 "accel_crypto_keys_get", 00:06:53.713 "accel_crypto_key_create", 00:06:53.713 "accel_assign_opc", 00:06:53.713 "accel_get_module_info", 00:06:53.713 "accel_get_opc_assignments", 00:06:53.713 "vmd_rescan", 00:06:53.713 "vmd_remove_device", 00:06:53.713 "vmd_enable", 00:06:53.713 "sock_get_default_impl", 00:06:53.713 "sock_set_default_impl", 00:06:53.713 "sock_impl_set_options", 00:06:53.713 
"sock_impl_get_options", 00:06:53.713 "iobuf_get_stats", 00:06:53.713 "iobuf_set_options", 00:06:53.713 "framework_get_pci_devices", 00:06:53.713 "framework_get_config", 00:06:53.713 "framework_get_subsystems", 00:06:53.713 "trace_get_info", 00:06:53.713 "trace_get_tpoint_group_mask", 00:06:53.713 "trace_disable_tpoint_group", 00:06:53.713 "trace_enable_tpoint_group", 00:06:53.713 "trace_clear_tpoint_mask", 00:06:53.713 "trace_set_tpoint_mask", 00:06:53.713 "keyring_get_keys", 00:06:53.713 "spdk_get_version", 00:06:53.713 "rpc_get_methods" 00:06:53.713 ] 00:06:53.713 17:19:04 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:06:53.713 17:19:04 spdkcli_tcp -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:53.713 17:19:04 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:53.713 17:19:04 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:06:53.713 17:19:04 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 2702552 00:06:53.713 17:19:04 spdkcli_tcp -- common/autotest_common.sh@948 -- # '[' -z 2702552 ']' 00:06:53.713 17:19:04 spdkcli_tcp -- common/autotest_common.sh@952 -- # kill -0 2702552 00:06:53.713 17:19:04 spdkcli_tcp -- common/autotest_common.sh@953 -- # uname 00:06:53.713 17:19:04 spdkcli_tcp -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:53.713 17:19:04 spdkcli_tcp -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2702552 00:06:53.974 17:19:05 spdkcli_tcp -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:53.974 17:19:05 spdkcli_tcp -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:53.974 17:19:05 spdkcli_tcp -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2702552' 00:06:53.974 killing process with pid 2702552 00:06:53.974 17:19:05 spdkcli_tcp -- common/autotest_common.sh@967 -- # kill 2702552 00:06:53.974 17:19:05 spdkcli_tcp -- common/autotest_common.sh@972 -- # wait 2702552 00:06:53.974 00:06:53.974 real 0m1.511s 00:06:53.974 user 0m2.829s 00:06:53.974 sys 0m0.459s 00:06:53.974 17:19:05 spdkcli_tcp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:53.974 17:19:05 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:53.974 ************************************ 00:06:53.974 END TEST spdkcli_tcp 00:06:53.974 ************************************ 00:06:53.974 17:19:05 -- common/autotest_common.sh@1142 -- # return 0 00:06:53.974 17:19:05 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:53.974 17:19:05 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:53.974 17:19:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:53.974 17:19:05 -- common/autotest_common.sh@10 -- # set +x 00:06:54.234 ************************************ 00:06:54.234 START TEST dpdk_mem_utility 00:06:54.234 ************************************ 00:06:54.234 17:19:05 dpdk_mem_utility -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:54.234 * Looking for test storage... 
00:06:54.234 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:06:54.235 17:19:05 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:54.235 17:19:05 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=2702927 00:06:54.235 17:19:05 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 2702927 00:06:54.235 17:19:05 dpdk_mem_utility -- common/autotest_common.sh@829 -- # '[' -z 2702927 ']' 00:06:54.235 17:19:05 dpdk_mem_utility -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:54.235 17:19:05 dpdk_mem_utility -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:54.235 17:19:05 dpdk_mem_utility -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:54.235 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:54.235 17:19:05 dpdk_mem_utility -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:54.235 17:19:05 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:54.235 17:19:05 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:54.235 [2024-07-15 17:19:05.509402] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:06:54.235 [2024-07-15 17:19:05.509542] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2702927 ] 00:06:54.495 [2024-07-15 17:19:05.652896] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.495 [2024-07-15 17:19:05.729731] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.754 17:19:05 dpdk_mem_utility -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:54.754 17:19:05 dpdk_mem_utility -- common/autotest_common.sh@862 -- # return 0 00:06:54.755 17:19:05 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:06:54.755 17:19:05 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:06:54.755 17:19:05 dpdk_mem_utility -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:54.755 17:19:05 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:54.755 { 00:06:54.755 "filename": "/tmp/spdk_mem_dump.txt" 00:06:54.755 } 00:06:54.755 17:19:05 dpdk_mem_utility -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:54.755 17:19:05 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:55.015 DPDK memory size 816.000000 MiB in 2 heap(s) 00:06:55.015 2 heaps totaling size 816.000000 MiB 00:06:55.015 size: 814.000000 MiB heap id: 0 00:06:55.015 size: 2.000000 MiB heap id: 1 00:06:55.015 end heaps---------- 00:06:55.015 8 mempools totaling size 598.116089 MiB 00:06:55.015 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:06:55.015 size: 158.602051 MiB name: PDU_data_out_Pool 00:06:55.015 size: 84.521057 MiB name: bdev_io_2702927 00:06:55.015 size: 51.011292 MiB name: evtpool_2702927 00:06:55.015 size: 50.003479 MiB name: 
msgpool_2702927 00:06:55.015 size: 21.763794 MiB name: PDU_Pool 00:06:55.015 size: 19.513306 MiB name: SCSI_TASK_Pool 00:06:55.015 size: 0.026123 MiB name: Session_Pool 00:06:55.015 end mempools------- 00:06:55.015 201 memzones totaling size 4.176453 MiB 00:06:55.015 size: 1.000366 MiB name: RG_ring_0_2702927 00:06:55.015 size: 1.000366 MiB name: RG_ring_1_2702927 00:06:55.015 size: 1.000366 MiB name: RG_ring_4_2702927 00:06:55.015 size: 1.000366 MiB name: RG_ring_5_2702927 00:06:55.015 size: 0.125366 MiB name: RG_ring_2_2702927 00:06:55.015 size: 0.015991 MiB name: RG_ring_3_2702927 00:06:55.015 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:06:55.015 size: 0.000305 MiB name: 0000:cc:01.0_qat 00:06:55.015 size: 0.000305 MiB name: 0000:cc:01.1_qat 00:06:55.015 size: 0.000305 MiB name: 0000:cc:01.2_qat 00:06:55.015 size: 0.000305 MiB name: 0000:cc:01.3_qat 00:06:55.015 size: 0.000305 MiB name: 0000:cc:01.4_qat 00:06:55.015 size: 0.000305 MiB name: 0000:cc:01.5_qat 00:06:55.015 size: 0.000305 MiB name: 0000:cc:01.6_qat 00:06:55.015 size: 0.000305 MiB name: 0000:cc:01.7_qat 00:06:55.015 size: 0.000305 MiB name: 0000:cc:02.0_qat 00:06:55.015 size: 0.000305 MiB name: 0000:cc:02.1_qat 00:06:55.015 size: 0.000305 MiB name: 0000:cc:02.2_qat 00:06:55.015 size: 0.000305 MiB name: 0000:cc:02.3_qat 00:06:55.015 size: 0.000305 MiB name: 0000:cc:02.4_qat 00:06:55.015 size: 0.000305 MiB name: 0000:cc:02.5_qat 00:06:55.015 size: 0.000305 MiB name: 0000:cc:02.6_qat 00:06:55.015 size: 0.000305 MiB name: 0000:cc:02.7_qat 00:06:55.015 size: 0.000305 MiB name: 0000:ce:01.0_qat 00:06:55.015 size: 0.000305 MiB name: 0000:ce:01.1_qat 00:06:55.015 size: 0.000305 MiB name: 0000:ce:01.2_qat 00:06:55.015 size: 0.000305 MiB name: 0000:ce:01.3_qat 00:06:55.015 size: 0.000305 MiB name: 0000:ce:01.4_qat 00:06:55.015 size: 0.000305 MiB name: 0000:ce:01.5_qat 00:06:55.015 size: 0.000305 MiB name: 0000:ce:01.6_qat 00:06:55.015 size: 0.000305 MiB name: 0000:ce:01.7_qat 00:06:55.015 size: 0.000305 MiB name: 0000:ce:02.0_qat 00:06:55.015 size: 0.000305 MiB name: 0000:ce:02.1_qat 00:06:55.015 size: 0.000305 MiB name: 0000:ce:02.2_qat 00:06:55.015 size: 0.000305 MiB name: 0000:ce:02.3_qat 00:06:55.015 size: 0.000305 MiB name: 0000:ce:02.4_qat 00:06:55.015 size: 0.000305 MiB name: 0000:ce:02.5_qat 00:06:55.015 size: 0.000305 MiB name: 0000:ce:02.6_qat 00:06:55.015 size: 0.000305 MiB name: 0000:ce:02.7_qat 00:06:55.015 size: 0.000305 MiB name: 0000:d0:01.0_qat 00:06:55.015 size: 0.000305 MiB name: 0000:d0:01.1_qat 00:06:55.015 size: 0.000305 MiB name: 0000:d0:01.2_qat 00:06:55.015 size: 0.000305 MiB name: 0000:d0:01.3_qat 00:06:55.015 size: 0.000305 MiB name: 0000:d0:01.4_qat 00:06:55.015 size: 0.000305 MiB name: 0000:d0:01.5_qat 00:06:55.015 size: 0.000305 MiB name: 0000:d0:01.6_qat 00:06:55.015 size: 0.000305 MiB name: 0000:d0:01.7_qat 00:06:55.015 size: 0.000305 MiB name: 0000:d0:02.0_qat 00:06:55.015 size: 0.000305 MiB name: 0000:d0:02.1_qat 00:06:55.015 size: 0.000305 MiB name: 0000:d0:02.2_qat 00:06:55.015 size: 0.000305 MiB name: 0000:d0:02.3_qat 00:06:55.015 size: 0.000305 MiB name: 0000:d0:02.4_qat 00:06:55.015 size: 0.000305 MiB name: 0000:d0:02.5_qat 00:06:55.015 size: 0.000305 MiB name: 0000:d0:02.6_qat 00:06:55.015 size: 0.000305 MiB name: 0000:d0:02.7_qat 00:06:55.015 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_0 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_1 00:06:55.015 size: 0.000122 MiB name: rte_compressdev_data_0 00:06:55.015 size: 0.000122 
MiB name: rte_cryptodev_data_2 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_3 00:06:55.015 size: 0.000122 MiB name: rte_compressdev_data_1 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_4 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_5 00:06:55.015 size: 0.000122 MiB name: rte_compressdev_data_2 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_6 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_7 00:06:55.015 size: 0.000122 MiB name: rte_compressdev_data_3 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_8 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_9 00:06:55.015 size: 0.000122 MiB name: rte_compressdev_data_4 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_10 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_11 00:06:55.015 size: 0.000122 MiB name: rte_compressdev_data_5 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_12 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_13 00:06:55.015 size: 0.000122 MiB name: rte_compressdev_data_6 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_14 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_15 00:06:55.015 size: 0.000122 MiB name: rte_compressdev_data_7 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_16 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_17 00:06:55.015 size: 0.000122 MiB name: rte_compressdev_data_8 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_18 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_19 00:06:55.015 size: 0.000122 MiB name: rte_compressdev_data_9 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_20 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_21 00:06:55.015 size: 0.000122 MiB name: rte_compressdev_data_10 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_22 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_23 00:06:55.015 size: 0.000122 MiB name: rte_compressdev_data_11 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_24 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_25 00:06:55.015 size: 0.000122 MiB name: rte_compressdev_data_12 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_26 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_27 00:06:55.015 size: 0.000122 MiB name: rte_compressdev_data_13 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_28 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_29 00:06:55.015 size: 0.000122 MiB name: rte_compressdev_data_14 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_30 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_31 00:06:55.015 size: 0.000122 MiB name: rte_compressdev_data_15 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_32 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_33 00:06:55.015 size: 0.000122 MiB name: rte_compressdev_data_16 00:06:55.015 size: 0.000122 MiB name: rte_cryptodev_data_34 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_35 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_17 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_36 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_37 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_18 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_38 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_39 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_19 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_40 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_41 00:06:55.016 size: 
0.000122 MiB name: rte_compressdev_data_20 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_42 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_43 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_21 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_44 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_45 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_22 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_46 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_47 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_23 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_48 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_49 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_24 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_50 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_51 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_25 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_52 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_53 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_26 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_54 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_55 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_27 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_56 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_57 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_28 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_58 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_59 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_29 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_60 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_61 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_30 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_62 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_63 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_31 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_64 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_65 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_32 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_66 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_67 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_33 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_68 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_69 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_34 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_70 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_71 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_35 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_72 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_73 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_36 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_74 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_75 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_37 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_76 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_77 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_38 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_78 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_79 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_39 00:06:55.016 size: 0.000122 MiB name: 
rte_cryptodev_data_80 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_81 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_40 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_82 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_83 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_41 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_84 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_85 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_42 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_86 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_87 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_43 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_88 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_89 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_44 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_90 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_91 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_45 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_92 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_93 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_46 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_94 00:06:55.016 size: 0.000122 MiB name: rte_cryptodev_data_95 00:06:55.016 size: 0.000122 MiB name: rte_compressdev_data_47 00:06:55.016 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:06:55.016 end memzones------- 00:06:55.016 17:19:06 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:06:55.278 heap id: 0 total size: 814.000000 MiB number of busy elements: 493 number of free elements: 14 00:06:55.278 list of free elements. size: 11.842896 MiB 00:06:55.278 element at address: 0x200000400000 with size: 1.999512 MiB 00:06:55.278 element at address: 0x200018e00000 with size: 0.999878 MiB 00:06:55.278 element at address: 0x200019000000 with size: 0.999878 MiB 00:06:55.278 element at address: 0x200003e00000 with size: 0.996460 MiB 00:06:55.278 element at address: 0x200031c00000 with size: 0.994446 MiB 00:06:55.278 element at address: 0x200007000000 with size: 0.991760 MiB 00:06:55.278 element at address: 0x200013800000 with size: 0.978882 MiB 00:06:55.278 element at address: 0x200019200000 with size: 0.937256 MiB 00:06:55.278 element at address: 0x20001aa00000 with size: 0.583252 MiB 00:06:55.278 element at address: 0x200003a00000 with size: 0.498535 MiB 00:06:55.278 element at address: 0x20000b200000 with size: 0.491272 MiB 00:06:55.278 element at address: 0x200000800000 with size: 0.486145 MiB 00:06:55.278 element at address: 0x200019400000 with size: 0.485840 MiB 00:06:55.278 element at address: 0x200027e00000 with size: 0.399780 MiB 00:06:55.278 list of standard malloc elements. 
size: 199.872253 MiB 00:06:55.278 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:06:55.278 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:06:55.278 element at address: 0x200018efff80 with size: 1.000122 MiB 00:06:55.278 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:06:55.278 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:06:55.278 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:06:55.278 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:06:55.278 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:06:55.278 element at address: 0x20000033b340 with size: 0.004395 MiB 00:06:55.278 element at address: 0x20000033e8c0 with size: 0.004395 MiB 00:06:55.278 element at address: 0x200000341e40 with size: 0.004395 MiB 00:06:55.278 element at address: 0x2000003453c0 with size: 0.004395 MiB 00:06:55.278 element at address: 0x200000348940 with size: 0.004395 MiB 00:06:55.278 element at address: 0x20000034bec0 with size: 0.004395 MiB 00:06:55.278 element at address: 0x20000034f440 with size: 0.004395 MiB 00:06:55.278 element at address: 0x2000003529c0 with size: 0.004395 MiB 00:06:55.278 element at address: 0x200000355f40 with size: 0.004395 MiB 00:06:55.278 element at address: 0x2000003594c0 with size: 0.004395 MiB 00:06:55.278 element at address: 0x20000035ca40 with size: 0.004395 MiB 00:06:55.278 element at address: 0x20000035ffc0 with size: 0.004395 MiB 00:06:55.278 element at address: 0x200000363540 with size: 0.004395 MiB 00:06:55.278 element at address: 0x200000366ac0 with size: 0.004395 MiB 00:06:55.278 element at address: 0x20000036a040 with size: 0.004395 MiB 00:06:55.278 element at address: 0x20000036d5c0 with size: 0.004395 MiB 00:06:55.278 element at address: 0x200000370b40 with size: 0.004395 MiB 00:06:55.278 element at address: 0x2000003740c0 with size: 0.004395 MiB 00:06:55.278 element at address: 0x200000377640 with size: 0.004395 MiB 00:06:55.278 element at address: 0x20000037abc0 with size: 0.004395 MiB 00:06:55.278 element at address: 0x20000037e140 with size: 0.004395 MiB 00:06:55.278 element at address: 0x2000003816c0 with size: 0.004395 MiB 00:06:55.278 element at address: 0x200000384c40 with size: 0.004395 MiB 00:06:55.278 element at address: 0x2000003881c0 with size: 0.004395 MiB 00:06:55.278 element at address: 0x20000038b740 with size: 0.004395 MiB 00:06:55.278 element at address: 0x20000038ecc0 with size: 0.004395 MiB 00:06:55.278 element at address: 0x200000392240 with size: 0.004395 MiB 00:06:55.278 element at address: 0x2000003957c0 with size: 0.004395 MiB 00:06:55.278 element at address: 0x200000398d40 with size: 0.004395 MiB 00:06:55.278 element at address: 0x20000039c2c0 with size: 0.004395 MiB 00:06:55.278 element at address: 0x20000039f840 with size: 0.004395 MiB 00:06:55.278 element at address: 0x2000003a2dc0 with size: 0.004395 MiB 00:06:55.278 element at address: 0x2000003a6340 with size: 0.004395 MiB 00:06:55.278 element at address: 0x2000003a98c0 with size: 0.004395 MiB 00:06:55.278 element at address: 0x2000003ace40 with size: 0.004395 MiB 00:06:55.278 element at address: 0x2000003b03c0 with size: 0.004395 MiB 00:06:55.278 element at address: 0x2000003b3940 with size: 0.004395 MiB 00:06:55.278 element at address: 0x2000003b6ec0 with size: 0.004395 MiB 00:06:55.278 element at address: 0x2000003ba440 with size: 0.004395 MiB 00:06:55.278 element at address: 0x2000003bd9c0 with size: 0.004395 MiB 00:06:55.278 element at address: 0x2000003c0f40 with size: 0.004395 MiB 
00:06:55.278 element at address: 0x2000003c44c0 with size: 0.004395 MiB 00:06:55.278 element at address: 0x2000003c7a40 with size: 0.004395 MiB 00:06:55.278 element at address: 0x2000003cafc0 with size: 0.004395 MiB 00:06:55.278 element at address: 0x2000003ce540 with size: 0.004395 MiB 00:06:55.278 element at address: 0x2000003d1ac0 with size: 0.004395 MiB 00:06:55.278 element at address: 0x2000003d5040 with size: 0.004395 MiB 00:06:55.278 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:06:55.278 element at address: 0x200000339240 with size: 0.004028 MiB 00:06:55.278 element at address: 0x20000033a2c0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x20000033c7c0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x20000033d840 with size: 0.004028 MiB 00:06:55.278 element at address: 0x20000033fd40 with size: 0.004028 MiB 00:06:55.278 element at address: 0x200000340dc0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x2000003432c0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x200000344340 with size: 0.004028 MiB 00:06:55.278 element at address: 0x200000346840 with size: 0.004028 MiB 00:06:55.278 element at address: 0x2000003478c0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x200000349dc0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x20000034ae40 with size: 0.004028 MiB 00:06:55.278 element at address: 0x20000034d340 with size: 0.004028 MiB 00:06:55.278 element at address: 0x20000034e3c0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x2000003508c0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x200000351940 with size: 0.004028 MiB 00:06:55.278 element at address: 0x200000353e40 with size: 0.004028 MiB 00:06:55.278 element at address: 0x200000354ec0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x2000003573c0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x200000358440 with size: 0.004028 MiB 00:06:55.278 element at address: 0x20000035a940 with size: 0.004028 MiB 00:06:55.278 element at address: 0x20000035b9c0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x20000035dec0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x20000035ef40 with size: 0.004028 MiB 00:06:55.278 element at address: 0x200000361440 with size: 0.004028 MiB 00:06:55.278 element at address: 0x2000003624c0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x2000003649c0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x200000365a40 with size: 0.004028 MiB 00:06:55.278 element at address: 0x200000367f40 with size: 0.004028 MiB 00:06:55.278 element at address: 0x200000368fc0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x20000036b4c0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x20000036c540 with size: 0.004028 MiB 00:06:55.278 element at address: 0x20000036ea40 with size: 0.004028 MiB 00:06:55.278 element at address: 0x20000036fac0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x200000371fc0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x200000373040 with size: 0.004028 MiB 00:06:55.278 element at address: 0x200000375540 with size: 0.004028 MiB 00:06:55.278 element at address: 0x2000003765c0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x200000378ac0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x200000379b40 with size: 0.004028 MiB 00:06:55.278 element at address: 0x20000037c040 with size: 0.004028 MiB 00:06:55.278 element at address: 0x20000037d0c0 with size: 0.004028 MiB 00:06:55.278 element at 
address: 0x20000037f5c0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x200000380640 with size: 0.004028 MiB 00:06:55.278 element at address: 0x200000382b40 with size: 0.004028 MiB 00:06:55.278 element at address: 0x200000383bc0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x2000003860c0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x200000387140 with size: 0.004028 MiB 00:06:55.278 element at address: 0x200000389640 with size: 0.004028 MiB 00:06:55.278 element at address: 0x20000038a6c0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x20000038cbc0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x20000038dc40 with size: 0.004028 MiB 00:06:55.278 element at address: 0x200000390140 with size: 0.004028 MiB 00:06:55.278 element at address: 0x2000003911c0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x2000003936c0 with size: 0.004028 MiB 00:06:55.278 element at address: 0x200000394740 with size: 0.004028 MiB 00:06:55.279 element at address: 0x200000396c40 with size: 0.004028 MiB 00:06:55.279 element at address: 0x200000397cc0 with size: 0.004028 MiB 00:06:55.279 element at address: 0x20000039a1c0 with size: 0.004028 MiB 00:06:55.279 element at address: 0x20000039b240 with size: 0.004028 MiB 00:06:55.279 element at address: 0x20000039d740 with size: 0.004028 MiB 00:06:55.279 element at address: 0x20000039e7c0 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003a0cc0 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003a1d40 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003a4240 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003a52c0 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003a77c0 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003a8840 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003aad40 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003abdc0 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003ae2c0 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003af340 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003b1840 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003b28c0 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003b4dc0 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003b5e40 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003b8340 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003b93c0 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003bb8c0 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003bc940 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003bee40 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003bfec0 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003c23c0 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003c3440 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003c5940 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003c69c0 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003c8ec0 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003c9f40 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003cc440 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003cd4c0 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003cf9c0 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003d0a40 
with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003d2f40 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003d3fc0 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:06:55.279 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:06:55.279 element at address: 0x200000200000 with size: 0.000305 MiB 00:06:55.279 element at address: 0x20000020ea00 with size: 0.000305 MiB 00:06:55.279 element at address: 0x200000200140 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000200200 with size: 0.000183 MiB 00:06:55.279 element at address: 0x2000002002c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000200380 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000200440 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000200500 with size: 0.000183 MiB 00:06:55.279 element at address: 0x2000002005c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000200680 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000200740 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000200800 with size: 0.000183 MiB 00:06:55.279 element at address: 0x2000002008c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000200980 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000200a40 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000200b00 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000200bc0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000200c80 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000200d40 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000209000 with size: 0.000183 MiB 00:06:55.279 element at address: 0x2000002090c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000209180 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000209240 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000209300 with size: 0.000183 MiB 00:06:55.279 element at address: 0x2000002093c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000209480 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000209540 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000209600 with size: 0.000183 MiB 00:06:55.279 element at address: 0x2000002096c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000209780 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000209840 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000209900 with size: 0.000183 MiB 00:06:55.279 element at address: 0x2000002099c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000209a80 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000209b40 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000209c00 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000209cc0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000209d80 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000209e40 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000209f00 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000209fc0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020a080 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020a140 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020a200 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020a2c0 with size: 0.000183 MiB 
00:06:55.279 element at address: 0x20000020a380 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020a440 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020a500 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020a5c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020a680 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020a740 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020a800 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020a8c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020a980 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020aa40 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020ab00 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020abc0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020ac80 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020ad40 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020ae00 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020aec0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020af80 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020b040 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020b100 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020b1c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020b280 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020b340 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020b400 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020b4c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020b580 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020b640 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020b700 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020b7c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020b880 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020b940 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020ba00 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020bac0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020bb80 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020bc40 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020bd00 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020bdc0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020be80 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020bf40 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020c000 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020c0c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020c180 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020c240 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020c300 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020c3c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020c480 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020c540 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020c600 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020c6c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020c780 with size: 0.000183 MiB 00:06:55.279 element at 
address: 0x20000020c840 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020c900 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020c9c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020ca80 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020cb40 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020cc00 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020ccc0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020cd80 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020ce40 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020cf00 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020cfc0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020d080 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020d140 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020d200 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020d2c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020d380 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020d440 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020d500 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020d5c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020d680 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020d740 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020d800 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020d8c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020d980 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020da40 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020db00 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020dbc0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020dc80 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020dd40 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020de00 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020dec0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020df80 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020e040 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020e100 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020e1c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020e280 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020e340 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020e400 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020e4c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020e580 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020e640 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020e700 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020e7c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020e880 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020e940 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020eb40 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020ec00 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020ecc0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020ed80 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020ee40 
with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020ef00 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020efc0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020f080 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020f140 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020f200 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020f2c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020f380 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020f440 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020f500 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020f5c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020f680 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020f740 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020f800 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020f8c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020f980 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020fa40 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020fb00 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020fbc0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020fc80 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020fd40 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020fe00 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020fec0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x20000020ff80 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000210040 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000210100 with size: 0.000183 MiB 00:06:55.279 element at address: 0x2000002101c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000210280 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000210340 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000210400 with size: 0.000183 MiB 00:06:55.279 element at address: 0x2000002104c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000210580 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000210640 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000210700 with size: 0.000183 MiB 00:06:55.279 element at address: 0x2000002107c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000210880 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000210940 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000210a00 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000210c00 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000214ec0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000235180 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000235240 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000235300 with size: 0.000183 MiB 00:06:55.279 element at address: 0x2000002353c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000235480 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000235540 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000235600 with size: 0.000183 MiB 00:06:55.279 element at address: 0x2000002356c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000235780 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000235840 with size: 0.000183 MiB 
00:06:55.279 element at address: 0x200000235900 with size: 0.000183 MiB 00:06:55.279 element at address: 0x2000002359c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000235a80 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000235b40 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000235c00 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000235cc0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000235d80 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000235e40 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000235f00 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000236100 with size: 0.000183 MiB 00:06:55.279 element at address: 0x2000002361c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000236280 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000236340 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000236400 with size: 0.000183 MiB 00:06:55.279 element at address: 0x2000002364c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000236580 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000236640 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000236700 with size: 0.000183 MiB 00:06:55.279 element at address: 0x2000002367c0 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000236880 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000236940 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000236a00 with size: 0.000183 MiB 00:06:55.279 element at address: 0x200000236ac0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200000236b80 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200000236c40 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200000236d00 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200000338f00 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200000338fc0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x20000033c540 with size: 0.000183 MiB 00:06:55.280 element at address: 0x20000033fac0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200000343040 with size: 0.000183 MiB 00:06:55.280 element at address: 0x2000003465c0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200000349b40 with size: 0.000183 MiB 00:06:55.280 element at address: 0x20000034d0c0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200000350640 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200000353bc0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200000357140 with size: 0.000183 MiB 00:06:55.280 element at address: 0x20000035a6c0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x20000035dc40 with size: 0.000183 MiB 00:06:55.280 element at address: 0x2000003611c0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200000364740 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200000367cc0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x20000036b240 with size: 0.000183 MiB 00:06:55.280 element at address: 0x20000036e7c0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200000371d40 with size: 0.000183 MiB 00:06:55.280 element at address: 0x2000003752c0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200000378840 with size: 0.000183 MiB 00:06:55.280 element at address: 0x20000037bdc0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x20000037f340 with size: 0.000183 MiB 00:06:55.280 element at 
address: 0x2000003828c0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200000385e40 with size: 0.000183 MiB 00:06:55.280 element at address: 0x2000003893c0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x20000038c940 with size: 0.000183 MiB 00:06:55.280 element at address: 0x20000038fec0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200000393440 with size: 0.000183 MiB 00:06:55.280 element at address: 0x2000003969c0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200000399f40 with size: 0.000183 MiB 00:06:55.280 element at address: 0x20000039d4c0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x2000003a0a40 with size: 0.000183 MiB 00:06:55.280 element at address: 0x2000003a3fc0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x2000003a7540 with size: 0.000183 MiB 00:06:55.280 element at address: 0x2000003aaac0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x2000003ae040 with size: 0.000183 MiB 00:06:55.280 element at address: 0x2000003b15c0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x2000003b4b40 with size: 0.000183 MiB 00:06:55.280 element at address: 0x2000003b80c0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x2000003bb640 with size: 0.000183 MiB 00:06:55.280 element at address: 0x2000003bebc0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x2000003c2140 with size: 0.000183 MiB 00:06:55.280 element at address: 0x2000003c56c0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x2000003c8c40 with size: 0.000183 MiB 00:06:55.280 element at address: 0x2000003cc1c0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x2000003cf740 with size: 0.000183 MiB 00:06:55.280 element at address: 0x2000003d2cc0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x2000003d6840 with size: 0.000183 MiB 00:06:55.280 element at address: 0x20000087c740 with size: 0.000183 MiB 00:06:55.280 element at address: 0x20000087c800 with size: 0.000183 MiB 00:06:55.280 element at address: 0x20000087c8c0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x20000087c980 with size: 0.000183 MiB 00:06:55.280 element at address: 0x20000087ca40 with size: 0.000183 MiB 00:06:55.280 element at address: 0x20000087cb00 with size: 0.000183 MiB 00:06:55.280 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:06:55.280 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:06:55.280 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:06:55.280 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e66580 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e66640 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6d240 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6d440 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6d500 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6d5c0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6d680 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6d740 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6d800 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6d8c0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6d980 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6da40 
with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6db00 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6dbc0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6dc80 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6dd40 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6de00 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6dec0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6df80 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6e040 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6e100 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6e1c0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6e280 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6e340 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6e400 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6e4c0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6e580 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6e640 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6e700 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6e7c0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6e880 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6e940 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6ea00 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6eac0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6eb80 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6ec40 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6ed00 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6edc0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6ee80 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6fc00 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:06:55.280 element at address: 0x200027e6ff00 with size: 0.000183 MiB 
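The listing above is the raw per-element free-list dump that the dpdk_mem_utility test captures from the target's DPDK heap before shutting the app down; the memzone-associated elements follow below. As a minimal sketch (the file name is an assumption, not part of the test), the repetitive "element at address" entries from such a saved capture can be summarized instead of read by hand:

  # hypothetical helper: count the dumped elements and total their size from a saved log
  grep -o 'with size: [0-9.]* MiB' dpdk_mem_dump.log \
    | awk '{ n++; sum += $3 } END { printf "%d elements, %.6f MiB total\n", n, sum }'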
00:06:55.280 list of memzone associated elements. size: 602.284851 MiB 00:06:55.280 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:06:55.280 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:06:55.280 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:06:55.280 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:06:55.280 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:06:55.280 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_2702927_0 00:06:55.280 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:06:55.280 associated memzone info: size: 48.002930 MiB name: MP_evtpool_2702927_0 00:06:55.280 element at address: 0x200003fff380 with size: 48.003052 MiB 00:06:55.280 associated memzone info: size: 48.002930 MiB name: MP_msgpool_2702927_0 00:06:55.280 element at address: 0x2000195be940 with size: 20.255554 MiB 00:06:55.280 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:06:55.280 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:06:55.280 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:06:55.280 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:06:55.280 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_2702927 00:06:55.280 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:06:55.280 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_2702927 00:06:55.280 element at address: 0x200000236dc0 with size: 1.008118 MiB 00:06:55.280 associated memzone info: size: 1.007996 MiB name: MP_evtpool_2702927 00:06:55.280 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:06:55.280 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:06:55.280 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:06:55.280 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:06:55.280 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:06:55.280 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:06:55.280 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:06:55.280 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:06:55.280 element at address: 0x200003eff180 with size: 1.000488 MiB 00:06:55.280 associated memzone info: size: 1.000366 MiB name: RG_ring_0_2702927 00:06:55.280 element at address: 0x200003affc00 with size: 1.000488 MiB 00:06:55.280 associated memzone info: size: 1.000366 MiB name: RG_ring_1_2702927 00:06:55.280 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:06:55.280 associated memzone info: size: 1.000366 MiB name: RG_ring_4_2702927 00:06:55.280 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:06:55.280 associated memzone info: size: 1.000366 MiB name: RG_ring_5_2702927 00:06:55.280 element at address: 0x200003a7fa00 with size: 0.500488 MiB 00:06:55.280 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_2702927 00:06:55.280 element at address: 0x20000b27dc40 with size: 0.500488 MiB 00:06:55.280 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:06:55.280 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:06:55.280 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:06:55.280 element at address: 0x20001947c600 with size: 0.250488 MiB 00:06:55.280 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:06:55.280 element at address: 
0x200000214f80 with size: 0.125488 MiB 00:06:55.280 associated memzone info: size: 0.125366 MiB name: RG_ring_2_2702927 00:06:55.280 element at address: 0x200000200e00 with size: 0.031738 MiB 00:06:55.280 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:06:55.280 element at address: 0x200027e66700 with size: 0.023743 MiB 00:06:55.280 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:06:55.280 element at address: 0x200000210cc0 with size: 0.016113 MiB 00:06:55.280 associated memzone info: size: 0.015991 MiB name: RG_ring_3_2702927 00:06:55.280 element at address: 0x200027e6c840 with size: 0.002441 MiB 00:06:55.280 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:06:55.280 element at address: 0x2000003d6300 with size: 0.001282 MiB 00:06:55.280 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:06:55.280 element at address: 0x2000003d6a40 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.0_qat 00:06:55.280 element at address: 0x2000003d2d80 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.1_qat 00:06:55.280 element at address: 0x2000003cf800 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.2_qat 00:06:55.280 element at address: 0x2000003cc280 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.3_qat 00:06:55.280 element at address: 0x2000003c8d00 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.4_qat 00:06:55.280 element at address: 0x2000003c5780 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.5_qat 00:06:55.280 element at address: 0x2000003c2200 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.6_qat 00:06:55.280 element at address: 0x2000003bec80 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.7_qat 00:06:55.280 element at address: 0x2000003bb700 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.0_qat 00:06:55.280 element at address: 0x2000003b8180 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.1_qat 00:06:55.280 element at address: 0x2000003b4c00 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.2_qat 00:06:55.280 element at address: 0x2000003b1680 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.3_qat 00:06:55.280 element at address: 0x2000003ae100 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.4_qat 00:06:55.280 element at address: 0x2000003aab80 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.5_qat 00:06:55.280 element at address: 0x2000003a7600 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.6_qat 00:06:55.280 element at address: 0x2000003a4080 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.7_qat 00:06:55.280 element at address: 0x2000003a0b00 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.0_qat 00:06:55.280 element at address: 0x20000039d580 with size: 0.000427 MiB 
00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.1_qat 00:06:55.280 element at address: 0x20000039a000 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.2_qat 00:06:55.280 element at address: 0x200000396a80 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.3_qat 00:06:55.280 element at address: 0x200000393500 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.4_qat 00:06:55.280 element at address: 0x20000038ff80 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.5_qat 00:06:55.280 element at address: 0x20000038ca00 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.6_qat 00:06:55.280 element at address: 0x200000389480 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.7_qat 00:06:55.280 element at address: 0x200000385f00 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.0_qat 00:06:55.280 element at address: 0x200000382980 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.1_qat 00:06:55.280 element at address: 0x20000037f400 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.2_qat 00:06:55.280 element at address: 0x20000037be80 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.3_qat 00:06:55.280 element at address: 0x200000378900 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.4_qat 00:06:55.280 element at address: 0x200000375380 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.5_qat 00:06:55.280 element at address: 0x200000371e00 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.6_qat 00:06:55.280 element at address: 0x20000036e880 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.7_qat 00:06:55.280 element at address: 0x20000036b300 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.0_qat 00:06:55.280 element at address: 0x200000367d80 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.1_qat 00:06:55.280 element at address: 0x200000364800 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.2_qat 00:06:55.280 element at address: 0x200000361280 with size: 0.000427 MiB 00:06:55.280 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.3_qat 00:06:55.281 element at address: 0x20000035dd00 with size: 0.000427 MiB 00:06:55.281 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.4_qat 00:06:55.281 element at address: 0x20000035a780 with size: 0.000427 MiB 00:06:55.281 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.5_qat 00:06:55.281 element at address: 0x200000357200 with size: 0.000427 MiB 00:06:55.281 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.6_qat 00:06:55.281 element at address: 0x200000353c80 with size: 0.000427 MiB 00:06:55.281 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.7_qat 00:06:55.281 element at address: 0x200000350700 with size: 0.000427 MiB 00:06:55.281 associated memzone info: size: 0.000305 MiB 
name: 0000:d0:02.0_qat 00:06:55.281 element at address: 0x20000034d180 with size: 0.000427 MiB 00:06:55.281 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.1_qat 00:06:55.281 element at address: 0x200000349c00 with size: 0.000427 MiB 00:06:55.281 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.2_qat 00:06:55.281 element at address: 0x200000346680 with size: 0.000427 MiB 00:06:55.281 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.3_qat 00:06:55.281 element at address: 0x200000343100 with size: 0.000427 MiB 00:06:55.281 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.4_qat 00:06:55.281 element at address: 0x20000033fb80 with size: 0.000427 MiB 00:06:55.281 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.5_qat 00:06:55.281 element at address: 0x20000033c600 with size: 0.000427 MiB 00:06:55.281 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.6_qat 00:06:55.281 element at address: 0x200000339080 with size: 0.000427 MiB 00:06:55.281 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.7_qat 00:06:55.281 element at address: 0x2000003d6900 with size: 0.000305 MiB 00:06:55.281 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:06:55.281 element at address: 0x200000235fc0 with size: 0.000305 MiB 00:06:55.281 associated memzone info: size: 0.000183 MiB name: MP_msgpool_2702927 00:06:55.281 element at address: 0x200000210ac0 with size: 0.000305 MiB 00:06:55.281 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_2702927 00:06:55.281 element at address: 0x200027e6d300 with size: 0.000305 MiB 00:06:55.281 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:06:55.281 element at address: 0x2000003d6240 with size: 0.000183 MiB 00:06:55.281 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:06:55.281 17:19:06 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:06:55.281 17:19:06 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 2702927 00:06:55.281 17:19:06 dpdk_mem_utility -- common/autotest_common.sh@948 -- # '[' -z 2702927 ']' 00:06:55.281 17:19:06 dpdk_mem_utility -- common/autotest_common.sh@952 -- # kill -0 2702927 00:06:55.281 17:19:06 dpdk_mem_utility -- common/autotest_common.sh@953 -- # uname 00:06:55.281 17:19:06 dpdk_mem_utility -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:55.281 17:19:06 dpdk_mem_utility -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2702927 00:06:55.281 17:19:06 dpdk_mem_utility -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:55.281 17:19:06 dpdk_mem_utility -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:55.281 17:19:06 dpdk_mem_utility -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2702927' 00:06:55.281 killing process with pid 2702927 00:06:55.281 17:19:06 dpdk_mem_utility -- common/autotest_common.sh@967 -- # kill 2702927 00:06:55.281 17:19:06 dpdk_mem_utility -- common/autotest_common.sh@972 -- # wait 2702927 00:06:55.541 00:06:55.541 real 0m1.390s 00:06:55.541 user 0m1.941s 00:06:55.541 sys 0m0.483s 00:06:55.541 17:19:06 dpdk_mem_utility -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:55.541 17:19:06 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:55.541 ************************************ 00:06:55.541 END TEST dpdk_mem_utility 00:06:55.541 ************************************ 00:06:55.541 17:19:06 -- 
common/autotest_common.sh@1142 -- # return 0 00:06:55.541 17:19:06 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:06:55.541 17:19:06 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:55.541 17:19:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:55.541 17:19:06 -- common/autotest_common.sh@10 -- # set +x 00:06:55.541 ************************************ 00:06:55.541 START TEST event 00:06:55.541 ************************************ 00:06:55.541 17:19:06 event -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:06:55.800 * Looking for test storage... 00:06:55.800 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event 00:06:55.800 17:19:06 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:06:55.800 17:19:06 event -- bdev/nbd_common.sh@6 -- # set -e 00:06:55.801 17:19:06 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:55.801 17:19:06 event -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:06:55.801 17:19:06 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:55.801 17:19:06 event -- common/autotest_common.sh@10 -- # set +x 00:06:55.801 ************************************ 00:06:55.801 START TEST event_perf 00:06:55.801 ************************************ 00:06:55.801 17:19:06 event.event_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:55.801 Running I/O for 1 seconds...[2024-07-15 17:19:06.924164] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:06:55.801 [2024-07-15 17:19:06.924244] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2703281 ] 00:06:55.801 [2024-07-15 17:19:07.019583] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:55.801 [2024-07-15 17:19:07.090095] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:55.801 [2024-07-15 17:19:07.090240] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:06:55.801 [2024-07-15 17:19:07.090384] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.801 Running I/O for 1 seconds...[2024-07-15 17:19:07.090385] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:06:57.179 00:06:57.179 lcore 0: 77509 00:06:57.179 lcore 1: 77512 00:06:57.179 lcore 2: 77516 00:06:57.179 lcore 3: 77513 00:06:57.179 done. 
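For reference, the event_perf run that ends above reports a per-lcore counter of roughly 77.5 k events on each of the four reactors in its one-second window. A minimal sketch of re-running it by hand with the same binary and arguments the harness used (workspace path taken from this log):

  cd /var/jenkins/workspace/crypto-phy-autotest/spdk
  # -m 0xF selects four cores (0-3), -t 1 runs the measurement for one second
  ./test/event/event_perf/event_perf -m 0xF -t 1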
00:06:57.179 00:06:57.179 real 0m1.245s 00:06:57.179 user 0m4.134s 00:06:57.179 sys 0m0.105s 00:06:57.179 17:19:08 event.event_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:57.179 17:19:08 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:06:57.179 ************************************ 00:06:57.179 END TEST event_perf 00:06:57.179 ************************************ 00:06:57.179 17:19:08 event -- common/autotest_common.sh@1142 -- # return 0 00:06:57.179 17:19:08 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:57.179 17:19:08 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:06:57.179 17:19:08 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:57.179 17:19:08 event -- common/autotest_common.sh@10 -- # set +x 00:06:57.179 ************************************ 00:06:57.179 START TEST event_reactor 00:06:57.179 ************************************ 00:06:57.179 17:19:08 event.event_reactor -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:57.179 [2024-07-15 17:19:08.248056] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:06:57.179 [2024-07-15 17:19:08.248147] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2703409 ] 00:06:57.179 [2024-07-15 17:19:08.336677] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:57.179 [2024-07-15 17:19:08.400477] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.640 test_start 00:06:58.640 oneshot 00:06:58.640 tick 100 00:06:58.640 tick 100 00:06:58.640 tick 250 00:06:58.640 tick 100 00:06:58.640 tick 100 00:06:58.640 tick 250 00:06:58.640 tick 100 00:06:58.640 tick 500 00:06:58.640 tick 100 00:06:58.640 tick 100 00:06:58.640 tick 250 00:06:58.640 tick 100 00:06:58.640 tick 100 00:06:58.640 test_end 00:06:58.640 00:06:58.640 real 0m1.231s 00:06:58.640 user 0m1.138s 00:06:58.640 sys 0m0.089s 00:06:58.640 17:19:09 event.event_reactor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:58.640 17:19:09 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:06:58.640 ************************************ 00:06:58.640 END TEST event_reactor 00:06:58.640 ************************************ 00:06:58.640 17:19:09 event -- common/autotest_common.sh@1142 -- # return 0 00:06:58.640 17:19:09 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:58.640 17:19:09 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:06:58.640 17:19:09 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:58.640 17:19:09 event -- common/autotest_common.sh@10 -- # set +x 00:06:58.640 ************************************ 00:06:58.640 START TEST event_reactor_perf 00:06:58.640 ************************************ 00:06:58.640 17:19:09 event.event_reactor_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:58.640 [2024-07-15 17:19:09.554086] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:06:58.640 [2024-07-15 17:19:09.554166] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2703647 ] 00:06:58.640 [2024-07-15 17:19:09.644412] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.640 [2024-07-15 17:19:09.712101] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:59.580 test_start 00:06:59.580 test_end 00:06:59.580 Performance: 401606 events per second 00:06:59.580 00:06:59.580 real 0m1.236s 00:06:59.580 user 0m1.139s 00:06:59.580 sys 0m0.093s 00:06:59.580 17:19:10 event.event_reactor_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:59.580 17:19:10 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:06:59.580 ************************************ 00:06:59.580 END TEST event_reactor_perf 00:06:59.580 ************************************ 00:06:59.580 17:19:10 event -- common/autotest_common.sh@1142 -- # return 0 00:06:59.580 17:19:10 event -- event/event.sh@49 -- # uname -s 00:06:59.580 17:19:10 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:06:59.580 17:19:10 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:59.580 17:19:10 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:59.580 17:19:10 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:59.580 17:19:10 event -- common/autotest_common.sh@10 -- # set +x 00:06:59.580 ************************************ 00:06:59.580 START TEST event_scheduler 00:06:59.580 ************************************ 00:06:59.580 17:19:10 event.event_scheduler -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:59.840 * Looking for test storage... 00:06:59.840 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler 00:06:59.840 17:19:10 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:06:59.840 17:19:10 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=2703990 00:06:59.840 17:19:10 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:06:59.840 17:19:10 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 2703990 00:06:59.840 17:19:10 event.event_scheduler -- common/autotest_common.sh@829 -- # '[' -z 2703990 ']' 00:06:59.840 17:19:10 event.event_scheduler -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:59.840 17:19:10 event.event_scheduler -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:59.840 17:19:10 event.event_scheduler -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:59.840 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
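Before the scheduler app above finishes coming up, note that the two reactor tests just completed are single-core: reactor -t 1 drives the oneshot and timed tick events printed earlier, and reactor_perf -t 1 reported about 401 k events per second on core 0. A minimal re-run sketch with the same arguments (paths assumed from this workspace):

  cd /var/jenkins/workspace/crypto-phy-autotest/spdk
  ./test/event/reactor/reactor -t 1              # oneshot plus 100/250/500 tick events on one reactor
  ./test/event/reactor_perf/reactor_perf -t 1    # prints events-per-second for core 0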
00:06:59.840 17:19:10 event.event_scheduler -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:59.840 17:19:10 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:59.840 17:19:10 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:06:59.840 [2024-07-15 17:19:11.040206] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:06:59.840 [2024-07-15 17:19:11.040337] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2703990 ] 00:07:00.100 [2024-07-15 17:19:11.242975] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:00.360 [2024-07-15 17:19:11.411982] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.360 [2024-07-15 17:19:11.412149] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:00.360 [2024-07-15 17:19:11.412379] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:00.360 [2024-07-15 17:19:11.412530] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:00.620 17:19:11 event.event_scheduler -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:00.620 17:19:11 event.event_scheduler -- common/autotest_common.sh@862 -- # return 0 00:07:00.620 17:19:11 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:07:00.620 17:19:11 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:00.620 17:19:11 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:00.620 [2024-07-15 17:19:11.855065] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:07:00.620 [2024-07-15 17:19:11.855111] scheduler_dynamic.c: 270:init: *NOTICE*: Unable to initialize dpdk governor 00:07:00.620 [2024-07-15 17:19:11.855137] scheduler_dynamic.c: 416:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:07:00.620 [2024-07-15 17:19:11.855153] scheduler_dynamic.c: 418:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:07:00.620 [2024-07-15 17:19:11.855169] scheduler_dynamic.c: 420:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:07:00.620 17:19:11 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:00.620 17:19:11 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:07:00.620 17:19:11 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:00.620 17:19:11 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:00.881 [2024-07-15 17:19:11.970338] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
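Because the scheduler app was launched with --wait-for-rpc, the test can select the dynamic scheduler before subsystem initialization; the governor notice above appears to be only about SMT siblings missing from the 0xF mask, after which the logged load/core/busy limits are applied. A minimal sketch of the same RPC sequence against the app's socket (socket path as shown in this log):

  cd /var/jenkins/workspace/crypto-phy-autotest/spdk
  ./scripts/rpc.py -s /var/tmp/spdk.sock framework_set_scheduler dynamic
  ./scripts/rpc.py -s /var/tmp/spdk.sock framework_start_init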
00:07:00.881 17:19:11 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:00.881 17:19:11 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:07:00.881 17:19:11 event.event_scheduler -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:00.881 17:19:11 event.event_scheduler -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:00.881 17:19:11 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:00.881 ************************************ 00:07:00.881 START TEST scheduler_create_thread 00:07:00.881 ************************************ 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1123 -- # scheduler_create_thread 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:00.881 2 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:00.881 3 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:00.881 4 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:00.881 5 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:00.881 6 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:00.881 7 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:00.881 8 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:00.881 9 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:00.881 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:01.451 10 00:07:01.451 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:01.451 17:19:12 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:07:01.451 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:01.451 17:19:12 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:02.833 17:19:13 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:02.833 17:19:13 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:07:02.833 17:19:13 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:07:02.833 17:19:13 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:02.833 17:19:13 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:03.774 17:19:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:03.774 17:19:14 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin 
scheduler_plugin scheduler_thread_create -n deleted -a 100 00:07:03.774 17:19:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:03.774 17:19:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:04.343 17:19:15 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:04.343 17:19:15 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:07:04.343 17:19:15 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:07:04.343 17:19:15 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:04.343 17:19:15 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:05.283 17:19:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:05.283 00:07:05.283 real 0m4.227s 00:07:05.283 user 0m0.026s 00:07:05.283 sys 0m0.005s 00:07:05.283 17:19:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:05.283 17:19:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:05.283 ************************************ 00:07:05.283 END TEST scheduler_create_thread 00:07:05.283 ************************************ 00:07:05.283 17:19:16 event.event_scheduler -- common/autotest_common.sh@1142 -- # return 0 00:07:05.283 17:19:16 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:07:05.283 17:19:16 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 2703990 00:07:05.283 17:19:16 event.event_scheduler -- common/autotest_common.sh@948 -- # '[' -z 2703990 ']' 00:07:05.283 17:19:16 event.event_scheduler -- common/autotest_common.sh@952 -- # kill -0 2703990 00:07:05.283 17:19:16 event.event_scheduler -- common/autotest_common.sh@953 -- # uname 00:07:05.283 17:19:16 event.event_scheduler -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:05.283 17:19:16 event.event_scheduler -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2703990 00:07:05.283 17:19:16 event.event_scheduler -- common/autotest_common.sh@954 -- # process_name=reactor_2 00:07:05.283 17:19:16 event.event_scheduler -- common/autotest_common.sh@958 -- # '[' reactor_2 = sudo ']' 00:07:05.283 17:19:16 event.event_scheduler -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2703990' 00:07:05.283 killing process with pid 2703990 00:07:05.283 17:19:16 event.event_scheduler -- common/autotest_common.sh@967 -- # kill 2703990 00:07:05.283 17:19:16 event.event_scheduler -- common/autotest_common.sh@972 -- # wait 2703990 00:07:05.283 [2024-07-15 17:19:16.519058] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
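The scheduler_create_thread sub-test above works entirely through the test's scheduler_plugin RPCs: several active- and idle-pinned threads are created with different masks and activity percentages, thread 11 (half_active) is set to 50 % active, and thread 12 ("deleted") is created and then removed before the app is killed. A minimal sketch of issuing those plugin calls directly (assumption: the plugin module lives under test/event/scheduler and must be importable by rpc.py; thread ids are the ones seen in this run):

  cd /var/jenkins/workspace/crypto-phy-autotest/spdk
  export PYTHONPATH=$PYTHONPATH:./test/event/scheduler   # assumed location of the scheduler plugin
  ./scripts/rpc.py --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100
  ./scripts/rpc.py --plugin scheduler_plugin scheduler_thread_set_active 11 50
  ./scripts/rpc.py --plugin scheduler_plugin scheduler_thread_delete 12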
00:07:05.543 00:07:05.543 real 0m5.988s 00:07:05.543 user 0m12.376s 00:07:05.543 sys 0m0.504s 00:07:05.543 17:19:16 event.event_scheduler -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:05.543 17:19:16 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:05.543 ************************************ 00:07:05.543 END TEST event_scheduler 00:07:05.543 ************************************ 00:07:05.804 17:19:16 event -- common/autotest_common.sh@1142 -- # return 0 00:07:05.804 17:19:16 event -- event/event.sh@51 -- # modprobe -n nbd 00:07:05.804 17:19:16 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:07:05.804 17:19:16 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:05.804 17:19:16 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:05.804 17:19:16 event -- common/autotest_common.sh@10 -- # set +x 00:07:05.804 ************************************ 00:07:05.804 START TEST app_repeat 00:07:05.804 ************************************ 00:07:05.804 17:19:16 event.app_repeat -- common/autotest_common.sh@1123 -- # app_repeat_test 00:07:05.804 17:19:16 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:05.804 17:19:16 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:05.804 17:19:16 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:07:05.804 17:19:16 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:05.804 17:19:16 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:07:05.804 17:19:16 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:07:05.804 17:19:16 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:07:05.804 17:19:16 event.app_repeat -- event/event.sh@19 -- # repeat_pid=2704984 00:07:05.804 17:19:16 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:07:05.804 17:19:16 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:07:05.804 17:19:16 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 2704984' 00:07:05.804 Process app_repeat pid: 2704984 00:07:05.804 17:19:16 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:05.804 17:19:16 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:07:05.804 spdk_app_start Round 0 00:07:05.804 17:19:16 event.app_repeat -- event/event.sh@25 -- # waitforlisten 2704984 /var/tmp/spdk-nbd.sock 00:07:05.804 17:19:16 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 2704984 ']' 00:07:05.804 17:19:16 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:05.804 17:19:16 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:05.804 17:19:16 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:05.804 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:05.804 17:19:16 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:05.804 17:19:16 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:05.804 [2024-07-15 17:19:16.967878] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:07:05.804 [2024-07-15 17:19:16.967936] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2704984 ] 00:07:05.804 [2024-07-15 17:19:17.061446] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:06.064 [2024-07-15 17:19:17.136598] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:06.064 [2024-07-15 17:19:17.136603] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.634 17:19:17 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:06.634 17:19:17 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:06.634 17:19:17 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:06.893 Malloc0 00:07:06.893 17:19:18 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:07.153 Malloc1 00:07:07.153 17:19:18 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:07.153 17:19:18 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:07.153 17:19:18 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:07.153 17:19:18 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:07.153 17:19:18 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:07.153 17:19:18 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:07.153 17:19:18 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:07.153 17:19:18 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:07.153 17:19:18 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:07.153 17:19:18 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:07.153 17:19:18 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:07.153 17:19:18 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:07.153 17:19:18 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:07.153 17:19:18 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:07.153 17:19:18 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:07.153 17:19:18 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:07.413 /dev/nbd0 00:07:07.413 17:19:18 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:07.413 17:19:18 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:07.413 17:19:18 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:07.413 17:19:18 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:07.414 17:19:18 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:07.414 17:19:18 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:07.414 17:19:18 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 
/proc/partitions 00:07:07.414 17:19:18 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:07.414 17:19:18 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:07.414 17:19:18 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:07.414 17:19:18 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:07.414 1+0 records in 00:07:07.414 1+0 records out 00:07:07.414 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000284856 s, 14.4 MB/s 00:07:07.414 17:19:18 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:07.414 17:19:18 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:07.414 17:19:18 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:07.414 17:19:18 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:07.414 17:19:18 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:07.414 17:19:18 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:07.414 17:19:18 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:07.414 17:19:18 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:07.414 /dev/nbd1 00:07:07.414 17:19:18 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:07.414 17:19:18 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:07.414 17:19:18 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:07.414 17:19:18 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:07.414 17:19:18 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:07.414 17:19:18 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:07.414 17:19:18 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:07.414 17:19:18 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:07.414 17:19:18 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:07.414 17:19:18 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:07.414 17:19:18 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:07.414 1+0 records in 00:07:07.414 1+0 records out 00:07:07.414 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000316481 s, 12.9 MB/s 00:07:07.414 17:19:18 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:07.414 17:19:18 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:07.414 17:19:18 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:07.674 17:19:18 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:07.674 17:19:18 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:07.674 17:19:18 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:07.674 17:19:18 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:07.674 17:19:18 
event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:07.674 17:19:18 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:07.674 17:19:18 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:07.674 17:19:18 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:07.674 { 00:07:07.674 "nbd_device": "/dev/nbd0", 00:07:07.674 "bdev_name": "Malloc0" 00:07:07.674 }, 00:07:07.674 { 00:07:07.674 "nbd_device": "/dev/nbd1", 00:07:07.674 "bdev_name": "Malloc1" 00:07:07.674 } 00:07:07.674 ]' 00:07:07.674 17:19:18 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:07.674 { 00:07:07.674 "nbd_device": "/dev/nbd0", 00:07:07.674 "bdev_name": "Malloc0" 00:07:07.674 }, 00:07:07.674 { 00:07:07.674 "nbd_device": "/dev/nbd1", 00:07:07.674 "bdev_name": "Malloc1" 00:07:07.674 } 00:07:07.674 ]' 00:07:07.674 17:19:18 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:07.936 17:19:18 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:07.936 /dev/nbd1' 00:07:07.936 17:19:18 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:07.936 /dev/nbd1' 00:07:07.936 17:19:18 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:07.936 17:19:18 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:07.936 17:19:18 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:07.936 17:19:18 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:07.936 17:19:18 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:07.936 17:19:18 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:07.936 17:19:18 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:07.936 17:19:18 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:07.936 17:19:18 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:07.936 17:19:18 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:07.936 17:19:18 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:07.936 17:19:18 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:07.936 256+0 records in 00:07:07.936 256+0 records out 00:07:07.936 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0117518 s, 89.2 MB/s 00:07:07.936 17:19:18 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:07.936 17:19:18 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:07.936 256+0 records in 00:07:07.936 256+0 records out 00:07:07.936 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0151156 s, 69.4 MB/s 00:07:07.936 17:19:19 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:07.936 17:19:19 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:07.936 256+0 records in 00:07:07.936 256+0 records out 00:07:07.936 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0163342 s, 64.2 MB/s 00:07:07.936 17:19:19 event.app_repeat 
-- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:07.936 17:19:19 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:07.936 17:19:19 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:07.936 17:19:19 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:07.936 17:19:19 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:07.936 17:19:19 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:07.936 17:19:19 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:07.936 17:19:19 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:07.936 17:19:19 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:07.936 17:19:19 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:07.936 17:19:19 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:07.936 17:19:19 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:07.936 17:19:19 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:07.936 17:19:19 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:07.936 17:19:19 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:07.936 17:19:19 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:07.936 17:19:19 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:07.936 17:19:19 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:07.936 17:19:19 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:08.197 17:19:19 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:08.197 17:19:19 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:08.197 17:19:19 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:08.197 17:19:19 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:08.198 17:19:19 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:08.198 17:19:19 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:08.198 17:19:19 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:08.198 17:19:19 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:08.198 17:19:19 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:08.198 17:19:19 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:08.198 17:19:19 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:08.458 17:19:19 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:08.458 17:19:19 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:08.458 17:19:19 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:08.458 17:19:19 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:08.458 17:19:19 
event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:08.458 17:19:19 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:08.458 17:19:19 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:08.458 17:19:19 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:08.458 17:19:19 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:08.458 17:19:19 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:08.458 17:19:19 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:08.458 17:19:19 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:08.458 17:19:19 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:08.458 17:19:19 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:08.458 17:19:19 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:08.458 17:19:19 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:08.458 17:19:19 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:08.458 17:19:19 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:08.458 17:19:19 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:08.458 17:19:19 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:08.458 17:19:19 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:08.458 17:19:19 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:08.458 17:19:19 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:08.719 17:19:19 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:08.980 [2024-07-15 17:19:20.068651] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:08.980 [2024-07-15 17:19:20.131271] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:08.980 [2024-07-15 17:19:20.131275] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.980 [2024-07-15 17:19:20.161982] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:08.980 [2024-07-15 17:19:20.162019] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:12.282 17:19:22 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:12.282 17:19:22 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:07:12.282 spdk_app_start Round 1 00:07:12.282 17:19:22 event.app_repeat -- event/event.sh@25 -- # waitforlisten 2704984 /var/tmp/spdk-nbd.sock 00:07:12.282 17:19:22 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 2704984 ']' 00:07:12.282 17:19:22 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:12.282 17:19:22 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:12.282 17:19:22 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:12.282 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
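The nbd_dd_data_verify calls traced above reduce to a dd write pass followed by a cmp verify pass against each exported NBD device, with the temp file removed afterwards. A minimal standalone sketch of that round trip, assuming /dev/nbd0 and /dev/nbd1 are already exported by spdk-nbd; the temp-file path here is illustrative, not the harness's nbdrandtest path:

    # write pass: 1 MiB of random data, copied onto each NBD device with O_DIRECT
    nbd_list=(/dev/nbd0 /dev/nbd1)
    tmp_file=/tmp/nbdrandtest.sample    # illustrative location
    dd if=/dev/urandom of="$tmp_file" bs=4096 count=256
    for dev in "${nbd_list[@]}"; do
        dd if="$tmp_file" of="$dev" bs=4096 count=256 oflag=direct
    done
    # verify pass: byte-compare the first 1 MiB of each device against the source file
    for dev in "${nbd_list[@]}"; do
        cmp -b -n 1M "$tmp_file" "$dev"
    done
    rm "$tmp_file"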
00:07:12.282 17:19:22 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:12.282 17:19:22 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:12.282 17:19:23 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:12.282 17:19:23 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:12.282 17:19:23 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:12.282 Malloc0 00:07:12.282 17:19:23 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:12.282 Malloc1 00:07:12.542 17:19:23 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:12.542 17:19:23 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:12.542 17:19:23 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:12.542 17:19:23 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:12.542 17:19:23 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:12.542 17:19:23 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:12.542 17:19:23 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:12.542 17:19:23 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:12.542 17:19:23 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:12.542 17:19:23 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:12.542 17:19:23 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:12.542 17:19:23 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:12.542 17:19:23 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:12.542 17:19:23 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:12.542 17:19:23 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:12.542 17:19:23 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:12.542 /dev/nbd0 00:07:12.542 17:19:23 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:12.542 17:19:23 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:12.542 17:19:23 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:12.542 17:19:23 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:12.542 17:19:23 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:12.542 17:19:23 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:12.542 17:19:23 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:12.542 17:19:23 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:12.542 17:19:23 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:12.542 17:19:23 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:12.542 17:19:23 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 
count=1 iflag=direct 00:07:12.542 1+0 records in 00:07:12.542 1+0 records out 00:07:12.542 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000294015 s, 13.9 MB/s 00:07:12.542 17:19:23 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:12.542 17:19:23 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:12.542 17:19:23 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:12.542 17:19:23 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:12.542 17:19:23 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:12.542 17:19:23 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:12.542 17:19:23 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:12.542 17:19:23 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:12.803 /dev/nbd1 00:07:12.803 17:19:24 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:12.803 17:19:24 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:12.803 17:19:24 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:12.803 17:19:24 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:12.803 17:19:24 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:12.803 17:19:24 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:12.803 17:19:24 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:12.803 17:19:24 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:12.803 17:19:24 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:12.803 17:19:24 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:12.803 17:19:24 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:12.803 1+0 records in 00:07:12.803 1+0 records out 00:07:12.803 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000268302 s, 15.3 MB/s 00:07:12.803 17:19:24 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:12.803 17:19:24 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:12.803 17:19:24 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:12.803 17:19:24 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:12.803 17:19:24 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:12.803 17:19:24 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:12.803 17:19:24 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:12.803 17:19:24 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:12.803 17:19:24 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:12.803 17:19:24 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:13.063 17:19:24 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 
00:07:13.063 { 00:07:13.063 "nbd_device": "/dev/nbd0", 00:07:13.063 "bdev_name": "Malloc0" 00:07:13.063 }, 00:07:13.063 { 00:07:13.063 "nbd_device": "/dev/nbd1", 00:07:13.063 "bdev_name": "Malloc1" 00:07:13.063 } 00:07:13.063 ]' 00:07:13.063 17:19:24 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:13.063 { 00:07:13.063 "nbd_device": "/dev/nbd0", 00:07:13.063 "bdev_name": "Malloc0" 00:07:13.063 }, 00:07:13.063 { 00:07:13.063 "nbd_device": "/dev/nbd1", 00:07:13.063 "bdev_name": "Malloc1" 00:07:13.063 } 00:07:13.063 ]' 00:07:13.063 17:19:24 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:13.063 17:19:24 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:13.063 /dev/nbd1' 00:07:13.063 17:19:24 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:13.063 /dev/nbd1' 00:07:13.063 17:19:24 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:13.063 17:19:24 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:13.063 17:19:24 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:13.063 17:19:24 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:13.063 17:19:24 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:13.063 17:19:24 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:13.063 17:19:24 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:13.063 17:19:24 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:13.063 17:19:24 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:13.063 17:19:24 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:13.063 17:19:24 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:13.063 17:19:24 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:13.063 256+0 records in 00:07:13.063 256+0 records out 00:07:13.063 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0118343 s, 88.6 MB/s 00:07:13.063 17:19:24 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:13.063 17:19:24 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:13.063 256+0 records in 00:07:13.063 256+0 records out 00:07:13.063 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0149084 s, 70.3 MB/s 00:07:13.063 17:19:24 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:13.063 17:19:24 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:13.323 256+0 records in 00:07:13.323 256+0 records out 00:07:13.323 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0160881 s, 65.2 MB/s 00:07:13.323 17:19:24 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:13.323 17:19:24 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:13.323 17:19:24 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:13.323 17:19:24 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:13.323 17:19:24 event.app_repeat -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:13.323 17:19:24 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:13.323 17:19:24 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:13.323 17:19:24 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:13.323 17:19:24 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:13.323 17:19:24 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:13.323 17:19:24 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:13.323 17:19:24 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:13.323 17:19:24 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:13.323 17:19:24 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:13.323 17:19:24 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:13.323 17:19:24 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:13.323 17:19:24 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:13.323 17:19:24 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:13.323 17:19:24 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:13.323 17:19:24 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:13.324 17:19:24 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:13.324 17:19:24 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:13.324 17:19:24 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:13.324 17:19:24 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:13.324 17:19:24 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:13.324 17:19:24 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:13.324 17:19:24 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:13.324 17:19:24 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:13.324 17:19:24 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:13.584 17:19:24 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:13.584 17:19:24 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:13.584 17:19:24 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:13.584 17:19:24 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:13.584 17:19:24 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:13.584 17:19:24 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:13.584 17:19:24 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:13.584 17:19:24 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:13.584 17:19:24 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:13.584 17:19:24 event.app_repeat -- bdev/nbd_common.sh@61 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:07:13.584 17:19:24 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:13.843 17:19:25 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:13.843 17:19:25 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:13.843 17:19:25 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:13.843 17:19:25 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:13.843 17:19:25 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:13.843 17:19:25 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:13.843 17:19:25 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:13.843 17:19:25 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:13.843 17:19:25 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:13.843 17:19:25 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:13.843 17:19:25 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:13.843 17:19:25 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:13.843 17:19:25 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:14.104 17:19:25 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:14.364 [2024-07-15 17:19:25.433218] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:14.364 [2024-07-15 17:19:25.495842] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:14.364 [2024-07-15 17:19:25.495848] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.364 [2024-07-15 17:19:25.527372] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:14.364 [2024-07-15 17:19:25.527407] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:17.689 17:19:28 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:17.689 17:19:28 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:17.689 spdk_app_start Round 2 00:07:17.689 17:19:28 event.app_repeat -- event/event.sh@25 -- # waitforlisten 2704984 /var/tmp/spdk-nbd.sock 00:07:17.689 17:19:28 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 2704984 ']' 00:07:17.689 17:19:28 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:17.689 17:19:28 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:17.689 17:19:28 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:17.689 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
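Each round's teardown, visible again just above, stops every NBD disk over the RPC socket and then polls /proc/partitions until the kernel drops the device; waitfornbd_exit gives up after 20 attempts. A hedged sketch of that sequence, using the same rpc.py and socket paths as the trace; the retry delay is an assumption, since the trace only shows the bounded loop and the break:

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-nbd.sock
    for dev in /dev/nbd0 /dev/nbd1; do
        "$rpc" -s "$sock" nbd_stop_disk "$dev"
        name=$(basename "$dev")
        for ((i = 1; i <= 20; i++)); do
            # stop polling as soon as the device disappears from /proc/partitions
            grep -q -w "$name" /proc/partitions || break
            sleep 0.1    # assumed delay between retries
        done
    done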
00:07:17.689 17:19:28 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:17.689 17:19:28 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:17.689 17:19:28 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:17.689 17:19:28 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:17.689 17:19:28 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:17.689 Malloc0 00:07:17.689 17:19:28 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:17.689 Malloc1 00:07:17.689 17:19:28 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:17.689 17:19:28 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:17.689 17:19:28 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:17.689 17:19:28 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:17.689 17:19:28 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:17.689 17:19:28 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:17.689 17:19:28 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:17.689 17:19:28 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:17.689 17:19:28 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:17.689 17:19:28 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:17.689 17:19:28 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:17.689 17:19:28 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:17.689 17:19:28 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:17.689 17:19:28 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:17.689 17:19:28 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:17.689 17:19:28 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:17.949 /dev/nbd0 00:07:17.949 17:19:29 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:17.949 17:19:29 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:17.949 17:19:29 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:17.949 17:19:29 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:17.949 17:19:29 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:17.949 17:19:29 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:17.949 17:19:29 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:17.949 17:19:29 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:17.949 17:19:29 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:17.949 17:19:29 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:17.949 17:19:29 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 
count=1 iflag=direct 00:07:17.949 1+0 records in 00:07:17.949 1+0 records out 00:07:17.949 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00026804 s, 15.3 MB/s 00:07:17.949 17:19:29 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:17.949 17:19:29 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:17.949 17:19:29 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:17.949 17:19:29 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:17.949 17:19:29 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:17.949 17:19:29 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:17.949 17:19:29 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:17.949 17:19:29 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:18.210 /dev/nbd1 00:07:18.210 17:19:29 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:18.210 17:19:29 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:18.210 17:19:29 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:18.210 17:19:29 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:18.210 17:19:29 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:18.210 17:19:29 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:18.210 17:19:29 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:18.210 17:19:29 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:18.210 17:19:29 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:18.210 17:19:29 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:18.210 17:19:29 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:18.210 1+0 records in 00:07:18.210 1+0 records out 00:07:18.210 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000291401 s, 14.1 MB/s 00:07:18.210 17:19:29 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:18.210 17:19:29 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:18.210 17:19:29 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:18.210 17:19:29 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:18.210 17:19:29 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:18.210 17:19:29 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:18.210 17:19:29 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:18.210 17:19:29 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:18.210 17:19:29 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:18.210 17:19:29 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:18.471 
{ 00:07:18.471 "nbd_device": "/dev/nbd0", 00:07:18.471 "bdev_name": "Malloc0" 00:07:18.471 }, 00:07:18.471 { 00:07:18.471 "nbd_device": "/dev/nbd1", 00:07:18.471 "bdev_name": "Malloc1" 00:07:18.471 } 00:07:18.471 ]' 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:18.471 { 00:07:18.471 "nbd_device": "/dev/nbd0", 00:07:18.471 "bdev_name": "Malloc0" 00:07:18.471 }, 00:07:18.471 { 00:07:18.471 "nbd_device": "/dev/nbd1", 00:07:18.471 "bdev_name": "Malloc1" 00:07:18.471 } 00:07:18.471 ]' 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:18.471 /dev/nbd1' 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:18.471 /dev/nbd1' 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:18.471 256+0 records in 00:07:18.471 256+0 records out 00:07:18.471 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0125253 s, 83.7 MB/s 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:18.471 256+0 records in 00:07:18.471 256+0 records out 00:07:18.471 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0151468 s, 69.2 MB/s 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:18.471 256+0 records in 00:07:18.471 256+0 records out 00:07:18.471 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0162481 s, 64.5 MB/s 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:18.471 17:19:29 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:18.732 17:19:29 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:18.732 17:19:29 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:18.732 17:19:29 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:18.732 17:19:29 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:18.732 17:19:29 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:18.732 17:19:29 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:18.732 17:19:29 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:18.732 17:19:29 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:18.732 17:19:29 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:18.732 17:19:29 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:18.993 17:19:30 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:18.993 17:19:30 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:18.993 17:19:30 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:18.993 17:19:30 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:18.993 17:19:30 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:18.993 17:19:30 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:18.993 17:19:30 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:18.993 17:19:30 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:18.993 17:19:30 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:18.993 17:19:30 event.app_repeat -- bdev/nbd_common.sh@61 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:07:18.993 17:19:30 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:19.254 17:19:30 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:19.254 17:19:30 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:19.254 17:19:30 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:19.254 17:19:30 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:19.254 17:19:30 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:19.254 17:19:30 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:19.254 17:19:30 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:19.254 17:19:30 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:19.254 17:19:30 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:19.254 17:19:30 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:19.254 17:19:30 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:19.254 17:19:30 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:19.254 17:19:30 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:19.514 17:19:30 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:19.514 [2024-07-15 17:19:30.789642] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:19.774 [2024-07-15 17:19:30.851130] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:19.774 [2024-07-15 17:19:30.851135] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.774 [2024-07-15 17:19:30.881837] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:19.774 [2024-07-15 17:19:30.881872] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:23.073 17:19:33 event.app_repeat -- event/event.sh@38 -- # waitforlisten 2704984 /var/tmp/spdk-nbd.sock 00:07:23.073 17:19:33 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 2704984 ']' 00:07:23.073 17:19:33 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:23.073 17:19:33 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:23.073 17:19:33 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:23.073 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
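After both disks are stopped, every round ends with the same sanity check seen above: ask spdk-nbd which devices are still exported and insist the count is zero. Roughly equivalent standalone commands, with the rpc.py and socket paths taken from the trace:

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-nbd.sock
    nbd_disks_json=$("$rpc" -s "$sock" nbd_get_disks)                 # '[]' once teardown succeeded
    nbd_disks_name=$(echo "$nbd_disks_json" | jq -r '.[] | .nbd_device')
    count=$(echo "$nbd_disks_name" | grep -c /dev/nbd || true)        # grep -c prints 0 and exits 1 on no match
    if [ "$count" -ne 0 ]; then
        echo "ERROR: $count NBD device(s) still exported after teardown"
        exit 1
    fi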
00:07:23.073 17:19:33 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:23.073 17:19:33 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:23.073 17:19:33 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:23.073 17:19:33 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:23.073 17:19:33 event.app_repeat -- event/event.sh@39 -- # killprocess 2704984 00:07:23.073 17:19:33 event.app_repeat -- common/autotest_common.sh@948 -- # '[' -z 2704984 ']' 00:07:23.074 17:19:33 event.app_repeat -- common/autotest_common.sh@952 -- # kill -0 2704984 00:07:23.074 17:19:33 event.app_repeat -- common/autotest_common.sh@953 -- # uname 00:07:23.074 17:19:33 event.app_repeat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:23.074 17:19:33 event.app_repeat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2704984 00:07:23.074 17:19:33 event.app_repeat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:23.074 17:19:33 event.app_repeat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:23.074 17:19:33 event.app_repeat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2704984' 00:07:23.074 killing process with pid 2704984 00:07:23.074 17:19:33 event.app_repeat -- common/autotest_common.sh@967 -- # kill 2704984 00:07:23.074 17:19:33 event.app_repeat -- common/autotest_common.sh@972 -- # wait 2704984 00:07:23.074 spdk_app_start is called in Round 0. 00:07:23.074 Shutdown signal received, stop current app iteration 00:07:23.074 Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 reinitialization... 00:07:23.074 spdk_app_start is called in Round 1. 00:07:23.074 Shutdown signal received, stop current app iteration 00:07:23.074 Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 reinitialization... 00:07:23.074 spdk_app_start is called in Round 2. 00:07:23.074 Shutdown signal received, stop current app iteration 00:07:23.074 Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 reinitialization... 00:07:23.074 spdk_app_start is called in Round 3. 
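Taken together, the three rounds above follow one repeated sequence: wait for the app to listen on the NBD socket, create two malloc bdevs (bdev_malloc_create 64 4096), run the NBD write/verify cycle, then send SIGTERM so app_repeat can start the next round. A condensed, hedged sketch of that per-round flow; waitforlisten and nbd_rpc_data_verify are harness helpers referenced by name, and $app_pid stands in for the pid shown in the trace:

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
    for round in 0 1 2; do
        echo "spdk_app_start Round $round"
        waitforlisten "$app_pid" /var/tmp/spdk-nbd.sock
        $rpc bdev_malloc_create 64 4096      # -> Malloc0
        $rpc bdev_malloc_create 64 4096      # -> Malloc1
        nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
        $rpc spdk_kill_instance SIGTERM      # app_repeat handles the signal and begins the next round
        sleep 3
    done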
00:07:23.074 Shutdown signal received, stop current app iteration 00:07:23.074 17:19:34 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:07:23.074 17:19:34 event.app_repeat -- event/event.sh@42 -- # return 0 00:07:23.074 00:07:23.074 real 0m17.129s 00:07:23.074 user 0m38.034s 00:07:23.074 sys 0m2.432s 00:07:23.074 17:19:34 event.app_repeat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:23.074 17:19:34 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:23.074 ************************************ 00:07:23.074 END TEST app_repeat 00:07:23.074 ************************************ 00:07:23.074 17:19:34 event -- common/autotest_common.sh@1142 -- # return 0 00:07:23.074 17:19:34 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:07:23.074 00:07:23.074 real 0m27.335s 00:07:23.074 user 0m57.027s 00:07:23.074 sys 0m3.551s 00:07:23.074 17:19:34 event -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:23.074 17:19:34 event -- common/autotest_common.sh@10 -- # set +x 00:07:23.074 ************************************ 00:07:23.074 END TEST event 00:07:23.074 ************************************ 00:07:23.074 17:19:34 -- common/autotest_common.sh@1142 -- # return 0 00:07:23.074 17:19:34 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:07:23.074 17:19:34 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:23.074 17:19:34 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:23.074 17:19:34 -- common/autotest_common.sh@10 -- # set +x 00:07:23.074 ************************************ 00:07:23.074 START TEST thread 00:07:23.074 ************************************ 00:07:23.074 17:19:34 thread -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:07:23.074 * Looking for test storage... 00:07:23.074 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:07:23.074 17:19:34 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:23.074 17:19:34 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:23.074 17:19:34 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:23.074 17:19:34 thread -- common/autotest_common.sh@10 -- # set +x 00:07:23.074 ************************************ 00:07:23.074 START TEST thread_poller_perf 00:07:23.074 ************************************ 00:07:23.074 17:19:34 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:23.074 [2024-07-15 17:19:34.330518] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:07:23.074 [2024-07-15 17:19:34.330594] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2708254 ] 00:07:23.333 [2024-07-15 17:19:34.424703] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:23.333 [2024-07-15 17:19:34.500500] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.333 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:07:24.271 ====================================== 00:07:24.271 busy:2614753588 (cyc) 00:07:24.271 total_run_count: 312000 00:07:24.271 tsc_hz: 2600000000 (cyc) 00:07:24.271 ====================================== 00:07:24.271 poller_cost: 8380 (cyc), 3223 (nsec) 00:07:24.271 00:07:24.271 real 0m1.256s 00:07:24.271 user 0m1.152s 00:07:24.271 sys 0m0.099s 00:07:24.271 17:19:35 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:24.271 17:19:35 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:24.271 ************************************ 00:07:24.271 END TEST thread_poller_perf 00:07:24.271 ************************************ 00:07:24.530 17:19:35 thread -- common/autotest_common.sh@1142 -- # return 0 00:07:24.530 17:19:35 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:24.530 17:19:35 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:24.530 17:19:35 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:24.530 17:19:35 thread -- common/autotest_common.sh@10 -- # set +x 00:07:24.530 ************************************ 00:07:24.530 START TEST thread_poller_perf 00:07:24.530 ************************************ 00:07:24.530 17:19:35 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:24.530 [2024-07-15 17:19:35.646802] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:07:24.530 [2024-07-15 17:19:35.646852] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2708518 ] 00:07:24.530 [2024-07-15 17:19:35.738652] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:24.530 [2024-07-15 17:19:35.814835] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.530 Running 1000 pollers for 1 seconds with 0 microseconds period. 
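The poller_cost figures in the summary above follow directly from the reported counters: busy cycles divided by total_run_count gives the per-call cost in cycles, and dividing by tsc_hz converts that to nanoseconds. A quick shell check for the 1-microsecond-period run; integer rounding here may differ slightly from the tool's own output:

    busy=2614753588      # busy cycles for the 1 us period run
    runs=312000          # total_run_count
    tsc_hz=2600000000
    echo $(( busy / runs ))                          # 8380 -> poller_cost in cycles
    echo $(( busy / runs * 1000000000 / tsc_hz ))    # 3223 -> poller_cost in nsec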
00:07:25.909 ====================================== 00:07:25.909 busy:2602354400 (cyc) 00:07:25.909 total_run_count: 4129000 00:07:25.909 tsc_hz: 2600000000 (cyc) 00:07:25.909 ====================================== 00:07:25.909 poller_cost: 630 (cyc), 242 (nsec) 00:07:25.909 00:07:25.909 real 0m1.232s 00:07:25.909 user 0m1.139s 00:07:25.909 sys 0m0.090s 00:07:25.909 17:19:36 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:25.909 17:19:36 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:25.909 ************************************ 00:07:25.909 END TEST thread_poller_perf 00:07:25.909 ************************************ 00:07:25.909 17:19:36 thread -- common/autotest_common.sh@1142 -- # return 0 00:07:25.909 17:19:36 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:07:25.909 00:07:25.909 real 0m2.732s 00:07:25.909 user 0m2.403s 00:07:25.909 sys 0m0.334s 00:07:25.909 17:19:36 thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:25.909 17:19:36 thread -- common/autotest_common.sh@10 -- # set +x 00:07:25.909 ************************************ 00:07:25.909 END TEST thread 00:07:25.909 ************************************ 00:07:25.909 17:19:36 -- common/autotest_common.sh@1142 -- # return 0 00:07:25.909 17:19:36 -- spdk/autotest.sh@183 -- # run_test accel /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:07:25.909 17:19:36 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:25.909 17:19:36 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:25.909 17:19:36 -- common/autotest_common.sh@10 -- # set +x 00:07:25.909 ************************************ 00:07:25.909 START TEST accel 00:07:25.909 ************************************ 00:07:25.909 17:19:36 accel -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:07:25.909 * Looking for test storage... 00:07:25.909 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:07:25.909 17:19:37 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:07:25.909 17:19:37 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:07:25.909 17:19:37 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:25.909 17:19:37 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=2708687 00:07:25.909 17:19:37 accel -- accel/accel.sh@63 -- # waitforlisten 2708687 00:07:25.909 17:19:37 accel -- common/autotest_common.sh@829 -- # '[' -z 2708687 ']' 00:07:25.909 17:19:37 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:25.909 17:19:37 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:25.909 17:19:37 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:25.909 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
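Once spdk_tgt is up on /var/tmp/spdk.sock, accel.sh builds its expected_opcs map by asking the target which module each opcode is assigned to; in this run every opcode resolves to the software module, as the trace below shows. A sketch of that query pattern: the rpc.py path matches the rest of this log, and feeding each key=value pair to read via a herestring is inferred from the trace rather than confirmed:

    rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    declare -A expected_opcs
    exp_opcs=($("$rpc_py" accel_get_opc_assignments \
            | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'))
    for opc_opt in "${exp_opcs[@]}"; do
        IFS== read -r opc module <<< "$opc_opt"    # e.g. "copy=software" -> opc=copy, module=software
        expected_opcs["$opc"]=$module
    done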
00:07:25.909 17:19:37 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:07:25.909 17:19:37 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:25.909 17:19:37 accel -- common/autotest_common.sh@10 -- # set +x 00:07:25.909 17:19:37 accel -- accel/accel.sh@61 -- # build_accel_config 00:07:25.909 17:19:37 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:25.909 17:19:37 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:25.909 17:19:37 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:25.909 17:19:37 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:25.909 17:19:37 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:25.909 17:19:37 accel -- accel/accel.sh@40 -- # local IFS=, 00:07:25.909 17:19:37 accel -- accel/accel.sh@41 -- # jq -r . 00:07:25.909 [2024-07-15 17:19:37.144839] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:07:25.909 [2024-07-15 17:19:37.144914] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2708687 ] 00:07:26.180 [2024-07-15 17:19:37.235094] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:26.180 [2024-07-15 17:19:37.303812] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.752 17:19:37 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:26.752 17:19:37 accel -- common/autotest_common.sh@862 -- # return 0 00:07:26.752 17:19:37 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:07:26.752 17:19:37 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:07:26.752 17:19:37 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:07:26.752 17:19:37 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:07:26.752 17:19:37 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:07:26.752 17:19:37 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:07:26.752 17:19:37 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:07:26.752 17:19:37 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:26.752 17:19:37 accel -- common/autotest_common.sh@10 -- # set +x 00:07:26.752 17:19:37 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:26.752 17:19:38 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # IFS== 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:26.752 17:19:38 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:26.752 17:19:38 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # IFS== 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:26.752 17:19:38 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:26.752 17:19:38 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # IFS== 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:26.752 17:19:38 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:26.752 17:19:38 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # IFS== 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:26.752 17:19:38 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:26.752 17:19:38 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # IFS== 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:26.752 17:19:38 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:26.752 17:19:38 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # IFS== 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:26.752 17:19:38 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:26.752 17:19:38 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # IFS== 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:26.752 17:19:38 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:26.752 17:19:38 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # IFS== 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:26.752 17:19:38 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:26.752 17:19:38 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # IFS== 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:26.752 17:19:38 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:26.752 17:19:38 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # IFS== 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:26.752 17:19:38 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:26.752 17:19:38 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # IFS== 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:26.752 
17:19:38 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:26.752 17:19:38 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # IFS== 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:26.752 17:19:38 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:26.752 17:19:38 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # IFS== 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:26.752 17:19:38 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:26.752 17:19:38 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # IFS== 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:26.752 17:19:38 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:26.752 17:19:38 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # IFS== 00:07:26.752 17:19:38 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:26.752 17:19:38 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:26.752 17:19:38 accel -- accel/accel.sh@75 -- # killprocess 2708687 00:07:26.752 17:19:38 accel -- common/autotest_common.sh@948 -- # '[' -z 2708687 ']' 00:07:26.752 17:19:38 accel -- common/autotest_common.sh@952 -- # kill -0 2708687 00:07:26.752 17:19:38 accel -- common/autotest_common.sh@953 -- # uname 00:07:26.752 17:19:38 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:26.752 17:19:38 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2708687 00:07:27.012 17:19:38 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:27.012 17:19:38 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:27.012 17:19:38 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2708687' 00:07:27.012 killing process with pid 2708687 00:07:27.012 17:19:38 accel -- common/autotest_common.sh@967 -- # kill 2708687 00:07:27.012 17:19:38 accel -- common/autotest_common.sh@972 -- # wait 2708687 00:07:27.012 17:19:38 accel -- accel/accel.sh@76 -- # trap - ERR 00:07:27.012 17:19:38 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:07:27.012 17:19:38 accel -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:07:27.012 17:19:38 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:27.012 17:19:38 accel -- common/autotest_common.sh@10 -- # set +x 00:07:27.272 17:19:38 accel.accel_help -- common/autotest_common.sh@1123 -- # accel_perf -h 00:07:27.272 17:19:38 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:07:27.272 17:19:38 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:07:27.272 17:19:38 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:27.272 17:19:38 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:27.272 17:19:38 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:27.272 17:19:38 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:27.272 17:19:38 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:27.272 17:19:38 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:07:27.272 17:19:38 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
00:07:27.272 17:19:38 accel.accel_help -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:27.272 17:19:38 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:07:27.272 17:19:38 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:27.272 17:19:38 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:07:27.272 17:19:38 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:27.272 17:19:38 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:27.272 17:19:38 accel -- common/autotest_common.sh@10 -- # set +x 00:07:27.272 ************************************ 00:07:27.272 START TEST accel_missing_filename 00:07:27.272 ************************************ 00:07:27.272 17:19:38 accel.accel_missing_filename -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress 00:07:27.272 17:19:38 accel.accel_missing_filename -- common/autotest_common.sh@648 -- # local es=0 00:07:27.272 17:19:38 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress 00:07:27.272 17:19:38 accel.accel_missing_filename -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:27.272 17:19:38 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:27.272 17:19:38 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:27.272 17:19:38 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:27.272 17:19:38 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress 00:07:27.272 17:19:38 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:07:27.272 17:19:38 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:07:27.272 17:19:38 accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:27.272 17:19:38 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:27.272 17:19:38 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:27.272 17:19:38 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:27.272 17:19:38 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:27.273 17:19:38 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:07:27.273 17:19:38 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:07:27.273 [2024-07-15 17:19:38.469998] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:07:27.273 [2024-07-15 17:19:38.470123] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2708988 ] 00:07:27.532 [2024-07-15 17:19:38.614370] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:27.532 [2024-07-15 17:19:38.689586] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.533 [2024-07-15 17:19:38.737912] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:27.533 [2024-07-15 17:19:38.775001] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:07:27.533 A filename is required. 
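The accel_missing_filename case above deliberately leaves out the -l input file that the compress workload needs and wraps accel_perf in the suite's NOT helper, so the test only passes when the tool exits non-zero ("A filename is required."). A simplified sketch of that inverted-status pattern, assuming a plain wrapper rather than the fuller exit-code normalization (the es=234/106/1 handling visible below) that the real autotest_common.sh performs:

    #!/usr/bin/env bash
    # Simplified negative-test helper: succeed only when the wrapped command fails.
    # The real NOT() in autotest_common.sh also maps signal exit codes (>128),
    # which is omitted here.
    NOT() {
        if "$@"; then
            echo "expected failure, but '$*' succeeded" >&2
            return 1
        fi
        return 0
    }

    # As used in the log (accel_perf must be built first):
    # NOT ./build/examples/accel_perf -t 1 -w compress   # fails: no -l input file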
00:07:27.533 17:19:38 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # es=234 00:07:27.533 17:19:38 accel.accel_missing_filename -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:27.533 17:19:38 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # es=106 00:07:27.533 17:19:38 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # case "$es" in 00:07:27.533 17:19:38 accel.accel_missing_filename -- common/autotest_common.sh@668 -- # es=1 00:07:27.533 17:19:38 accel.accel_missing_filename -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:27.533 00:07:27.533 real 0m0.401s 00:07:27.533 user 0m0.243s 00:07:27.533 sys 0m0.184s 00:07:27.533 17:19:38 accel.accel_missing_filename -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:27.533 17:19:38 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:07:27.533 ************************************ 00:07:27.533 END TEST accel_missing_filename 00:07:27.533 ************************************ 00:07:27.794 17:19:38 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:27.794 17:19:38 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:27.794 17:19:38 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:07:27.794 17:19:38 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:27.794 17:19:38 accel -- common/autotest_common.sh@10 -- # set +x 00:07:27.794 ************************************ 00:07:27.794 START TEST accel_compress_verify 00:07:27.794 ************************************ 00:07:27.794 17:19:38 accel.accel_compress_verify -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:27.794 17:19:38 accel.accel_compress_verify -- common/autotest_common.sh@648 -- # local es=0 00:07:27.794 17:19:38 accel.accel_compress_verify -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:27.794 17:19:38 accel.accel_compress_verify -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:27.795 17:19:38 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:27.795 17:19:38 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:27.795 17:19:38 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:27.795 17:19:38 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:27.795 17:19:38 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:27.795 17:19:38 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:07:27.795 17:19:38 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:27.795 17:19:38 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:27.795 17:19:38 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:27.795 17:19:38 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:27.795 17:19:38 
accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:27.795 17:19:38 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:07:27.795 17:19:38 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:07:27.795 [2024-07-15 17:19:38.919780] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:07:27.795 [2024-07-15 17:19:38.919845] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2709042 ] 00:07:27.795 [2024-07-15 17:19:39.010027] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:27.795 [2024-07-15 17:19:39.084781] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:28.055 [2024-07-15 17:19:39.135423] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:28.055 [2024-07-15 17:19:39.172578] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:07:28.055 00:07:28.055 Compression does not support the verify option, aborting. 00:07:28.055 17:19:39 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # es=161 00:07:28.055 17:19:39 accel.accel_compress_verify -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:28.055 17:19:39 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # es=33 00:07:28.055 17:19:39 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # case "$es" in 00:07:28.055 17:19:39 accel.accel_compress_verify -- common/autotest_common.sh@668 -- # es=1 00:07:28.055 17:19:39 accel.accel_compress_verify -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:28.056 00:07:28.056 real 0m0.340s 00:07:28.056 user 0m0.231s 00:07:28.056 sys 0m0.134s 00:07:28.056 17:19:39 accel.accel_compress_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:28.056 17:19:39 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:07:28.056 ************************************ 00:07:28.056 END TEST accel_compress_verify 00:07:28.056 ************************************ 00:07:28.056 17:19:39 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:28.056 17:19:39 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:07:28.056 17:19:39 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:28.056 17:19:39 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:28.056 17:19:39 accel -- common/autotest_common.sh@10 -- # set +x 00:07:28.056 ************************************ 00:07:28.056 START TEST accel_wrong_workload 00:07:28.056 ************************************ 00:07:28.056 17:19:39 accel.accel_wrong_workload -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w foobar 00:07:28.056 17:19:39 accel.accel_wrong_workload -- common/autotest_common.sh@648 -- # local es=0 00:07:28.056 17:19:39 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:07:28.056 17:19:39 accel.accel_wrong_workload -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:28.056 17:19:39 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:28.056 17:19:39 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:28.056 17:19:39 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:07:28.056 17:19:39 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w foobar 00:07:28.056 17:19:39 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:07:28.056 17:19:39 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:07:28.056 17:19:39 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:28.056 17:19:39 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:28.056 17:19:39 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:28.056 17:19:39 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:28.056 17:19:39 accel.accel_wrong_workload -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:28.056 17:19:39 accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=, 00:07:28.056 17:19:39 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:07:28.056 Unsupported workload type: foobar 00:07:28.056 [2024-07-15 17:19:39.330384] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:07:28.056 accel_perf options: 00:07:28.056 [-h help message] 00:07:28.056 [-q queue depth per core] 00:07:28.056 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:07:28.056 [-T number of threads per core 00:07:28.056 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:07:28.056 [-t time in seconds] 00:07:28.056 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:07:28.056 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:07:28.056 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:07:28.056 [-l for compress/decompress workloads, name of uncompressed input file 00:07:28.056 [-S for crc32c workload, use this seed value (default 0) 00:07:28.056 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:07:28.056 [-f for fill workload, use this BYTE value (default 255) 00:07:28.056 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:07:28.056 [-y verify result if this switch is on] 00:07:28.056 [-a tasks to allocate per core (default: same value as -q)] 00:07:28.056 Can be used to spread operations across a wider range of memory. 
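The usage text above, printed when the bogus foobar workload is rejected, lists every flag accel_perf understands; for reference, well-formed invocations shaped like the ones this suite runs would look roughly as follows (relative path assumed from the SPDK checkout used in this job):

    # crc32c with a 32-bit seed and result verification, as in the accel_crc32c test below
    ./build/examples/accel_perf -t 1 -w crc32c -S 32 -y

    # xor requires at least two source buffers, so -x must be >= 2
    ./build/examples/accel_perf -t 1 -w xor -y -x 2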
00:07:28.056 17:19:39 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # es=1 00:07:28.056 17:19:39 accel.accel_wrong_workload -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:28.056 17:19:39 accel.accel_wrong_workload -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:28.056 17:19:39 accel.accel_wrong_workload -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:28.056 00:07:28.056 real 0m0.042s 00:07:28.056 user 0m0.055s 00:07:28.056 sys 0m0.014s 00:07:28.056 17:19:39 accel.accel_wrong_workload -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:28.056 17:19:39 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:07:28.056 ************************************ 00:07:28.056 END TEST accel_wrong_workload 00:07:28.056 ************************************ 00:07:28.317 17:19:39 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:28.317 17:19:39 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:07:28.317 17:19:39 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:07:28.317 17:19:39 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:28.317 17:19:39 accel -- common/autotest_common.sh@10 -- # set +x 00:07:28.317 ************************************ 00:07:28.317 START TEST accel_negative_buffers 00:07:28.317 ************************************ 00:07:28.317 17:19:39 accel.accel_negative_buffers -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:07:28.317 17:19:39 accel.accel_negative_buffers -- common/autotest_common.sh@648 -- # local es=0 00:07:28.317 17:19:39 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:07:28.317 17:19:39 accel.accel_negative_buffers -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:28.317 17:19:39 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:28.317 17:19:39 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:28.317 17:19:39 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:28.317 17:19:39 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w xor -y -x -1 00:07:28.317 17:19:39 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:07:28.317 17:19:39 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:07:28.317 17:19:39 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:28.317 17:19:39 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:28.317 17:19:39 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:28.317 17:19:39 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:28.317 17:19:39 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:28.317 17:19:39 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:07:28.317 17:19:39 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:07:28.317 -x option must be non-negative. 
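Every accel_perf invocation in this trace carries -c /dev/fd/62, the file-descriptor path that bash process substitution produces, meaning the JSON accel configuration assembled by build_accel_config is streamed to the tool rather than written to disk. A minimal sketch of that pattern; the JSON body here is a placeholder, not the actual SPDK accel config schema:

    #!/usr/bin/env bash
    # Feed a generated JSON config to a tool via process substitution,
    # which is what '-c /dev/fd/62' in the trace corresponds to.
    gen_config() {
        printf '%s\n' '{}' | jq -r .   # placeholder config, pretty-printed by jq
    }

    some_tool() { cat "$1"; }          # stand-in for "accel_perf -c <config>"

    some_tool <(gen_config)            # the tool receives a /dev/fd/NN path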
00:07:28.317 [2024-07-15 17:19:39.447585] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:07:28.317 accel_perf options: 00:07:28.317 [-h help message] 00:07:28.317 [-q queue depth per core] 00:07:28.317 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:07:28.317 [-T number of threads per core 00:07:28.317 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:07:28.317 [-t time in seconds] 00:07:28.317 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:07:28.317 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:07:28.317 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:07:28.317 [-l for compress/decompress workloads, name of uncompressed input file 00:07:28.317 [-S for crc32c workload, use this seed value (default 0) 00:07:28.317 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:07:28.317 [-f for fill workload, use this BYTE value (default 255) 00:07:28.317 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:07:28.317 [-y verify result if this switch is on] 00:07:28.317 [-a tasks to allocate per core (default: same value as -q)] 00:07:28.317 Can be used to spread operations across a wider range of memory. 00:07:28.317 17:19:39 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # es=1 00:07:28.317 17:19:39 accel.accel_negative_buffers -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:28.317 17:19:39 accel.accel_negative_buffers -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:28.317 17:19:39 accel.accel_negative_buffers -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:28.317 00:07:28.317 real 0m0.042s 00:07:28.317 user 0m0.030s 00:07:28.317 sys 0m0.011s 00:07:28.317 17:19:39 accel.accel_negative_buffers -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:28.317 17:19:39 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:07:28.317 ************************************ 00:07:28.317 END TEST accel_negative_buffers 00:07:28.317 ************************************ 00:07:28.317 Error: writing output failed: Broken pipe 00:07:28.317 17:19:39 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:28.317 17:19:39 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:07:28.317 17:19:39 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:28.317 17:19:39 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:28.317 17:19:39 accel -- common/autotest_common.sh@10 -- # set +x 00:07:28.317 ************************************ 00:07:28.317 START TEST accel_crc32c 00:07:28.317 ************************************ 00:07:28.318 17:19:39 accel.accel_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -S 32 -y 00:07:28.318 17:19:39 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:07:28.318 17:19:39 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:07:28.318 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:28.318 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:28.318 17:19:39 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:07:28.318 17:19:39 accel.accel_crc32c -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:07:28.318 17:19:39 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:07:28.318 17:19:39 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:28.318 17:19:39 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:28.318 17:19:39 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:28.318 17:19:39 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:28.318 17:19:39 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:28.318 17:19:39 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:07:28.318 17:19:39 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:07:28.318 [2024-07-15 17:19:39.565064] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:07:28.318 [2024-07-15 17:19:39.565127] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2709365 ] 00:07:28.579 [2024-07-15 17:19:39.656470] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:28.579 [2024-07-15 17:19:39.733438] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:28.579 17:19:39 accel.accel_crc32c -- 
accel/accel.sh@19 -- # IFS=: 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:28.579 17:19:39 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:28.580 17:19:39 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:28.580 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:28.580 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:28.580 17:19:39 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:28.580 17:19:39 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:28.580 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:28.580 17:19:39 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:29.964 17:19:40 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:29.964 17:19:40 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:29.964 17:19:40 accel.accel_crc32c -- 
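The val= / case "$var" run in this stretch is the accel_test helper stepping through the settings accel_perf echoes back (opcode crc32c, seed 32, 4096-byte transfers, software module, queue depth 32, 1-second run) and keeping the fields it asserts on at the end of the test. A condensed sketch of that parse loop; the field names in the sample input are assumptions for illustration, the real loop reads accel_perf's own output:

    #!/usr/bin/env bash
    # Walk "key: value" lines from a perf tool and capture the checked fields.
    sample_output=$'workload: crc32c\nmodule: software\ntransfer size: 4096 bytes\nrun time: 1 seconds'

    accel_opc='' accel_module=''
    while IFS=: read -r var val; do
        val=${val# }                      # trim the space after ':'
        case "$var" in
            workload) accel_opc=$val ;;
            module)   accel_module=$val ;;
        esac
    done <<< "$sample_output"

    echo "opcode=$accel_opc module=$accel_module"   # -> opcode=crc32c module=software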
accel/accel.sh@19 -- # IFS=: 00:07:29.964 17:19:40 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:29.964 17:19:40 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:29.964 17:19:40 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:29.964 17:19:40 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:29.964 17:19:40 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:29.964 17:19:40 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:29.964 17:19:40 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:29.964 17:19:40 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:29.964 17:19:40 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:29.964 17:19:40 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:29.964 17:19:40 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:29.964 17:19:40 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:29.964 17:19:40 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:29.964 17:19:40 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:29.964 17:19:40 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:29.964 17:19:40 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:29.964 17:19:40 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:29.964 17:19:40 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:29.964 17:19:40 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:29.964 17:19:40 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:29.964 17:19:40 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:29.964 17:19:40 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:29.964 17:19:40 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:29.964 17:19:40 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:29.964 00:07:29.964 real 0m1.338s 00:07:29.964 user 0m1.207s 00:07:29.964 sys 0m0.135s 00:07:29.964 17:19:40 accel.accel_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:29.964 17:19:40 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:07:29.964 ************************************ 00:07:29.964 END TEST accel_crc32c 00:07:29.964 ************************************ 00:07:29.964 17:19:40 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:29.964 17:19:40 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:07:29.964 17:19:40 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:29.964 17:19:40 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:29.964 17:19:40 accel -- common/autotest_common.sh@10 -- # set +x 00:07:29.964 ************************************ 00:07:29.964 START TEST accel_crc32c_C2 00:07:29.964 ************************************ 00:07:29.964 17:19:40 accel.accel_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -y -C 2 00:07:29.964 17:19:40 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:07:29.964 17:19:40 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:07:29.964 17:19:40 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:29.964 17:19:40 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:29.964 17:19:40 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:07:29.964 17:19:40 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:07:29.964 17:19:40 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:07:29.964 17:19:40 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:29.964 17:19:40 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:29.964 17:19:40 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:29.964 17:19:40 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:29.964 17:19:40 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:29.964 17:19:40 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:07:29.964 17:19:40 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:07:29.964 [2024-07-15 17:19:40.977379] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:07:29.964 [2024-07-15 17:19:40.977433] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2709548 ] 00:07:29.964 [2024-07-15 17:19:41.066665] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:29.964 [2024-07-15 17:19:41.143243] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:07:29.964 17:19:41 
accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:29.964 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:29.965 17:19:41 accel.accel_crc32c_C2 -- accel/accel.sh@19 
-- # read -r var val 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:31.347 00:07:31.347 real 0m1.344s 00:07:31.347 user 0m1.211s 00:07:31.347 sys 0m0.129s 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:31.347 17:19:42 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:07:31.347 ************************************ 00:07:31.347 END TEST accel_crc32c_C2 00:07:31.347 ************************************ 00:07:31.347 17:19:42 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:31.347 17:19:42 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:07:31.347 17:19:42 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:31.347 17:19:42 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:31.347 17:19:42 accel -- common/autotest_common.sh@10 -- # set +x 00:07:31.347 ************************************ 00:07:31.347 START TEST accel_copy 00:07:31.347 ************************************ 00:07:31.347 17:19:42 accel.accel_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy -y 00:07:31.347 17:19:42 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:07:31.347 17:19:42 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:07:31.347 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:31.347 
17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:31.347 17:19:42 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:07:31.347 17:19:42 accel.accel_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:07:31.347 17:19:42 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:07:31.347 17:19:42 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:31.347 17:19:42 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:31.347 17:19:42 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:31.347 17:19:42 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:31.347 17:19:42 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:31.347 17:19:42 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:07:31.347 17:19:42 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:07:31.347 [2024-07-15 17:19:42.393144] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:07:31.347 [2024-07-15 17:19:42.393203] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2709740 ] 00:07:31.347 [2024-07-15 17:19:42.462030] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:31.347 [2024-07-15 17:19:42.524568] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.347 17:19:42 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:31.347 17:19:42 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:31.347 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:31.347 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@20 -- # 
val='4096 bytes' 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:31.348 17:19:42 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:32.731 17:19:43 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:32.731 17:19:43 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:32.731 17:19:43 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:32.731 17:19:43 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:32.731 17:19:43 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:32.731 17:19:43 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:32.731 17:19:43 
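Each case in this log is launched through run_test, which is what produces the starred START TEST / END TEST banners and the real/user/sys timing lines between them. A stripped-down sketch of such a wrapper; the banner width and wording are approximations, and the real helper in autotest_common.sh additionally manages xtrace and error propagation:

    #!/usr/bin/env bash
    # run_test-style wrapper: banner, timed execution, banner, preserve exit code.
    run_test() {
        local name=$1 rc
        shift
        echo "************************************"
        echo "START TEST $name"
        echo "************************************"
        time "$@"
        rc=$?
        echo "************************************"
        echo "END TEST $name"
        echo "************************************"
        return "$rc"
    }

    # run_test accel_copy accel_test -t 1 -w copy -y   # as invoked in the trace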
accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:32.731 17:19:43 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:32.731 17:19:43 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:32.731 17:19:43 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:32.731 17:19:43 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:32.731 17:19:43 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:32.731 17:19:43 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:32.731 17:19:43 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:32.731 17:19:43 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:32.731 17:19:43 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:32.731 17:19:43 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:32.731 17:19:43 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:32.731 17:19:43 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:32.731 17:19:43 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:32.731 17:19:43 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:32.731 17:19:43 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:32.731 17:19:43 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:32.731 17:19:43 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:32.731 17:19:43 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:32.731 17:19:43 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:07:32.731 17:19:43 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:32.731 00:07:32.731 real 0m1.304s 00:07:32.731 user 0m1.194s 00:07:32.731 sys 0m0.108s 00:07:32.731 17:19:43 accel.accel_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:32.731 17:19:43 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:07:32.731 ************************************ 00:07:32.731 END TEST accel_copy 00:07:32.731 ************************************ 00:07:32.731 17:19:43 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:32.731 17:19:43 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:32.731 17:19:43 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:07:32.731 17:19:43 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:32.731 17:19:43 accel -- common/autotest_common.sh@10 -- # set +x 00:07:32.731 ************************************ 00:07:32.731 START TEST accel_fill 00:07:32.731 ************************************ 00:07:32.731 17:19:43 accel.accel_fill -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 
]] 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 00:07:32.731 [2024-07-15 17:19:43.741768] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:07:32.731 [2024-07-15 17:19:43.741817] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2710040 ] 00:07:32.731 [2024-07-15 17:19:43.832656] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:32.731 [2024-07-15 17:19:43.907687] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:32.731 
17:19:43 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:32.731 17:19:43 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:32.732 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:32.732 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:32.732 17:19:43 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:32.732 17:19:43 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:32.732 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:32.732 17:19:43 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:34.164 17:19:45 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:34.164 17:19:45 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:34.164 17:19:45 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:34.164 17:19:45 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:34.164 17:19:45 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:34.164 17:19:45 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:34.164 17:19:45 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:34.164 17:19:45 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:34.164 17:19:45 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:34.164 17:19:45 accel.accel_fill -- accel/accel.sh@21 
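The accel_fill case above drives the fill opcode with byte value 128, queue depth 64 and 64 pre-allocated tasks; stripped of the generated -c config, the equivalent direct invocation built only from flags shown in this log would be roughly:

    # fill 4 KiB buffers with byte 128 (0x80), queue depth 64, 64 tasks, verify result
    ./build/examples/accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y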
-- # case "$var" in 00:07:34.164 17:19:45 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:34.164 17:19:45 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:34.164 17:19:45 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:34.164 17:19:45 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:34.164 17:19:45 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:34.164 17:19:45 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:34.164 17:19:45 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:34.164 17:19:45 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:34.164 17:19:45 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:34.164 17:19:45 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:34.164 17:19:45 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:34.164 17:19:45 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:34.164 17:19:45 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:34.165 17:19:45 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:34.165 17:19:45 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:34.165 17:19:45 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:07:34.165 17:19:45 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:34.165 00:07:34.165 real 0m1.318s 00:07:34.165 user 0m1.199s 00:07:34.165 sys 0m0.124s 00:07:34.165 17:19:45 accel.accel_fill -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:34.165 17:19:45 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:07:34.165 ************************************ 00:07:34.165 END TEST accel_fill 00:07:34.165 ************************************ 00:07:34.165 17:19:45 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:34.165 17:19:45 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:07:34.165 17:19:45 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:34.165 17:19:45 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:34.165 17:19:45 accel -- common/autotest_common.sh@10 -- # set +x 00:07:34.165 ************************************ 00:07:34.165 START TEST accel_copy_crc32c 00:07:34.165 ************************************ 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- 
accel/accel.sh@36 -- # [[ -n '' ]] 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:07:34.165 [2024-07-15 17:19:45.146715] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:07:34.165 [2024-07-15 17:19:45.146780] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2710356 ] 00:07:34.165 [2024-07-15 17:19:45.236220] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:34.165 [2024-07-15 17:19:45.312578] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- 
accel/accel.sh@19 -- # read -r var val 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=software 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:34.165 17:19:45 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 
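The copy_crc32c case being configured here is driven by the accel_perf command recorded a few entries back: /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y, with a JSON config apparently supplied over /dev/fd/62 by the build_accel_config step traced above. A minimal sketch of reproducing the same run outside the CI wrapper, assuming SPDK has already been built with its examples and that accel_perf falls back to the software module when no -c config is supplied:

    # 1-second software copy+crc32c run with result verification (-y)
    cd /var/jenkins/workspace/crypto-phy-autotest/spdk
    ./build/examples/accel_perf -t 1 -w copy_crc32c -y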
00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:35.546 00:07:35.546 real 0m1.333s 00:07:35.546 user 0m1.208s 00:07:35.546 sys 0m0.131s 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:35.546 17:19:46 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:07:35.546 ************************************ 00:07:35.546 END TEST accel_copy_crc32c 00:07:35.546 ************************************ 00:07:35.546 17:19:46 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:35.546 17:19:46 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:07:35.546 17:19:46 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:35.546 17:19:46 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:35.546 17:19:46 accel -- common/autotest_common.sh@10 -- # set +x 00:07:35.546 ************************************ 00:07:35.546 START TEST accel_copy_crc32c_C2 00:07:35.546 ************************************ 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:07:35.546 17:19:46 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:07:35.546 [2024-07-15 17:19:46.555540] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:07:35.546 [2024-07-15 17:19:46.555609] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2710617 ] 00:07:35.546 [2024-07-15 17:19:46.647200] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:35.546 [2024-07-15 17:19:46.723584] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:35.546 17:19:46 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:07:35.546 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # 
IFS=: 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:35.547 17:19:46 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:36.927 00:07:36.927 real 0m1.339s 00:07:36.927 user 0m1.210s 
00:07:36.927 sys 0m0.131s 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:36.927 17:19:47 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:07:36.927 ************************************ 00:07:36.927 END TEST accel_copy_crc32c_C2 00:07:36.927 ************************************ 00:07:36.927 17:19:47 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:36.927 17:19:47 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:07:36.927 17:19:47 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:36.927 17:19:47 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:36.927 17:19:47 accel -- common/autotest_common.sh@10 -- # set +x 00:07:36.927 ************************************ 00:07:36.927 START TEST accel_dualcast 00:07:36.927 ************************************ 00:07:36.927 17:19:47 accel.accel_dualcast -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dualcast -y 00:07:36.927 17:19:47 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:07:36.927 17:19:47 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:07:36.927 17:19:47 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.927 17:19:47 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.927 17:19:47 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:07:36.927 17:19:47 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:07:36.927 17:19:47 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:07:36.927 17:19:47 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:36.927 17:19:47 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:36.927 17:19:47 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:36.927 17:19:47 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:36.927 17:19:47 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:36.927 17:19:47 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:07:36.927 17:19:47 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:07:36.927 [2024-07-15 17:19:47.966134] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:07:36.927 [2024-07-15 17:19:47.966199] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2710736 ] 00:07:36.927 [2024-07-15 17:19:48.054288] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:36.927 [2024-07-15 17:19:48.131615] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:36.927 17:19:48 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:36.927 17:19:48 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.927 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.927 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.927 17:19:48 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:36.927 17:19:48 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.927 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.927 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.927 17:19:48 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:07:36.927 17:19:48 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.927 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.927 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.927 17:19:48 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:36.927 17:19:48 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.928 17:19:48 accel.accel_dualcast -- 
accel/accel.sh@19 -- # read -r var val 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.928 17:19:48 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:38.308 17:19:49 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:38.308 17:19:49 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:38.308 17:19:49 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:38.308 17:19:49 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:38.308 17:19:49 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:38.308 17:19:49 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:38.308 17:19:49 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:38.308 17:19:49 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:38.309 17:19:49 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:38.309 17:19:49 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:38.309 17:19:49 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:38.309 17:19:49 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:38.309 17:19:49 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:38.309 17:19:49 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:38.309 17:19:49 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:38.309 17:19:49 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var 
val 00:07:38.309 17:19:49 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:38.309 17:19:49 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:38.309 17:19:49 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:38.309 17:19:49 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:38.309 17:19:49 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:38.309 17:19:49 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:38.309 17:19:49 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:38.309 17:19:49 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:38.309 17:19:49 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:38.309 17:19:49 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:07:38.309 17:19:49 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:38.309 00:07:38.309 real 0m1.344s 00:07:38.309 user 0m1.210s 00:07:38.309 sys 0m0.128s 00:07:38.309 17:19:49 accel.accel_dualcast -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:38.309 17:19:49 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:07:38.309 ************************************ 00:07:38.309 END TEST accel_dualcast 00:07:38.309 ************************************ 00:07:38.309 17:19:49 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:38.309 17:19:49 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:07:38.309 17:19:49 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:38.309 17:19:49 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:38.309 17:19:49 accel -- common/autotest_common.sh@10 -- # set +x 00:07:38.309 ************************************ 00:07:38.309 START TEST accel_compare 00:07:38.309 ************************************ 00:07:38.309 17:19:49 accel.accel_compare -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compare -y 00:07:38.309 17:19:49 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:07:38.309 17:19:49 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:07:38.309 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:38.309 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:38.309 17:19:49 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:07:38.309 17:19:49 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:07:38.309 17:19:49 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:07:38.309 17:19:49 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:38.309 17:19:49 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:38.309 17:19:49 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:38.309 17:19:49 accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:38.309 17:19:49 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:38.309 17:19:49 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:07:38.309 17:19:49 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:07:38.309 [2024-07-15 17:19:49.388903] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:07:38.309 [2024-07-15 17:19:49.389028] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2711040 ] 00:07:38.309 [2024-07-15 17:19:49.530265] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:38.569 [2024-07-15 17:19:49.606193] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 
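The long val= / case "$var" / IFS=: / read -r var val cycles that dominate this trace are accel.sh walking what appears to be accel_perf's "key: value" status output, capturing the engine and opcode into accel_module and accel_opc, which each test then asserts at the end ([[ -n software ]], [[ -n compare ]], [[ software == \s\o\f\t\w\a\r\e ]]). A hypothetical condensation of that loop, with the field-name patterns assumed rather than copied from accel.sh:

    # simplified sketch of the traced parse loop, not the literal accel.sh code
    while IFS=: read -r var val; do
        case "$var" in
            *module*) accel_module=${val//[[:space:]]/} ;;   # e.g. "software"
            *opcode*) accel_opc=${val//[[:space:]]/} ;;      # e.g. "compare"
        esac
    done < <(./build/examples/accel_perf -t 1 -w compare -y)
    [[ -n $accel_module ]] && [[ -n $accel_opc ]]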
00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:38.569 17:19:49 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:07:38.570 17:19:49 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:38.570 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:38.570 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:38.570 17:19:49 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:07:38.570 17:19:49 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:38.570 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:38.570 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:38.570 17:19:49 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:07:38.570 17:19:49 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:38.570 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:38.570 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:38.570 17:19:49 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:38.570 17:19:49 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:38.570 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:38.570 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:38.570 17:19:49 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:38.570 17:19:49 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:38.570 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:38.570 17:19:49 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@20 -- # val= 
00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:07:39.509 17:19:50 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:39.509 00:07:39.509 real 0m1.393s 00:07:39.509 user 0m0.007s 00:07:39.509 sys 0m0.001s 00:07:39.509 17:19:50 accel.accel_compare -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:39.509 17:19:50 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:07:39.509 ************************************ 00:07:39.509 END TEST accel_compare 00:07:39.509 ************************************ 00:07:39.509 17:19:50 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:39.509 17:19:50 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:07:39.509 17:19:50 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:39.509 17:19:50 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:39.509 17:19:50 accel -- common/autotest_common.sh@10 -- # set +x 00:07:39.770 ************************************ 00:07:39.770 START TEST accel_xor 00:07:39.770 ************************************ 00:07:39.770 17:19:50 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y 00:07:39.770 17:19:50 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:07:39.770 17:19:50 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:07:39.770 17:19:50 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.770 17:19:50 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.770 17:19:50 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:07:39.770 17:19:50 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:07:39.770 17:19:50 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:07:39.770 17:19:50 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:39.770 17:19:50 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:39.770 17:19:50 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:39.770 17:19:50 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:39.770 17:19:50 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:39.770 17:19:50 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:07:39.770 17:19:50 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:07:39.770 [2024-07-15 17:19:50.842864] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:07:39.770 [2024-07-15 17:19:50.842923] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2711358 ] 00:07:39.770 [2024-07-15 17:19:50.932081] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:39.770 [2024-07-15 17:19:51.006383] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.770 17:19:51 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:39.770 17:19:51 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.770 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.770 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.770 17:19:51 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:07:39.771 17:19:51 
accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.771 17:19:51 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@21 
-- # case "$var" in 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:41.153 00:07:41.153 real 0m1.330s 00:07:41.153 user 0m1.199s 00:07:41.153 sys 0m0.122s 00:07:41.153 17:19:52 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:41.153 17:19:52 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:07:41.153 ************************************ 00:07:41.153 END TEST accel_xor 00:07:41.153 ************************************ 00:07:41.153 17:19:52 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:41.153 17:19:52 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:07:41.153 17:19:52 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:41.153 17:19:52 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:41.153 17:19:52 accel -- common/autotest_common.sh@10 -- # set +x 00:07:41.153 ************************************ 00:07:41.153 START TEST accel_xor 00:07:41.153 ************************************ 00:07:41.153 17:19:52 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y -x 3 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:07:41.153 [2024-07-15 17:19:52.237889] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
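This second accel_xor pass differs from the one above only in the extra -x 3 argument handed to accel_perf; the first run traced val=2 for the same option, so -x looks like the number of xor source buffers, raised here from the default two to three. A sketch of the two invocations, reusing the exact flags from the trace and assuming the same build tree:

    # default xor (two source buffers, per the val=2 seen in the previous run)
    ./build/examples/accel_perf -t 1 -w xor -y
    # same workload rerun with three source buffers
    ./build/examples/accel_perf -t 1 -w xor -y -x 3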
00:07:41.153 [2024-07-15 17:19:52.237944] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2711675 ] 00:07:41.153 [2024-07-15 17:19:52.327840] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:41.153 [2024-07-15 17:19:52.401805] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:07:41.153 17:19:52 
accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.153 17:19:52 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:42.531 17:19:53 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:42.531 17:19:53 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:42.531 17:19:53 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:42.531 17:19:53 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:42.532 17:19:53 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:42.532 17:19:53 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:42.532 17:19:53 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:42.532 17:19:53 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:42.532 17:19:53 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:42.532 17:19:53 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:42.532 17:19:53 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:42.532 17:19:53 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:42.532 17:19:53 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:42.532 17:19:53 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:42.532 17:19:53 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:42.532 17:19:53 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:42.532 17:19:53 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:42.532 17:19:53 accel.accel_xor -- accel/accel.sh@21 
-- # case "$var" in 00:07:42.532 17:19:53 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:42.532 17:19:53 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:42.532 17:19:53 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:42.532 17:19:53 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:42.532 17:19:53 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:42.532 17:19:53 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:42.532 17:19:53 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:42.532 17:19:53 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:42.532 17:19:53 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:42.532 00:07:42.532 real 0m1.323s 00:07:42.532 user 0m1.196s 00:07:42.532 sys 0m0.123s 00:07:42.532 17:19:53 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:42.532 17:19:53 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:07:42.532 ************************************ 00:07:42.532 END TEST accel_xor 00:07:42.532 ************************************ 00:07:42.532 17:19:53 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:42.532 17:19:53 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:07:42.532 17:19:53 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:42.532 17:19:53 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:42.532 17:19:53 accel -- common/autotest_common.sh@10 -- # set +x 00:07:42.532 ************************************ 00:07:42.532 START TEST accel_dif_verify 00:07:42.532 ************************************ 00:07:42.532 17:19:53 accel.accel_dif_verify -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_verify 00:07:42.532 17:19:53 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:07:42.532 17:19:53 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:07:42.532 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.532 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.532 17:19:53 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:07:42.532 17:19:53 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:42.532 17:19:53 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:07:42.532 17:19:53 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:42.532 17:19:53 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:42.532 17:19:53 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:42.532 17:19:53 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:42.532 17:19:53 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:42.532 17:19:53 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:07:42.532 17:19:53 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:07:42.532 [2024-07-15 17:19:53.634281] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:07:42.532 [2024-07-15 17:19:53.634339] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2711841 ] 00:07:42.532 [2024-07-15 17:19:53.724794] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.532 [2024-07-15 17:19:53.799788] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:42.791 17:19:53 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.792 17:19:53 accel.accel_dif_verify -- 
accel/accel.sh@19 -- # read -r var val 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='8 bytes' 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.792 17:19:53 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:43.728 17:19:54 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:43.728 17:19:54 accel.accel_dif_verify -- 
accel/accel.sh@21 -- # case "$var" in 00:07:43.728 17:19:54 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:43.728 17:19:54 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:43.728 17:19:54 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:43.728 17:19:54 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:43.728 17:19:54 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:43.728 17:19:54 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:43.728 17:19:54 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:43.728 17:19:54 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:43.728 17:19:54 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:43.728 17:19:54 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:43.728 17:19:54 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:43.728 17:19:54 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:43.729 17:19:54 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:43.729 17:19:54 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:43.729 17:19:54 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:43.729 17:19:54 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:43.729 17:19:54 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:43.729 17:19:54 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:43.729 17:19:54 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:43.729 17:19:54 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:43.729 17:19:54 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:43.729 17:19:54 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:43.729 17:19:54 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:43.729 17:19:54 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:07:43.729 17:19:54 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:43.729 00:07:43.729 real 0m1.340s 00:07:43.729 user 0m0.008s 00:07:43.729 sys 0m0.001s 00:07:43.729 17:19:54 accel.accel_dif_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:43.729 17:19:54 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:07:43.729 ************************************ 00:07:43.729 END TEST accel_dif_verify 00:07:43.729 ************************************ 00:07:43.729 17:19:54 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:43.729 17:19:54 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:07:43.729 17:19:54 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:43.729 17:19:54 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:43.729 17:19:54 accel -- common/autotest_common.sh@10 -- # set +x 00:07:43.729 ************************************ 00:07:43.729 START TEST accel_dif_generate 00:07:43.729 ************************************ 00:07:43.729 17:19:55 accel.accel_dif_generate -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate 00:07:43.729 17:19:55 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:07:43.729 17:19:55 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:07:43.729 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:43.729 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 
-- # read -r var val 00:07:43.729 17:19:55 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:07:43.729 17:19:55 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:43.729 17:19:55 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:07:43.729 17:19:55 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:43.729 17:19:55 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:43.729 17:19:55 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:43.729 17:19:55 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:43.729 17:19:55 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:43.729 17:19:55 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:07:43.729 17:19:55 accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:07:43.988 [2024-07-15 17:19:55.042763] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:07:43.988 [2024-07-15 17:19:55.042827] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2712049 ] 00:07:43.988 [2024-07-15 17:19:55.130388] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:43.988 [2024-07-15 17:19:55.206306] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@23 -- 
# accel_opc=dif_generate 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:43.988 17:19:55 
accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:43.988 17:19:55 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:43.989 17:19:55 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:43.989 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:43.989 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:43.989 17:19:55 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:43.989 17:19:55 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:43.989 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:43.989 17:19:55 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:07:45.370 17:19:56 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:45.370 00:07:45.370 real 0m1.334s 00:07:45.370 user 0m1.193s 00:07:45.370 sys 0m0.130s 00:07:45.370 
17:19:56 accel.accel_dif_generate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:45.370 17:19:56 accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x 00:07:45.370 ************************************ 00:07:45.370 END TEST accel_dif_generate 00:07:45.370 ************************************ 00:07:45.370 17:19:56 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:45.370 17:19:56 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:07:45.370 17:19:56 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:45.370 17:19:56 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:45.370 17:19:56 accel -- common/autotest_common.sh@10 -- # set +x 00:07:45.370 ************************************ 00:07:45.370 START TEST accel_dif_generate_copy 00:07:45.370 ************************************ 00:07:45.370 17:19:56 accel.accel_dif_generate_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate_copy 00:07:45.370 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:07:45.370 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 00:07:45.371 [2024-07-15 17:19:56.445589] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
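Each sub-test in this run closes with a time-style summary (real/user/sys) followed by an END TEST banner, and those two patterns are convenient hooks for post-processing a saved copy of this console output. A purely illustrative sketch, with console.log standing in as a hypothetical file name for the captured log:

  # list the sub-tests that ran, in order
  grep -Eo '(START|END) TEST [a-z_]+' console.log
  # collect the per-sub-test wall-clock timings
  grep -Eo 'real [0-9]+m[0-9.]+s' console.log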
00:07:45.371 [2024-07-15 17:19:56.445652] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2712356 ] 00:07:45.371 [2024-07-15 17:19:56.534631] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.371 [2024-07-15 17:19:56.610556] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- 
accel/accel.sh@20 -- # val= 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:45.371 17:19:56 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.753 17:19:57 
accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:46.753 00:07:46.753 real 0m1.331s 00:07:46.753 user 0m1.201s 00:07:46.753 sys 0m0.121s 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:46.753 17:19:57 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:07:46.753 ************************************ 00:07:46.753 END TEST accel_dif_generate_copy 00:07:46.753 ************************************ 00:07:46.753 17:19:57 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:46.753 17:19:57 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:07:46.753 17:19:57 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:46.753 17:19:57 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:46.753 17:19:57 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:46.753 17:19:57 accel -- common/autotest_common.sh@10 -- # set +x 00:07:46.753 ************************************ 00:07:46.753 START TEST accel_comp 00:07:46.753 ************************************ 00:07:46.753 17:19:57 accel.accel_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:46.753 17:19:57 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:07:46.753 
17:19:57 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:07:46.753 17:19:57 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:46.753 17:19:57 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:46.753 17:19:57 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:46.753 17:19:57 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:46.753 17:19:57 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:07:46.753 17:19:57 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:46.753 17:19:57 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:46.753 17:19:57 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:46.753 17:19:57 accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:46.753 17:19:57 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:46.753 17:19:57 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:07:46.753 17:19:57 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 00:07:46.753 [2024-07-15 17:19:57.851792] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:07:46.753 [2024-07-15 17:19:57.851901] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2712673 ] 00:07:46.753 [2024-07-15 17:19:57.949453] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:46.753 [2024-07-15 17:19:58.023903] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.014 17:19:58 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:47.014 17:19:58 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.014 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.015 
17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:47.015 17:19:58 accel.accel_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.015 17:19:58 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:07:47.955 17:19:59 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:47.955 00:07:47.956 real 0m1.340s 00:07:47.956 user 0m1.189s 00:07:47.956 sys 0m0.146s 00:07:47.956 17:19:59 accel.accel_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:47.956 17:19:59 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:07:47.956 ************************************ 00:07:47.956 END TEST accel_comp 00:07:47.956 ************************************ 00:07:47.956 17:19:59 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:47.956 17:19:59 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:47.956 17:19:59 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:47.956 17:19:59 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:47.956 17:19:59 accel -- common/autotest_common.sh@10 -- # set +x 00:07:47.956 ************************************ 00:07:47.956 START TEST accel_decomp 
00:07:47.956 ************************************ 00:07:47.956 17:19:59 accel.accel_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:47.956 17:19:59 accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:07:47.956 17:19:59 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:07:47.956 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:47.956 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:47.956 17:19:59 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:47.956 17:19:59 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:47.956 17:19:59 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:07:47.956 17:19:59 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:47.956 17:19:59 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:47.956 17:19:59 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:47.956 17:19:59 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:47.956 17:19:59 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:47.956 17:19:59 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:07:47.956 17:19:59 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:07:48.216 [2024-07-15 17:19:59.257869] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:07:48.216 [2024-07-15 17:19:59.257934] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2712926 ] 00:07:48.216 [2024-07-15 17:19:59.345980] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:48.216 [2024-07-15 17:19:59.421958] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@20 -- # val= 
00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@20 -- # val=1 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:48.216 
17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:48.216 17:19:59 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:49.600 17:20:00 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:49.600 00:07:49.600 real 0m1.334s 00:07:49.600 user 0m1.197s 00:07:49.600 sys 0m0.128s 00:07:49.600 17:20:00 accel.accel_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:49.600 17:20:00 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:07:49.600 ************************************ 00:07:49.600 END TEST accel_decomp 00:07:49.600 ************************************ 
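This and the following decompress sub-tests (accel_decomp through accel_decomp_full_mthread) all wrap the same accel_perf example binary; only the trailing options change. A minimal standalone sketch of the first invocation follows — the flag set is copied verbatim from the command line recorded above, while the SPDK_ROOT variable and running without the JSON accel config that the harness pipes in on -c /dev/fd/62 are assumptions, not part of the original run.

#!/usr/bin/env bash
# Sketch only: re-run the software-path decompress case outside the accel.sh harness.
# Assumes the SPDK tree at the workspace path seen in the log, and that accel_perf
# tolerates omitting the piped-in JSON config (-c /dev/fd/62) used by build_accel_config.
SPDK_ROOT=/var/jenkins/workspace/crypto-phy-autotest/spdk

# 1-second decompress run over the bundled test/accel/bib input, as in accel_decomp
"$SPDK_ROOT/build/examples/accel_perf" -t 1 -w decompress -l "$SPDK_ROOT/test/accel/bib" -y

# The later sub-tests in this section only append options, per their run_test lines:
#   -o 0     accel_decomp_full, accel_decomp_full_mcore, accel_decomp_full_mthread
#   -m 0xf   accel_decomp_mcore and accel_decomp_full_mcore (log shows 4 reactor cores)
#   -T 2     accel_decomp_mthread and accel_decomp_full_mthread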
00:07:49.600 17:20:00 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:49.600 17:20:00 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:49.600 17:20:00 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:49.600 17:20:00 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:49.600 17:20:00 accel -- common/autotest_common.sh@10 -- # set +x 00:07:49.600 ************************************ 00:07:49.600 START TEST accel_decomp_full 00:07:49.600 ************************************ 00:07:49.600 17:20:00 accel.accel_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:49.600 17:20:00 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:07:49.600 17:20:00 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:07:49.600 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:49.600 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:49.600 17:20:00 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:49.600 17:20:00 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:49.600 17:20:00 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:07:49.600 17:20:00 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:49.600 17:20:00 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:49.600 17:20:00 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:49.600 17:20:00 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:49.600 17:20:00 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:49.600 17:20:00 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:07:49.600 17:20:00 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:07:49.600 [2024-07-15 17:20:00.661768] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:07:49.600 [2024-07-15 17:20:00.661831] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2713052 ] 00:07:49.600 [2024-07-15 17:20:00.750088] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:49.600 [2024-07-15 17:20:00.820727] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:49.600 17:20:00 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:49.600 17:20:00 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:49.600 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:49.600 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:49.600 17:20:00 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:49.600 17:20:00 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:49.600 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:49.600 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:49.600 17:20:00 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:49.600 17:20:00 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:49.601 17:20:00 
accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:49.601 17:20:00 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 
00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:50.982 17:20:01 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:50.982 00:07:50.982 real 0m1.339s 00:07:50.982 user 0m0.009s 00:07:50.982 sys 0m0.000s 00:07:50.982 17:20:01 accel.accel_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:50.982 17:20:01 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:07:50.982 ************************************ 00:07:50.982 END TEST accel_decomp_full 00:07:50.982 ************************************ 00:07:50.982 17:20:02 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:50.982 17:20:02 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:50.982 17:20:02 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:50.982 17:20:02 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:50.982 17:20:02 accel -- common/autotest_common.sh@10 -- # set +x 00:07:50.982 ************************************ 00:07:50.982 START TEST accel_decomp_mcore 00:07:50.982 ************************************ 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 
00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:07:50.982 [2024-07-15 17:20:02.070839] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:07:50.982 [2024-07-15 17:20:02.070895] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2713349 ] 00:07:50.982 [2024-07-15 17:20:02.158636] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:50.982 [2024-07-15 17:20:02.225079] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:50.982 [2024-07-15 17:20:02.225222] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:50.982 [2024-07-15 17:20:02.225340] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.982 [2024-07-15 17:20:02.225340] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- 
accel/accel.sh@21 -- # case "$var" in 00:07:50.982 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.983 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.983 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:50.983 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.983 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.983 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.983 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:07:50.983 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.983 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:50.983 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.983 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.983 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:50.983 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.983 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.983 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.983 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- 
accel/accel.sh@20 -- # val='1 seconds' 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:51.285 17:20:02 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 
00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:52.235 00:07:52.235 real 0m1.344s 00:07:52.235 user 0m4.498s 00:07:52.235 sys 0m0.137s 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:52.235 17:20:03 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:52.235 ************************************ 00:07:52.235 END TEST accel_decomp_mcore 00:07:52.235 ************************************ 00:07:52.235 17:20:03 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:52.235 17:20:03 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:52.235 17:20:03 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:07:52.235 17:20:03 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:52.235 17:20:03 accel -- common/autotest_common.sh@10 -- # set +x 00:07:52.235 ************************************ 00:07:52.235 START TEST accel_decomp_full_mcore 00:07:52.235 ************************************ 00:07:52.235 17:20:03 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:52.235 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:52.235 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:52.235 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.235 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.235 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:52.235 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:52.235 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:52.235 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:52.235 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:52.235 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:52.235 
17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:52.235 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:52.235 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:52.235 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:07:52.235 [2024-07-15 17:20:03.487899] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:07:52.235 [2024-07-15 17:20:03.487956] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2713671 ] 00:07:52.496 [2024-07-15 17:20:03.576144] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:52.497 [2024-07-15 17:20:03.642122] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:52.497 [2024-07-15 17:20:03.642268] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:52.497 [2024-07-15 17:20:03.642415] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.497 [2024-07-15 17:20:03.642415] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:07:52.497 17:20:03 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case 
"$var" in 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.497 17:20:03 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.879 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:53.879 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.879 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.879 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.879 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:53.879 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.879 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.879 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.879 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:53.879 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.879 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.879 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.879 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:53.879 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.879 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.879 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.879 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:53.879 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.879 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.880 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.880 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:53.880 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.880 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.880 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.880 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:53.880 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.880 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.880 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.880 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:53.880 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:07:53.880 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.880 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.880 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:53.880 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.880 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.880 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.880 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:53.880 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:53.880 17:20:04 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:53.880 00:07:53.880 real 0m1.391s 00:07:53.880 user 0m4.681s 00:07:53.880 sys 0m0.133s 00:07:53.880 17:20:04 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:53.880 17:20:04 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:53.880 ************************************ 00:07:53.880 END TEST accel_decomp_full_mcore 00:07:53.880 ************************************ 00:07:53.880 17:20:04 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:53.880 17:20:04 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:53.880 17:20:04 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:53.880 17:20:04 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:53.880 17:20:04 accel -- common/autotest_common.sh@10 -- # set +x 00:07:53.880 ************************************ 00:07:53.880 START TEST accel_decomp_mthread 00:07:53.880 ************************************ 00:07:53.880 17:20:04 accel.accel_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:53.880 17:20:04 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:53.880 17:20:04 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:53.880 17:20:04 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.880 17:20:04 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:53.880 17:20:04 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:53.880 17:20:04 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:53.880 17:20:04 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:53.880 17:20:04 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:53.880 17:20:04 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:53.880 17:20:04 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:53.880 17:20:04 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:53.880 17:20:04 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:53.880 17:20:04 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:53.880 
17:20:04 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:07:53.880 [2024-07-15 17:20:04.953911] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:07:53.880 [2024-07-15 17:20:04.953973] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2713998 ] 00:07:53.880 [2024-07-15 17:20:05.041592] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:53.880 [2024-07-15 17:20:05.104542] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var 
val 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # accel_module=software 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:53.880 17:20:05 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.880 17:20:05 
accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.262 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:55.262 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.262 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.262 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.262 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:55.262 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.262 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.262 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.262 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:55.262 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.262 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.262 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.262 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:55.262 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.263 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.263 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.263 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:55.263 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.263 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.263 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.263 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:55.263 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.263 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.263 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.263 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:55.263 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.263 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.263 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.263 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:55.263 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:55.263 17:20:06 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:55.263 00:07:55.263 real 0m1.323s 00:07:55.263 user 0m1.210s 00:07:55.263 sys 0m0.119s 00:07:55.263 17:20:06 accel.accel_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:55.263 17:20:06 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:55.263 ************************************ 00:07:55.263 END TEST accel_decomp_mthread 00:07:55.263 ************************************ 00:07:55.263 17:20:06 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:55.263 17:20:06 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:55.263 17:20:06 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 
00:07:55.263 17:20:06 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:55.263 17:20:06 accel -- common/autotest_common.sh@10 -- # set +x 00:07:55.263 ************************************ 00:07:55.263 START TEST accel_decomp_full_mthread 00:07:55.263 ************************************ 00:07:55.263 17:20:06 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:55.263 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:55.263 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:55.263 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.263 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.263 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:55.263 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:55.263 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:55.263 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:55.263 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:55.263 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:55.263 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:55.263 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:55.263 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:55.263 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:07:55.263 [2024-07-15 17:20:06.351201] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
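The accel_decomp_full_mthread run starting here is driven by the accel_perf example binary with the arguments captured in the trace. A hand-run reproduction is sketched below; the flag descriptions in the comments are inferred from the test name and the traced values and are assumptions rather than quotes from accel_perf's help text, and the generated -c /dev/fd/62 accel config (effectively empty for these software-module runs) is left out.

# Hand-run reproduction of the accel_perf command captured in the trace above;
# flag meanings in the comments are inferred (assumptions), not taken from accel_perf --help.
SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk

args=(
  -t 1                           # run the workload for 1 second ("1 seconds" in the trace)
  -w decompress                  # workload selected by this test
  -l "$SPDK_DIR/test/accel/bib"  # input file shared by the accel compress/decompress tests
  -y                             # verify the output
  -o 0                           # appears to make the run use the whole input file instead of the 4096-byte default (assumption)
  -T 2                           # two worker threads, matching the "mthread" in the test name (assumption)
)
"$SPDK_DIR/build/examples/accel_perf" "${args[@]}"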
00:07:55.263 [2024-07-15 17:20:06.351256] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2714165 ] 00:07:55.263 [2024-07-15 17:20:06.442333] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.263 [2024-07-15 17:20:06.516809] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.523 
17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.523 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.524 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.524 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:55.524 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.524 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.524 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.524 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:55.524 17:20:06 
accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.524 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.524 17:20:06 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:56.464 00:07:56.464 real 0m1.363s 00:07:56.464 user 0m1.235s 00:07:56.464 sys 0m0.133s 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:56.464 17:20:07 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:56.464 ************************************ 00:07:56.464 END TEST accel_decomp_full_mthread 00:07:56.464 ************************************ 
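Both software-module decompress runs above finish with the same three checks before the timing summary: a backend module was recorded, the opcode under test was recorded, and the module is the expected software implementation. Condensed, the pattern from accel.sh looks like the sketch below; accel_opc and accel_module are the locals the script declares and then fills while parsing accel_perf's output in the IFS=: read loop shown in the trace.

# Post-run assertions mirrored from the trace above.
[[ -n "$accel_module" ]]             # some backend module was reported
[[ -n "$accel_opc" ]]                # the opcode under test (decompress) was reported
[[ "$accel_module" == "software" ]]  # the software module handled it, as expected for these runs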
00:07:56.464 17:20:07 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:56.464 17:20:07 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:07:56.464 17:20:07 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:07:56.464 17:20:07 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:07:56.464 17:20:07 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:56.464 17:20:07 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=2714367 00:07:56.464 17:20:07 accel -- accel/accel.sh@63 -- # waitforlisten 2714367 00:07:56.464 17:20:07 accel -- common/autotest_common.sh@829 -- # '[' -z 2714367 ']' 00:07:56.464 17:20:07 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:56.464 17:20:07 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:56.464 17:20:07 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:56.464 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:56.464 17:20:07 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:07:56.464 17:20:07 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:56.464 17:20:07 accel -- accel/accel.sh@61 -- # build_accel_config 00:07:56.464 17:20:07 accel -- common/autotest_common.sh@10 -- # set +x 00:07:56.464 17:20:07 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:56.464 17:20:07 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:56.464 17:20:07 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:56.464 17:20:07 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:56.464 17:20:07 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:56.464 17:20:07 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:56.464 17:20:07 accel -- accel/accel.sh@40 -- # local IFS=, 00:07:56.464 17:20:07 accel -- accel/accel.sh@41 -- # jq -r . 00:07:56.724 [2024-07-15 17:20:07.779887] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
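From this point on COMPRESSDEV=1, so build_accel_config appends the compressdev_scan_accel_module entry and the spdk_tgt launched above receives a non-empty accel config on /dev/fd/63. The method and params below are quoted from the trace; the surrounding subsystems/config wrapper is reconstructed from the jq filter that check_save_config applies later and should be read as an approximation, not the exact generated file.

# Approximate shape of the JSON handed to spdk_tgt via -c /dev/fd/63; the
# compressdev_scan_accel_module entry is quoted from the trace, the wrapper
# structure is inferred from check_save_config's jq filter (assumption).
SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
"$SPDK_DIR/build/bin/spdk_tgt" -c <(cat <<'EOF'
{
  "subsystems": [
    {
      "subsystem": "accel",
      "config": [
        { "method": "compressdev_scan_accel_module", "params": { "pmd": 0 } }
      ]
    }
  ]
}
EOF
)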
00:07:56.724 [2024-07-15 17:20:07.779943] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2714367 ] 00:07:56.724 [2024-07-15 17:20:07.866756] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:56.724 [2024-07-15 17:20:07.939862] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:57.294 [2024-07-15 17:20:08.349409] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:57.555 17:20:08 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:57.555 17:20:08 accel -- common/autotest_common.sh@862 -- # return 0 00:07:57.555 17:20:08 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:07:57.555 17:20:08 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:07:57.555 17:20:08 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:07:57.555 17:20:08 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:07:57.555 17:20:08 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:07:57.555 17:20:08 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:07:57.555 17:20:08 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:57.555 17:20:08 accel -- common/autotest_common.sh@10 -- # set +x 00:07:57.555 17:20:08 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:07:57.555 17:20:08 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:07:57.555 17:20:08 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:57.555 "method": "compressdev_scan_accel_module", 00:07:57.555 17:20:08 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:07:57.555 17:20:08 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:07:57.555 17:20:08 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:07:57.555 17:20:08 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:57.555 17:20:08 accel -- common/autotest_common.sh@10 -- # set +x 00:07:57.555 17:20:08 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:57.555 17:20:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # IFS== 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:57.555 17:20:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:57.555 17:20:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # IFS== 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:57.555 17:20:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:57.555 17:20:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # IFS== 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:57.555 17:20:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:57.555 17:20:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # IFS== 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:57.555 17:20:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:57.555 17:20:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # IFS== 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:57.555 17:20:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:57.555 17:20:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # IFS== 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:57.555 17:20:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:57.555 17:20:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # IFS== 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:57.555 17:20:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:07:57.555 17:20:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # IFS== 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:57.555 17:20:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:07:57.555 17:20:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # IFS== 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:57.555 17:20:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:57.555 17:20:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # IFS== 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:57.555 17:20:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:57.555 17:20:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # IFS== 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # read -r opc module 
00:07:57.555 17:20:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:57.555 17:20:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # IFS== 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:57.555 17:20:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:57.555 17:20:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # IFS== 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:57.555 17:20:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:57.555 17:20:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # IFS== 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:57.555 17:20:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:57.555 17:20:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # IFS== 00:07:57.555 17:20:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:57.555 17:20:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:57.555 17:20:08 accel -- accel/accel.sh@75 -- # killprocess 2714367 00:07:57.555 17:20:08 accel -- common/autotest_common.sh@948 -- # '[' -z 2714367 ']' 00:07:57.555 17:20:08 accel -- common/autotest_common.sh@952 -- # kill -0 2714367 00:07:57.555 17:20:08 accel -- common/autotest_common.sh@953 -- # uname 00:07:57.555 17:20:08 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:57.555 17:20:08 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2714367 00:07:57.814 17:20:08 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:57.814 17:20:08 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:57.814 17:20:08 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2714367' 00:07:57.814 killing process with pid 2714367 00:07:57.814 17:20:08 accel -- common/autotest_common.sh@967 -- # kill 2714367 00:07:57.814 17:20:08 accel -- common/autotest_common.sh@972 -- # wait 2714367 00:07:57.814 17:20:09 accel -- accel/accel.sh@76 -- # trap - ERR 00:07:57.814 17:20:09 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:57.814 17:20:09 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:57.814 17:20:09 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:57.814 17:20:09 accel -- common/autotest_common.sh@10 -- # set +x 00:07:58.074 ************************************ 00:07:58.074 START TEST accel_cdev_comp 00:07:58.074 ************************************ 00:07:58.074 17:20:09 accel.accel_cdev_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:58.074 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:07:58.074 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:07:58.074 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.074 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.074 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:58.074 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:58.074 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:07:58.074 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:58.074 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:58.074 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:58.074 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:58.074 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:58.074 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:58.074 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:07:58.074 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:07:58.074 [2024-07-15 17:20:09.178044] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:07:58.074 [2024-07-15 17:20:09.178105] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2714671 ] 00:07:58.074 [2024-07-15 17:20:09.268114] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:58.074 [2024-07-15 17:20:09.345083] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.643 [2024-07-15 17:20:09.755784] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:58.643 [2024-07-15 17:20:09.757530] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x140a640 PMD being used: compress_qat 00:07:58.643 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:58.643 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.644 [2024-07-15 17:20:09.760576] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x160f3f0 PMD being used: compress_qat 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:58.644 
17:20:09 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:58.644 17:20:09 accel.accel_cdev_comp 
-- accel/accel.sh@21 -- # case "$var" in 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.644 17:20:09 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:07:59.582 17:20:10 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:59.582 00:07:59.582 real 0m1.719s 00:07:59.582 user 0m1.422s 00:07:59.582 sys 0m0.297s 00:07:59.582 17:20:10 accel.accel_cdev_comp 
-- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:59.582 17:20:10 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:07:59.582 ************************************ 00:07:59.582 END TEST accel_cdev_comp 00:07:59.582 ************************************ 00:07:59.842 17:20:10 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:59.843 17:20:10 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:59.843 17:20:10 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:59.843 17:20:10 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:59.843 17:20:10 accel -- common/autotest_common.sh@10 -- # set +x 00:07:59.843 ************************************ 00:07:59.843 START TEST accel_cdev_decomp 00:07:59.843 ************************************ 00:07:59.843 17:20:10 accel.accel_cdev_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:59.843 17:20:10 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:07:59.843 17:20:10 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:07:59.843 17:20:10 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:59.843 17:20:10 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:59.843 17:20:10 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:59.843 17:20:10 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:59.843 17:20:10 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:07:59.843 17:20:10 accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:59.843 17:20:10 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:59.843 17:20:10 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:59.843 17:20:10 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:59.843 17:20:10 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:59.843 17:20:10 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:59.843 17:20:10 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:07:59.843 17:20:10 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:07:59.843 [2024-07-15 17:20:10.974834] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
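Before the accel_cdev_* tests run, get_expected_opcs records which module owns each opcode so the later per-test checks know what to expect: with the compressdev module loaded, compress and decompress move to dpdk_compressdev while the remaining opcodes stay on software. A rough equivalent of that query, plus the adjusted assertion, is sketched below; rpc_cmd in the trace wraps SPDK's rpc.py, and the direct scripts/rpc.py path used here is an assumed location.

# Ask the running target which module is assigned to each opcode, as the trace does.
# The scripts/rpc.py path is the usual SPDK location (assumption).
SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
"$SPDK_DIR/scripts/rpc.py" accel_get_opc_assignments \
  | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'

# With compressdev loaded, the per-test check from accel.sh changes accordingly:
[[ "$accel_module" == "dpdk_compressdev" ]]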
00:07:59.843 [2024-07-15 17:20:10.974894] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2714994 ] 00:07:59.843 [2024-07-15 17:20:11.066191] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:00.103 [2024-07-15 17:20:11.141584] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.363 [2024-07-15 17:20:11.557633] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:00.363 [2024-07-15 17:20:11.559380] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2167640 PMD being used: compress_qat 00:08:00.363 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:00.363 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.363 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.363 [2024-07-15 17:20:11.562570] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x236c3f0 PMD being used: compress_qat 00:08:00.363 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.363 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:00.363 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.363 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.363 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.363 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:00.363 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.363 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.363 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.363 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:08:00.363 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.364 17:20:11 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 
00:08:00.364 17:20:11 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:01.774 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:01.775 00:08:01.775 real 0m1.725s 00:08:01.775 user 0m1.421s 00:08:01.775 sys 0m0.301s 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:01.775 17:20:12 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:08:01.775 ************************************ 00:08:01.775 END TEST accel_cdev_decomp 00:08:01.775 ************************************ 00:08:01.775 17:20:12 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:01.775 17:20:12 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:01.775 17:20:12 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:01.775 17:20:12 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:01.775 17:20:12 accel -- common/autotest_common.sh@10 -- # set +x 00:08:01.775 ************************************ 00:08:01.775 START TEST accel_cdev_decomp_full 00:08:01.775 ************************************ 00:08:01.775 17:20:12 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1123 -- # 
accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:01.775 17:20:12 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:08:01.775 17:20:12 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:08:01.775 17:20:12 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:01.775 17:20:12 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:01.775 17:20:12 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:01.775 17:20:12 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:01.775 17:20:12 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:08:01.775 17:20:12 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:01.775 17:20:12 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:01.775 17:20:12 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:01.775 17:20:12 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:01.775 17:20:12 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:01.775 17:20:12 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:01.775 17:20:12 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:08:01.775 17:20:12 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:08:01.775 [2024-07-15 17:20:12.780414] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
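accel_cdev_decomp_full, starting here, differs from the accel_cdev_decomp run above only by -o 0, and the trace reflects that in the recorded buffer size: '4096 bytes' for the plain variant versus the full '111250 bytes' input here. Side by side, shortened with hypothetical shell variables and without the generated -c /dev/fd/62 compressdev config (so run by hand like this they would fall back to the software module):

# Hypothetical shorthand variables for the paths used throughout the trace.
BIB=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
PERF=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf

# accel_cdev_decomp: default sizing -- the trace records a '4096 bytes' buffer
"$PERF" -t 1 -w decompress -l "$BIB" -y

# accel_cdev_decomp_full: -o 0 -- the trace records the full '111250 bytes' input
"$PERF" -t 1 -w decompress -l "$BIB" -y -o 0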
00:08:01.775 [2024-07-15 17:20:12.780533] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2715320 ] 00:08:01.775 [2024-07-15 17:20:12.880282] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:01.775 [2024-07-15 17:20:12.956487] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:02.345 [2024-07-15 17:20:13.359990] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:02.345 [2024-07-15 17:20:13.361737] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1405640 PMD being used: compress_qat 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:02.345 [2024-07-15 17:20:13.364062] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1408970 PMD being used: compress_qat 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@20 
-- # val='111250 bytes' 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 
00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:02.345 17:20:13 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:03.282 00:08:03.282 real 0m1.724s 00:08:03.282 user 0m1.420s 00:08:03.282 sys 0m0.307s 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:03.282 17:20:14 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:08:03.282 ************************************ 00:08:03.282 END TEST accel_cdev_decomp_full 00:08:03.282 ************************************ 00:08:03.282 17:20:14 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:03.282 17:20:14 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:03.282 17:20:14 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:03.282 17:20:14 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:03.282 17:20:14 accel -- common/autotest_common.sh@10 -- # set +x 00:08:03.282 ************************************ 00:08:03.282 START TEST accel_cdev_decomp_mcore 00:08:03.282 ************************************ 00:08:03.282 17:20:14 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:03.282 17:20:14 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:03.282 17:20:14 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:03.282 17:20:14 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:03.282 17:20:14 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:03.282 17:20:14 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:03.282 17:20:14 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:03.282 17:20:14 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:03.282 17:20:14 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:03.282 17:20:14 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:03.282 17:20:14 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:03.282 17:20:14 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:03.282 17:20:14 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:03.282 17:20:14 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:03.282 17:20:14 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:03.282 17:20:14 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:03.282 [2024-07-15 17:20:14.575843] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:08:03.282 [2024-07-15 17:20:14.575944] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2715633 ] 00:08:03.541 [2024-07-15 17:20:14.677082] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:03.541 [2024-07-15 17:20:14.755740] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:03.541 [2024-07-15 17:20:14.759744] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:03.541 [2024-07-15 17:20:14.763743] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:03.541 [2024-07-15 17:20:14.763790] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:04.108 [2024-07-15 17:20:15.164599] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:04.108 [2024-07-15 17:20:15.166342] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x18efce0 PMD being used: compress_qat 00:08:04.108 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:04.108 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:04.108 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:04.108 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:04.108 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:04.108 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:04.108 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:04.108 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:04.108 [2024-07-15 17:20:15.170548] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fcc2019b8b0 PMD being used: compress_qat 00:08:04.108 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:04.108 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:04.108 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:04.108 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:04.108 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:04.109 [2024-07-15 17:20:15.172222] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1a15760 PMD being used: compress_qat 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:04.109 17:20:15 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:04.109 [2024-07-15 17:20:15.177550] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fcc1819b8b0 PMD being used: compress_qat 00:08:04.109 [2024-07-15 17:20:15.177735] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fcc1019b8b0 PMD being used: compress_qat 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:04.109 17:20:15 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:04.109 17:20:15 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:05.048 17:20:16 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:05.048 00:08:05.048 real 0m1.748s 00:08:05.048 user 0m5.808s 00:08:05.048 sys 0m0.316s 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:05.048 17:20:16 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:05.048 ************************************ 00:08:05.048 END TEST accel_cdev_decomp_mcore 00:08:05.048 ************************************ 00:08:05.048 17:20:16 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:05.048 17:20:16 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:05.048 17:20:16 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:05.048 17:20:16 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:05.048 17:20:16 accel -- common/autotest_common.sh@10 -- # set +x 00:08:05.309 ************************************ 00:08:05.309 START TEST accel_cdev_decomp_full_mcore 00:08:05.309 ************************************ 00:08:05.309 17:20:16 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:05.309 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:05.309 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:05.309 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.309 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.309 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:05.309 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:05.309 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # 
build_accel_config 00:08:05.309 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:05.309 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:05.309 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:05.309 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:05.309 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:05.309 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:05.309 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:05.309 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:05.309 [2024-07-15 17:20:16.400361] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:08:05.309 [2024-07-15 17:20:16.400418] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2715960 ] 00:08:05.309 [2024-07-15 17:20:16.488139] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:05.309 [2024-07-15 17:20:16.557745] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:05.309 [2024-07-15 17:20:16.557890] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:05.309 [2024-07-15 17:20:16.558090] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:05.309 [2024-07-15 17:20:16.558091] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:05.879 [2024-07-15 17:20:16.960633] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:05.879 [2024-07-15 17:20:16.962392] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x202bce0 PMD being used: compress_qat 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.879 [2024-07-15 17:20:16.965884] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fb74419b8b0 PMD being used: compress_qat 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.879 17:20:16 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:05.879 [2024-07-15 17:20:16.967576] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x20311e0 PMD being used: compress_qat 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.879 [2024-07-15 17:20:16.972790] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fb73c19b8b0 PMD being used: compress_qat 00:08:05.879 [2024-07-15 17:20:16.972977] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fb73419b8b0 PMD being used: compress_qat 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.879 17:20:16 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.879 17:20:16 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:06.852 00:08:06.852 real 0m1.726s 00:08:06.852 user 0m5.831s 00:08:06.852 sys 0m0.307s 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:06.852 17:20:18 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:06.852 ************************************ 00:08:06.852 END TEST accel_cdev_decomp_full_mcore 00:08:06.852 ************************************ 00:08:07.119 17:20:18 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:07.119 17:20:18 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:07.119 17:20:18 accel -- 
common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:07.119 17:20:18 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:07.119 17:20:18 accel -- common/autotest_common.sh@10 -- # set +x 00:08:07.119 ************************************ 00:08:07.119 START TEST accel_cdev_decomp_mthread 00:08:07.119 ************************************ 00:08:07.119 17:20:18 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:07.119 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:07.119 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:07.119 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:07.119 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:07.119 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:07.119 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:07.119 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:07.119 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:07.119 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:07.119 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:07.119 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:07.119 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:07.119 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:07.119 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:07.119 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:07.119 [2024-07-15 17:20:18.198383] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:08:07.119 [2024-07-15 17:20:18.198446] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2716289 ] 00:08:07.119 [2024-07-15 17:20:18.285922] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:07.119 [2024-07-15 17:20:18.361942] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:07.687 [2024-07-15 17:20:18.769656] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:07.687 [2024-07-15 17:20:18.771435] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1b47640 PMD being used: compress_qat 00:08:07.687 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:07.687 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:07.687 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:07.687 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:07.687 [2024-07-15 17:20:18.774771] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1b4c840 PMD being used: compress_qat 00:08:07.687 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:07.687 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:07.687 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:07.687 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:07.687 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:07.687 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:07.687 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:07.687 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:07.687 [2024-07-15 17:20:18.776502] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1c6f320 PMD being used: compress_qat 00:08:07.687 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:07.687 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:07.687 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:07.687 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:07.687 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:07.687 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:07.688 17:20:18 
accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:07.688 
17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:07.688 17:20:18 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 
00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:08.627 00:08:08.627 real 0m1.713s 00:08:08.627 user 0m1.416s 00:08:08.627 sys 0m0.298s 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:08.627 17:20:19 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:08.627 ************************************ 00:08:08.627 END TEST accel_cdev_decomp_mthread 00:08:08.627 ************************************ 00:08:08.627 17:20:19 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:08.627 17:20:19 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:08.627 17:20:19 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:08.627 17:20:19 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:08.627 17:20:19 accel -- common/autotest_common.sh@10 -- # set +x 00:08:08.887 ************************************ 00:08:08.887 START TEST accel_cdev_decomp_full_mthread 00:08:08.887 ************************************ 00:08:08.887 17:20:19 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:08.887 17:20:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:08.887 17:20:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:08.887 17:20:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.887 17:20:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.887 17:20:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:08.887 17:20:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:08.887 17:20:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:08.887 17:20:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:08.887 17:20:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:08.887 17:20:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:08.887 17:20:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:08.887 17:20:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:08.887 17:20:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:08.887 17:20:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:08.887 17:20:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:08.887 [2024-07-15 17:20:19.985319] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:08:08.887 [2024-07-15 17:20:19.985388] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2716618 ] 00:08:08.887 [2024-07-15 17:20:20.074228] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:08.887 [2024-07-15 17:20:20.150759] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:09.455 [2024-07-15 17:20:20.550066] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:09.456 [2024-07-15 17:20:20.551809] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xb72640 PMD being used: compress_qat 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.456 [2024-07-15 17:20:20.554358] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xb726e0 PMD being used: compress_qat 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:09.456 [2024-07-15 17:20:20.556242] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xd772d0 PMD being used: compress_qat 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 
00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.456 17:20:20 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@20 -- # val= 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:10.396 00:08:10.396 real 0m1.708s 00:08:10.396 user 0m1.409s 00:08:10.396 sys 0m0.300s 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:10.396 17:20:21 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:10.396 ************************************ 00:08:10.396 END TEST accel_cdev_decomp_full_mthread 00:08:10.396 ************************************ 00:08:10.655 17:20:21 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:10.655 17:20:21 accel -- accel/accel.sh@134 -- # unset COMPRESSDEV 00:08:10.655 17:20:21 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:10.655 17:20:21 accel -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:08:10.655 17:20:21 accel -- accel/accel.sh@137 -- # build_accel_config 00:08:10.656 17:20:21 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:10.656 17:20:21 accel -- common/autotest_common.sh@10 -- # set +x 00:08:10.656 17:20:21 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:10.656 17:20:21 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:10.656 17:20:21 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:10.656 17:20:21 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:10.656 17:20:21 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:10.656 17:20:21 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:10.656 17:20:21 accel -- accel/accel.sh@41 -- # jq -r . 00:08:10.656 ************************************ 00:08:10.656 START TEST accel_dif_functional_tests 00:08:10.656 ************************************ 00:08:10.656 17:20:21 accel.accel_dif_functional_tests -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:10.656 [2024-07-15 17:20:21.798437] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
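The repetitive IFS=: / read -r var val / case "$var" in lines traced above are accel.sh walking a colon-separated key:value stream to collect the settings for this decompress run (accel_opc=decompress, accel_module=dpdk_compressdev, the bib input file, plus the sizes and runtime visible in the val= assignments). A minimal sketch of that parsing idiom, with hypothetical keys and an illustrative input stream, not the exact stream accel.sh consumes:

# Sketch only; keys and input are illustrative.
while IFS=: read -r var val; do
    case "$var" in
        accel_opc) opc=$val ;;          # e.g. decompress
        accel_module) module=$val ;;    # e.g. dpdk_compressdev
        *) ;;                           # remaining settings (input file, sizes, runtime) handled the same way
    esac
done <<< $'accel_opc:decompress\naccel_module:dpdk_compressdev'
echo "$opc via $module"                 # prints: decompress via dpdk_compressdev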
00:08:10.656 [2024-07-15 17:20:21.798485] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2716938 ] 00:08:10.656 [2024-07-15 17:20:21.887383] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:10.915 [2024-07-15 17:20:21.961900] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:10.915 [2024-07-15 17:20:21.962136] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:10.915 [2024-07-15 17:20:21.962136] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:10.915 00:08:10.915 00:08:10.915 CUnit - A unit testing framework for C - Version 2.1-3 00:08:10.915 http://cunit.sourceforge.net/ 00:08:10.915 00:08:10.915 00:08:10.915 Suite: accel_dif 00:08:10.915 Test: verify: DIF generated, GUARD check ...passed 00:08:10.915 Test: verify: DIF generated, APPTAG check ...passed 00:08:10.915 Test: verify: DIF generated, REFTAG check ...passed 00:08:10.915 Test: verify: DIF not generated, GUARD check ...[2024-07-15 17:20:22.029044] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:10.915 passed 00:08:10.915 Test: verify: DIF not generated, APPTAG check ...[2024-07-15 17:20:22.029093] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:10.915 passed 00:08:10.915 Test: verify: DIF not generated, REFTAG check ...[2024-07-15 17:20:22.029114] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:10.915 passed 00:08:10.915 Test: verify: APPTAG correct, APPTAG check ...passed 00:08:10.915 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-15 17:20:22.029164] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:08:10.915 passed 00:08:10.915 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:08:10.915 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:08:10.915 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:08:10.915 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-15 17:20:22.029277] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:08:10.915 passed 00:08:10.915 Test: verify copy: DIF generated, GUARD check ...passed 00:08:10.915 Test: verify copy: DIF generated, APPTAG check ...passed 00:08:10.915 Test: verify copy: DIF generated, REFTAG check ...passed 00:08:10.915 Test: verify copy: DIF not generated, GUARD check ...[2024-07-15 17:20:22.029402] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:10.915 passed 00:08:10.915 Test: verify copy: DIF not generated, APPTAG check ...[2024-07-15 17:20:22.029427] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:10.915 passed 00:08:10.915 Test: verify copy: DIF not generated, REFTAG check ...[2024-07-15 17:20:22.029449] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:10.915 passed 00:08:10.915 Test: generate copy: DIF generated, GUARD check ...passed 00:08:10.915 Test: generate copy: DIF generated, APTTAG check ...passed 00:08:10.915 Test: generate copy: DIF generated, REFTAG check ...passed 00:08:10.915 Test: generate copy: DIF generated, no GUARD check flag set ...passed 
00:08:10.915 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:08:10.915 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:08:10.915 Test: generate copy: iovecs-len validate ...[2024-07-15 17:20:22.029642] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 00:08:10.915 passed 00:08:10.915 Test: generate copy: buffer alignment validate ...passed 00:08:10.915 00:08:10.915 Run Summary: Type Total Ran Passed Failed Inactive 00:08:10.915 suites 1 1 n/a 0 0 00:08:10.915 tests 26 26 26 0 0 00:08:10.915 asserts 115 115 115 0 n/a 00:08:10.915 00:08:10.916 Elapsed time = 0.000 seconds 00:08:10.916 00:08:10.916 real 0m0.407s 00:08:10.916 user 0m0.526s 00:08:10.916 sys 0m0.153s 00:08:10.916 17:20:22 accel.accel_dif_functional_tests -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:10.916 17:20:22 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:08:10.916 ************************************ 00:08:10.916 END TEST accel_dif_functional_tests 00:08:10.916 ************************************ 00:08:10.916 17:20:22 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:10.916 00:08:10.916 real 0m45.208s 00:08:10.916 user 0m54.483s 00:08:10.916 sys 0m7.752s 00:08:10.916 17:20:22 accel -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:10.916 17:20:22 accel -- common/autotest_common.sh@10 -- # set +x 00:08:10.916 ************************************ 00:08:10.916 END TEST accel 00:08:10.916 ************************************ 00:08:11.176 17:20:22 -- common/autotest_common.sh@1142 -- # return 0 00:08:11.176 17:20:22 -- spdk/autotest.sh@184 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:11.176 17:20:22 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:11.176 17:20:22 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:11.176 17:20:22 -- common/autotest_common.sh@10 -- # set +x 00:08:11.176 ************************************ 00:08:11.176 START TEST accel_rpc 00:08:11.176 ************************************ 00:08:11.176 17:20:22 accel_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:11.176 * Looking for test storage... 00:08:11.176 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:08:11.176 17:20:22 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:11.176 17:20:22 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=2717023 00:08:11.176 17:20:22 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 2717023 00:08:11.176 17:20:22 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:08:11.176 17:20:22 accel_rpc -- common/autotest_common.sh@829 -- # '[' -z 2717023 ']' 00:08:11.176 17:20:22 accel_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:11.176 17:20:22 accel_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:11.176 17:20:22 accel_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:11.176 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
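The accel_dif_functional_tests suite above is a CUnit run over the DIF (Data Integrity Field) verify and generate-copy paths; the *ERROR* lines from dif.c are expected negative-path output, where a test deliberately corrupts the Guard (CRC), App Tag or Ref Tag and then confirms the mismatch is caught (e.g. Guard Expected=5a5a vs Actual=7867 at LBA=10). The binary can be re-run on its own; a sketch, assuming the default software accel configuration is enough (the harness additionally pipes its accel JSON config in as -c /dev/fd/62):

cd /var/jenkins/workspace/crypto-phy-autotest/spdk
./test/accel/dif/dif     # add -c <accel-config.json> to mirror the harness's /dev/fd/62 wiring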
00:08:11.176 17:20:22 accel_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:11.176 17:20:22 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:11.176 [2024-07-15 17:20:22.417126] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:08:11.176 [2024-07-15 17:20:22.417191] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2717023 ] 00:08:11.436 [2024-07-15 17:20:22.509065] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:11.436 [2024-07-15 17:20:22.578414] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:12.006 17:20:23 accel_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:12.006 17:20:23 accel_rpc -- common/autotest_common.sh@862 -- # return 0 00:08:12.006 17:20:23 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:08:12.006 17:20:23 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:08:12.006 17:20:23 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:08:12.006 17:20:23 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:08:12.006 17:20:23 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:08:12.006 17:20:23 accel_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:12.006 17:20:23 accel_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:12.006 17:20:23 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:12.006 ************************************ 00:08:12.006 START TEST accel_assign_opcode 00:08:12.006 ************************************ 00:08:12.006 17:20:23 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1123 -- # accel_assign_opcode_test_suite 00:08:12.006 17:20:23 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:08:12.006 17:20:23 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:12.006 17:20:23 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:12.006 [2024-07-15 17:20:23.288474] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:08:12.006 17:20:23 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:12.006 17:20:23 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:08:12.006 17:20:23 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:12.006 17:20:23 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:12.006 [2024-07-15 17:20:23.300501] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:08:12.265 17:20:23 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:12.265 17:20:23 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:08:12.265 17:20:23 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:12.265 17:20:23 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:12.265 17:20:23 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:12.265 17:20:23 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd 
accel_get_opc_assignments 00:08:12.265 17:20:23 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:08:12.265 17:20:23 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:12.265 17:20:23 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:12.265 17:20:23 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:08:12.265 17:20:23 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:12.265 software 00:08:12.265 00:08:12.265 real 0m0.214s 00:08:12.265 user 0m0.047s 00:08:12.265 sys 0m0.012s 00:08:12.265 17:20:23 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:12.265 17:20:23 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:12.265 ************************************ 00:08:12.265 END TEST accel_assign_opcode 00:08:12.265 ************************************ 00:08:12.265 17:20:23 accel_rpc -- common/autotest_common.sh@1142 -- # return 0 00:08:12.265 17:20:23 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 2717023 00:08:12.265 17:20:23 accel_rpc -- common/autotest_common.sh@948 -- # '[' -z 2717023 ']' 00:08:12.265 17:20:23 accel_rpc -- common/autotest_common.sh@952 -- # kill -0 2717023 00:08:12.265 17:20:23 accel_rpc -- common/autotest_common.sh@953 -- # uname 00:08:12.265 17:20:23 accel_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:12.265 17:20:23 accel_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2717023 00:08:12.524 17:20:23 accel_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:12.524 17:20:23 accel_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:12.524 17:20:23 accel_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2717023' 00:08:12.524 killing process with pid 2717023 00:08:12.524 17:20:23 accel_rpc -- common/autotest_common.sh@967 -- # kill 2717023 00:08:12.524 17:20:23 accel_rpc -- common/autotest_common.sh@972 -- # wait 2717023 00:08:12.524 00:08:12.524 real 0m1.533s 00:08:12.524 user 0m1.648s 00:08:12.524 sys 0m0.436s 00:08:12.524 17:20:23 accel_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:12.524 17:20:23 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:12.524 ************************************ 00:08:12.524 END TEST accel_rpc 00:08:12.524 ************************************ 00:08:12.784 17:20:23 -- common/autotest_common.sh@1142 -- # return 0 00:08:12.784 17:20:23 -- spdk/autotest.sh@185 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:12.784 17:20:23 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:12.784 17:20:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:12.784 17:20:23 -- common/autotest_common.sh@10 -- # set +x 00:08:12.784 ************************************ 00:08:12.784 START TEST app_cmdline 00:08:12.784 ************************************ 00:08:12.784 17:20:23 app_cmdline -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:12.784 * Looking for test storage... 
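The accel_assign_opcode test above starts spdk_tgt with --wait-for-rpc so the opcode-to-module assignment can be made before the framework initializes; rpc_cmd in the trace is the test wrapper around scripts/rpc.py. A sketch of the same flow driven by hand (the harness also waits for /var/tmp/spdk.sock via waitforlisten before issuing RPCs, elided here):

cd /var/jenkins/workspace/crypto-phy-autotest/spdk
./build/bin/spdk_tgt --wait-for-rpc &
tgt_pid=$!
./scripts/rpc.py accel_assign_opc -o copy -m software     # pin the copy opcode to the software module
./scripts/rpc.py framework_start_init                     # finish startup so the assignment takes effect
./scripts/rpc.py accel_get_opc_assignments | jq -r .copy  # prints: software
kill "$tgt_pid"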
00:08:12.784 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:12.784 17:20:23 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:08:12.784 17:20:23 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=2717395 00:08:12.784 17:20:23 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 2717395 00:08:12.784 17:20:23 app_cmdline -- common/autotest_common.sh@829 -- # '[' -z 2717395 ']' 00:08:12.784 17:20:23 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:08:12.784 17:20:23 app_cmdline -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:12.784 17:20:23 app_cmdline -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:12.784 17:20:23 app_cmdline -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:12.784 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:12.784 17:20:23 app_cmdline -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:12.784 17:20:23 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:12.784 [2024-07-15 17:20:24.042523] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:08:12.784 [2024-07-15 17:20:24.042592] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2717395 ] 00:08:13.044 [2024-07-15 17:20:24.133680] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:13.044 [2024-07-15 17:20:24.201908] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.613 17:20:24 app_cmdline -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:13.613 17:20:24 app_cmdline -- common/autotest_common.sh@862 -- # return 0 00:08:13.613 17:20:24 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:08:13.873 { 00:08:13.873 "version": "SPDK v24.09-pre git sha1 248c547d0", 00:08:13.873 "fields": { 00:08:13.873 "major": 24, 00:08:13.873 "minor": 9, 00:08:13.873 "patch": 0, 00:08:13.873 "suffix": "-pre", 00:08:13.873 "commit": "248c547d0" 00:08:13.873 } 00:08:13.873 } 00:08:13.873 17:20:25 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:08:13.873 17:20:25 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:08:13.873 17:20:25 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:08:13.873 17:20:25 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:08:13.873 17:20:25 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:08:13.873 17:20:25 app_cmdline -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:13.873 17:20:25 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:13.873 17:20:25 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:08:13.873 17:20:25 app_cmdline -- app/cmdline.sh@26 -- # sort 00:08:13.873 17:20:25 app_cmdline -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:13.873 17:20:25 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:08:13.873 17:20:25 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ 
\s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:08:13.873 17:20:25 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:13.873 17:20:25 app_cmdline -- common/autotest_common.sh@648 -- # local es=0 00:08:13.873 17:20:25 app_cmdline -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:13.873 17:20:25 app_cmdline -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:13.873 17:20:25 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:13.873 17:20:25 app_cmdline -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:13.873 17:20:25 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:13.873 17:20:25 app_cmdline -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:13.873 17:20:25 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:13.873 17:20:25 app_cmdline -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:13.873 17:20:25 app_cmdline -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:08:13.873 17:20:25 app_cmdline -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:14.133 request: 00:08:14.133 { 00:08:14.133 "method": "env_dpdk_get_mem_stats", 00:08:14.133 "req_id": 1 00:08:14.133 } 00:08:14.133 Got JSON-RPC error response 00:08:14.133 response: 00:08:14.133 { 00:08:14.133 "code": -32601, 00:08:14.133 "message": "Method not found" 00:08:14.133 } 00:08:14.133 17:20:25 app_cmdline -- common/autotest_common.sh@651 -- # es=1 00:08:14.133 17:20:25 app_cmdline -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:14.133 17:20:25 app_cmdline -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:08:14.133 17:20:25 app_cmdline -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:14.133 17:20:25 app_cmdline -- app/cmdline.sh@1 -- # killprocess 2717395 00:08:14.133 17:20:25 app_cmdline -- common/autotest_common.sh@948 -- # '[' -z 2717395 ']' 00:08:14.133 17:20:25 app_cmdline -- common/autotest_common.sh@952 -- # kill -0 2717395 00:08:14.133 17:20:25 app_cmdline -- common/autotest_common.sh@953 -- # uname 00:08:14.133 17:20:25 app_cmdline -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:14.133 17:20:25 app_cmdline -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2717395 00:08:14.133 17:20:25 app_cmdline -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:14.133 17:20:25 app_cmdline -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:14.133 17:20:25 app_cmdline -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2717395' 00:08:14.133 killing process with pid 2717395 00:08:14.133 17:20:25 app_cmdline -- common/autotest_common.sh@967 -- # kill 2717395 00:08:14.133 17:20:25 app_cmdline -- common/autotest_common.sh@972 -- # wait 2717395 00:08:14.393 00:08:14.394 real 0m1.640s 00:08:14.394 user 0m1.975s 00:08:14.394 sys 0m0.449s 00:08:14.394 17:20:25 app_cmdline -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:14.394 17:20:25 app_cmdline -- common/autotest_common.sh@10 -- # set +x 
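The cmdline test above runs spdk_tgt with --rpcs-allowed spdk_get_version,rpc_get_methods and checks that only those two methods answer: spdk_get_version returns the version object shown, rpc_get_methods lists exactly the two permitted names, and anything else, here env_dpdk_get_mem_stats, is rejected with JSON-RPC error -32601 "Method not found". A sketch of that allowlist behaviour:

cd /var/jenkins/workspace/crypto-phy-autotest/spdk
./build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods &
./scripts/rpc.py spdk_get_version                       # allowed: returns the version object
./scripts/rpc.py rpc_get_methods | jq -r '.[]' | sort   # allowed: lists only the two permitted methods
./scripts/rpc.py env_dpdk_get_mem_stats                 # rejected: -32601 "Method not found"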
00:08:14.394 ************************************ 00:08:14.394 END TEST app_cmdline 00:08:14.394 ************************************ 00:08:14.394 17:20:25 -- common/autotest_common.sh@1142 -- # return 0 00:08:14.394 17:20:25 -- spdk/autotest.sh@186 -- # run_test version /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:08:14.394 17:20:25 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:14.394 17:20:25 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:14.394 17:20:25 -- common/autotest_common.sh@10 -- # set +x 00:08:14.394 ************************************ 00:08:14.394 START TEST version 00:08:14.394 ************************************ 00:08:14.394 17:20:25 version -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:08:14.394 * Looking for test storage... 00:08:14.394 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:14.394 17:20:25 version -- app/version.sh@17 -- # get_header_version major 00:08:14.394 17:20:25 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:14.394 17:20:25 version -- app/version.sh@14 -- # cut -f2 00:08:14.394 17:20:25 version -- app/version.sh@14 -- # tr -d '"' 00:08:14.654 17:20:25 version -- app/version.sh@17 -- # major=24 00:08:14.654 17:20:25 version -- app/version.sh@18 -- # get_header_version minor 00:08:14.654 17:20:25 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:14.654 17:20:25 version -- app/version.sh@14 -- # cut -f2 00:08:14.654 17:20:25 version -- app/version.sh@14 -- # tr -d '"' 00:08:14.654 17:20:25 version -- app/version.sh@18 -- # minor=9 00:08:14.654 17:20:25 version -- app/version.sh@19 -- # get_header_version patch 00:08:14.654 17:20:25 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:14.654 17:20:25 version -- app/version.sh@14 -- # cut -f2 00:08:14.654 17:20:25 version -- app/version.sh@14 -- # tr -d '"' 00:08:14.654 17:20:25 version -- app/version.sh@19 -- # patch=0 00:08:14.655 17:20:25 version -- app/version.sh@20 -- # get_header_version suffix 00:08:14.655 17:20:25 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:14.655 17:20:25 version -- app/version.sh@14 -- # cut -f2 00:08:14.655 17:20:25 version -- app/version.sh@14 -- # tr -d '"' 00:08:14.655 17:20:25 version -- app/version.sh@20 -- # suffix=-pre 00:08:14.655 17:20:25 version -- app/version.sh@22 -- # version=24.9 00:08:14.655 17:20:25 version -- app/version.sh@25 -- # (( patch != 0 )) 00:08:14.655 17:20:25 version -- app/version.sh@28 -- # version=24.9rc0 00:08:14.655 17:20:25 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:08:14.655 17:20:25 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:08:14.655 17:20:25 version -- app/version.sh@30 -- # py_version=24.9rc0 00:08:14.655 
17:20:25 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:08:14.655 00:08:14.655 real 0m0.185s 00:08:14.655 user 0m0.085s 00:08:14.655 sys 0m0.144s 00:08:14.655 17:20:25 version -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:14.655 17:20:25 version -- common/autotest_common.sh@10 -- # set +x 00:08:14.655 ************************************ 00:08:14.655 END TEST version 00:08:14.655 ************************************ 00:08:14.655 17:20:25 -- common/autotest_common.sh@1142 -- # return 0 00:08:14.655 17:20:25 -- spdk/autotest.sh@188 -- # '[' 1 -eq 1 ']' 00:08:14.655 17:20:25 -- spdk/autotest.sh@189 -- # run_test blockdev_general /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:08:14.655 17:20:25 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:14.655 17:20:25 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:14.655 17:20:25 -- common/autotest_common.sh@10 -- # set +x 00:08:14.655 ************************************ 00:08:14.655 START TEST blockdev_general 00:08:14.655 ************************************ 00:08:14.655 17:20:25 blockdev_general -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:08:14.655 * Looking for test storage... 00:08:14.655 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:08:14.655 17:20:25 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:08:14.655 17:20:25 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:08:14.655 17:20:25 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:08:14.655 17:20:25 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:08:14.655 17:20:25 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:08:14.655 17:20:25 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:08:14.655 17:20:25 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:08:14.655 17:20:25 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:08:14.915 17:20:25 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:08:14.915 17:20:25 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:08:14.915 17:20:25 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:08:14.915 17:20:25 blockdev_general -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:08:14.915 17:20:25 blockdev_general -- bdev/blockdev.sh@674 -- # uname -s 00:08:14.915 17:20:25 blockdev_general -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:08:14.915 17:20:25 blockdev_general -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:08:14.915 17:20:25 blockdev_general -- bdev/blockdev.sh@682 -- # test_type=bdev 00:08:14.915 17:20:25 blockdev_general -- bdev/blockdev.sh@683 -- # crypto_device= 00:08:14.916 17:20:25 blockdev_general -- bdev/blockdev.sh@684 -- # dek= 00:08:14.916 17:20:25 blockdev_general -- bdev/blockdev.sh@685 -- # env_ctx= 00:08:14.916 17:20:25 blockdev_general -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:08:14.916 17:20:25 blockdev_general -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:08:14.916 17:20:25 blockdev_general -- bdev/blockdev.sh@690 -- # [[ bdev == bdev ]] 00:08:14.916 17:20:25 blockdev_general -- bdev/blockdev.sh@691 -- # 
wait_for_rpc=--wait-for-rpc 00:08:14.916 17:20:25 blockdev_general -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:08:14.916 17:20:25 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2717835 00:08:14.916 17:20:25 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:08:14.916 17:20:25 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 2717835 00:08:14.916 17:20:25 blockdev_general -- common/autotest_common.sh@829 -- # '[' -z 2717835 ']' 00:08:14.916 17:20:25 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:08:14.916 17:20:25 blockdev_general -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:14.916 17:20:25 blockdev_general -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:14.916 17:20:25 blockdev_general -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:14.916 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:14.916 17:20:25 blockdev_general -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:14.916 17:20:25 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:14.916 [2024-07-15 17:20:26.028221] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:08:14.916 [2024-07-15 17:20:26.028287] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2717835 ] 00:08:14.916 [2024-07-15 17:20:26.120695] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:14.916 [2024-07-15 17:20:26.189003] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.856 17:20:26 blockdev_general -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:15.856 17:20:26 blockdev_general -- common/autotest_common.sh@862 -- # return 0 00:08:15.856 17:20:26 blockdev_general -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:08:15.856 17:20:26 blockdev_general -- bdev/blockdev.sh@696 -- # setup_bdev_conf 00:08:15.856 17:20:26 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:08:15.856 17:20:26 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:15.856 17:20:26 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:15.856 [2024-07-15 17:20:27.021828] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:15.856 [2024-07-15 17:20:27.021868] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:15.856 00:08:15.856 [2024-07-15 17:20:27.029824] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:15.856 [2024-07-15 17:20:27.029840] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:15.856 00:08:15.856 Malloc0 00:08:15.856 Malloc1 00:08:15.856 Malloc2 00:08:15.856 Malloc3 00:08:15.856 Malloc4 00:08:15.856 Malloc5 00:08:15.856 Malloc6 00:08:15.856 Malloc7 00:08:15.856 Malloc8 00:08:15.856 Malloc9 00:08:15.856 [2024-07-15 17:20:27.138334] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:15.856 [2024-07-15 17:20:27.138368] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:15.856 [2024-07-15 
17:20:27.138379] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1637dd0 00:08:15.856 [2024-07-15 17:20:27.138386] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:15.856 [2024-07-15 17:20:27.139519] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:15.856 [2024-07-15 17:20:27.139537] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:15.856 TestPT 00:08:16.116 17:20:27 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:16.116 17:20:27 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:08:16.116 5000+0 records in 00:08:16.116 5000+0 records out 00:08:16.116 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0169718 s, 603 MB/s 00:08:16.116 17:20:27 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:08:16.116 17:20:27 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:16.116 17:20:27 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:16.116 AIO0 00:08:16.116 17:20:27 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:16.116 17:20:27 blockdev_general -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:08:16.116 17:20:27 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:16.116 17:20:27 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:16.116 17:20:27 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:16.116 17:20:27 blockdev_general -- bdev/blockdev.sh@740 -- # cat 00:08:16.116 17:20:27 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:08:16.116 17:20:27 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:16.116 17:20:27 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:16.116 17:20:27 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:16.116 17:20:27 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:08:16.116 17:20:27 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:16.116 17:20:27 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:16.116 17:20:27 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:16.116 17:20:27 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:08:16.116 17:20:27 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:16.116 17:20:27 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:16.116 17:20:27 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:16.116 17:20:27 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:08:16.116 17:20:27 blockdev_general -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:08:16.116 17:20:27 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:16.116 17:20:27 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:08:16.116 17:20:27 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:16.116 17:20:27 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:16.379 17:20:27 blockdev_general -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:08:16.379 17:20:27 
blockdev_general -- bdev/blockdev.sh@749 -- # jq -r .name 00:08:16.380 17:20:27 blockdev_general -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "783c3a2b-3787-4c0e-ae55-7ff98877cdd7"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "783c3a2b-3787-4c0e-ae55-7ff98877cdd7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "dc95ede7-9d91-53ee-af3b-7462c144040e"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "dc95ede7-9d91-53ee-af3b-7462c144040e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "85d1209c-bb8c-546f-965c-641acdae5880"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "85d1209c-bb8c-546f-965c-641acdae5880",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "795144fb-92e6-5477-8203-8053ef758c82"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "795144fb-92e6-5477-8203-8053ef758c82",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": 
false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "e184d899-974f-5879-a68f-675d999bbc1f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e184d899-974f-5879-a68f-675d999bbc1f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "149c8888-c44d-54de-8af3-ac288439ea7d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "149c8888-c44d-54de-8af3-ac288439ea7d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "ff218920-1d8c-5d68-ba2a-c1eb61dad054"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "ff218920-1d8c-5d68-ba2a-c1eb61dad054",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' 
"af5246d3-6b33-5225-913f-6bef82defa51"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "af5246d3-6b33-5225-913f-6bef82defa51",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "f30397eb-85f7-5b53-954e-370ceba49321"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f30397eb-85f7-5b53-954e-370ceba49321",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "52cb3836-d4c6-5071-ba8f-84d59c7e2f25"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "52cb3836-d4c6-5071-ba8f-84d59c7e2f25",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "3090cb68-a85b-53d8-8e8c-0673ad50a73e"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3090cb68-a85b-53d8-8e8c-0673ad50a73e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": 
false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "0b63feb1-8776-55c5-9b20-f1653ddf0d46"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0b63feb1-8776-55c5-9b20-f1653ddf0d46",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "a91821db-734c-4c67-8369-f9abbcd20eea"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "a91821db-734c-4c67-8369-f9abbcd20eea",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "a91821db-734c-4c67-8369-f9abbcd20eea",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "bd4d7d55-dc2b-4446-9f17-4485d5e61322",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "e18e59c0-90d8-4eca-821d-797e7bdb60ff",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "2ea8aec2-9b67-40ef-8585-6bbd3f800281"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "2ea8aec2-9b67-40ef-8585-6bbd3f800281",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "2ea8aec2-9b67-40ef-8585-6bbd3f800281",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "0a632413-39e0-4a71-bcd7-5b0ef74a076c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "86bfda2e-b6f7-4bab-beb3-2eaea3f1bd21",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "c00f0d03-575e-498c-9995-5f5627569b01"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c00f0d03-575e-498c-9995-5f5627569b01",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "c00f0d03-575e-498c-9995-5f5627569b01",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "e196ea9f-6a05-4ad8-b3fe-de8ad218f6f1",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "27b52a6e-0b30-4809-b587-1a4fed8098c7",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "6e28aca0-7f11-4811-b299-0b3569dec791"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": 
"6e28aca0-7f11-4811-b299-0b3569dec791",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:08:16.380 17:20:27 blockdev_general -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:08:16.380 17:20:27 blockdev_general -- bdev/blockdev.sh@752 -- # hello_world_bdev=Malloc0 00:08:16.380 17:20:27 blockdev_general -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:08:16.380 17:20:27 blockdev_general -- bdev/blockdev.sh@754 -- # killprocess 2717835 00:08:16.380 17:20:27 blockdev_general -- common/autotest_common.sh@948 -- # '[' -z 2717835 ']' 00:08:16.380 17:20:27 blockdev_general -- common/autotest_common.sh@952 -- # kill -0 2717835 00:08:16.380 17:20:27 blockdev_general -- common/autotest_common.sh@953 -- # uname 00:08:16.380 17:20:27 blockdev_general -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:16.380 17:20:27 blockdev_general -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2717835 00:08:16.380 17:20:27 blockdev_general -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:16.380 17:20:27 blockdev_general -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:16.380 17:20:27 blockdev_general -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2717835' 00:08:16.380 killing process with pid 2717835 00:08:16.380 17:20:27 blockdev_general -- common/autotest_common.sh@967 -- # kill 2717835 00:08:16.380 17:20:27 blockdev_general -- common/autotest_common.sh@972 -- # wait 2717835 00:08:16.640 17:20:27 blockdev_general -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:08:16.640 17:20:27 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:08:16.640 17:20:27 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:16.640 17:20:27 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:16.640 17:20:27 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:16.640 ************************************ 00:08:16.640 START TEST bdev_hello_world 00:08:16.640 ************************************ 00:08:16.640 17:20:27 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:08:16.640 [2024-07-15 17:20:27.882758] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:08:16.640 [2024-07-15 17:20:27.882805] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2718160 ] 00:08:16.901 [2024-07-15 17:20:27.969211] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:16.901 [2024-07-15 17:20:28.035337] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:16.901 [2024-07-15 17:20:28.159491] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:16.901 [2024-07-15 17:20:28.159537] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:16.901 [2024-07-15 17:20:28.159545] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:16.901 [2024-07-15 17:20:28.167497] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:16.901 [2024-07-15 17:20:28.167516] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:16.901 [2024-07-15 17:20:28.175511] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:16.901 [2024-07-15 17:20:28.175528] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:17.162 [2024-07-15 17:20:28.236208] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:17.162 [2024-07-15 17:20:28.236247] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:17.162 [2024-07-15 17:20:28.236257] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb009f0 00:08:17.163 [2024-07-15 17:20:28.236263] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:17.163 [2024-07-15 17:20:28.237400] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:17.163 [2024-07-15 17:20:28.237420] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:17.163 [2024-07-15 17:20:28.368008] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:08:17.163 [2024-07-15 17:20:28.368049] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:08:17.163 [2024-07-15 17:20:28.368076] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:08:17.163 [2024-07-15 17:20:28.368116] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:08:17.163 [2024-07-15 17:20:28.368158] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:08:17.163 [2024-07-15 17:20:28.368170] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:08:17.163 [2024-07-15 17:20:28.368203] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
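The NOTICE lines above trace the whole hello_bdev flow: the app opens Malloc0, grabs an I/O channel, writes a buffer, reads it back and prints the recovered string. The example can also be run by hand against the same JSON config; this is a sketch using the binary and config paths from this workspace:

# Sketch: re-run the hello_bdev example outside the test harness.
SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
"$SPDK_DIR/build/examples/hello_bdev" \
    --json "$SPDK_DIR/test/bdev/bdev.json" \
    -b Malloc0
# On success it prints the same write/read NOTICE sequence ending in "Hello World!".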
00:08:17.163 00:08:17.163 [2024-07-15 17:20:28.368219] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:08:17.424 00:08:17.424 real 0m0.724s 00:08:17.424 user 0m0.483s 00:08:17.424 sys 0m0.194s 00:08:17.424 17:20:28 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:17.424 17:20:28 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:08:17.424 ************************************ 00:08:17.424 END TEST bdev_hello_world 00:08:17.424 ************************************ 00:08:17.424 17:20:28 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:08:17.424 17:20:28 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:08:17.424 17:20:28 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:08:17.424 17:20:28 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:17.424 17:20:28 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:17.424 ************************************ 00:08:17.424 START TEST bdev_bounds 00:08:17.424 ************************************ 00:08:17.424 17:20:28 blockdev_general.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:08:17.424 17:20:28 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:08:17.424 17:20:28 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2718457 00:08:17.424 17:20:28 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:08:17.424 17:20:28 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2718457' 00:08:17.424 Process bdevio pid: 2718457 00:08:17.424 17:20:28 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 2718457 00:08:17.424 17:20:28 blockdev_general.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2718457 ']' 00:08:17.424 17:20:28 blockdev_general.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:17.424 17:20:28 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:17.424 17:20:28 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:17.424 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:17.424 17:20:28 blockdev_general.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:17.424 17:20:28 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:08:17.424 [2024-07-15 17:20:28.660333] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:08:17.424 [2024-07-15 17:20:28.660389] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2718457 ] 00:08:17.685 [2024-07-15 17:20:28.756249] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:17.685 [2024-07-15 17:20:28.832647] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:17.685 [2024-07-15 17:20:28.832794] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:17.685 [2024-07-15 17:20:28.832919] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:17.685 [2024-07-15 17:20:28.955058] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:17.685 [2024-07-15 17:20:28.955094] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:17.685 [2024-07-15 17:20:28.955101] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:17.685 [2024-07-15 17:20:28.963069] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:17.685 [2024-07-15 17:20:28.963087] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:17.685 [2024-07-15 17:20:28.971087] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:17.685 [2024-07-15 17:20:28.971103] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:17.945 [2024-07-15 17:20:29.032082] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:17.945 [2024-07-15 17:20:29.032118] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:17.945 [2024-07-15 17:20:29.032128] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdd5ff0 00:08:17.945 [2024-07-15 17:20:29.032134] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:17.945 [2024-07-15 17:20:29.033298] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:17.945 [2024-07-15 17:20:29.033316] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:17.945 17:20:29 blockdev_general.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:17.945 17:20:29 blockdev_general.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:08:17.945 17:20:29 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:08:18.207 I/O targets: 00:08:18.207 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:08:18.207 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:08:18.207 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:08:18.207 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:08:18.207 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:08:18.207 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:08:18.207 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:08:18.207 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:08:18.207 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:08:18.207 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:08:18.207 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:08:18.207 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:08:18.207 raid0: 131072 blocks of 512 bytes (64 MiB) 00:08:18.207 concat0: 131072 blocks of 512 bytes (64 MiB) 
00:08:18.207 raid1: 65536 blocks of 512 bytes (32 MiB) 00:08:18.207 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:08:18.207 00:08:18.207 00:08:18.207 CUnit - A unit testing framework for C - Version 2.1-3 00:08:18.207 http://cunit.sourceforge.net/ 00:08:18.207 00:08:18.207 00:08:18.207 Suite: bdevio tests on: AIO0 00:08:18.207 Test: blockdev write read block ...passed 00:08:18.207 Test: blockdev write zeroes read block ...passed 00:08:18.207 Test: blockdev write zeroes read no split ...passed 00:08:18.207 Test: blockdev write zeroes read split ...passed 00:08:18.207 Test: blockdev write zeroes read split partial ...passed 00:08:18.207 Test: blockdev reset ...passed 00:08:18.207 Test: blockdev write read 8 blocks ...passed 00:08:18.207 Test: blockdev write read size > 128k ...passed 00:08:18.207 Test: blockdev write read invalid size ...passed 00:08:18.207 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:18.207 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:18.207 Test: blockdev write read max offset ...passed 00:08:18.207 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:18.207 Test: blockdev writev readv 8 blocks ...passed 00:08:18.207 Test: blockdev writev readv 30 x 1block ...passed 00:08:18.207 Test: blockdev writev readv block ...passed 00:08:18.207 Test: blockdev writev readv size > 128k ...passed 00:08:18.207 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:18.207 Test: blockdev comparev and writev ...passed 00:08:18.207 Test: blockdev nvme passthru rw ...passed 00:08:18.207 Test: blockdev nvme passthru vendor specific ...passed 00:08:18.207 Test: blockdev nvme admin passthru ...passed 00:08:18.207 Test: blockdev copy ...passed 00:08:18.207 Suite: bdevio tests on: raid1 00:08:18.207 Test: blockdev write read block ...passed 00:08:18.207 Test: blockdev write zeroes read block ...passed 00:08:18.207 Test: blockdev write zeroes read no split ...passed 00:08:18.207 Test: blockdev write zeroes read split ...passed 00:08:18.207 Test: blockdev write zeroes read split partial ...passed 00:08:18.207 Test: blockdev reset ...passed 00:08:18.207 Test: blockdev write read 8 blocks ...passed 00:08:18.207 Test: blockdev write read size > 128k ...passed 00:08:18.207 Test: blockdev write read invalid size ...passed 00:08:18.207 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:18.207 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:18.207 Test: blockdev write read max offset ...passed 00:08:18.207 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:18.207 Test: blockdev writev readv 8 blocks ...passed 00:08:18.207 Test: blockdev writev readv 30 x 1block ...passed 00:08:18.207 Test: blockdev writev readv block ...passed 00:08:18.207 Test: blockdev writev readv size > 128k ...passed 00:08:18.207 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:18.207 Test: blockdev comparev and writev ...passed 00:08:18.207 Test: blockdev nvme passthru rw ...passed 00:08:18.207 Test: blockdev nvme passthru vendor specific ...passed 00:08:18.207 Test: blockdev nvme admin passthru ...passed 00:08:18.207 Test: blockdev copy ...passed 00:08:18.207 Suite: bdevio tests on: concat0 00:08:18.207 Test: blockdev write read block ...passed 00:08:18.207 Test: blockdev write zeroes read block ...passed 00:08:18.207 Test: blockdev write zeroes read no split ...passed 00:08:18.207 Test: blockdev write zeroes read split 
...passed 00:08:18.207 Test: blockdev write zeroes read split partial ...passed 00:08:18.207 Test: blockdev reset ...passed 00:08:18.207 Test: blockdev write read 8 blocks ...passed 00:08:18.207 Test: blockdev write read size > 128k ...passed 00:08:18.207 Test: blockdev write read invalid size ...passed 00:08:18.207 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:18.207 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:18.207 Test: blockdev write read max offset ...passed 00:08:18.207 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:18.207 Test: blockdev writev readv 8 blocks ...passed 00:08:18.207 Test: blockdev writev readv 30 x 1block ...passed 00:08:18.207 Test: blockdev writev readv block ...passed 00:08:18.207 Test: blockdev writev readv size > 128k ...passed 00:08:18.207 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:18.207 Test: blockdev comparev and writev ...passed 00:08:18.207 Test: blockdev nvme passthru rw ...passed 00:08:18.207 Test: blockdev nvme passthru vendor specific ...passed 00:08:18.207 Test: blockdev nvme admin passthru ...passed 00:08:18.207 Test: blockdev copy ...passed 00:08:18.207 Suite: bdevio tests on: raid0 00:08:18.207 Test: blockdev write read block ...passed 00:08:18.207 Test: blockdev write zeroes read block ...passed 00:08:18.207 Test: blockdev write zeroes read no split ...passed 00:08:18.207 Test: blockdev write zeroes read split ...passed 00:08:18.207 Test: blockdev write zeroes read split partial ...passed 00:08:18.207 Test: blockdev reset ...passed 00:08:18.207 Test: blockdev write read 8 blocks ...passed 00:08:18.207 Test: blockdev write read size > 128k ...passed 00:08:18.207 Test: blockdev write read invalid size ...passed 00:08:18.207 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:18.207 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:18.207 Test: blockdev write read max offset ...passed 00:08:18.207 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:18.207 Test: blockdev writev readv 8 blocks ...passed 00:08:18.207 Test: blockdev writev readv 30 x 1block ...passed 00:08:18.207 Test: blockdev writev readv block ...passed 00:08:18.207 Test: blockdev writev readv size > 128k ...passed 00:08:18.207 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:18.207 Test: blockdev comparev and writev ...passed 00:08:18.207 Test: blockdev nvme passthru rw ...passed 00:08:18.207 Test: blockdev nvme passthru vendor specific ...passed 00:08:18.207 Test: blockdev nvme admin passthru ...passed 00:08:18.207 Test: blockdev copy ...passed 00:08:18.207 Suite: bdevio tests on: TestPT 00:08:18.207 Test: blockdev write read block ...passed 00:08:18.207 Test: blockdev write zeroes read block ...passed 00:08:18.207 Test: blockdev write zeroes read no split ...passed 00:08:18.207 Test: blockdev write zeroes read split ...passed 00:08:18.207 Test: blockdev write zeroes read split partial ...passed 00:08:18.208 Test: blockdev reset ...passed 00:08:18.208 Test: blockdev write read 8 blocks ...passed 00:08:18.208 Test: blockdev write read size > 128k ...passed 00:08:18.208 Test: blockdev write read invalid size ...passed 00:08:18.208 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:18.208 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:18.208 Test: blockdev write read max offset ...passed 00:08:18.208 Test: 
blockdev write read 2 blocks on overlapped address offset ...passed 00:08:18.208 Test: blockdev writev readv 8 blocks ...passed 00:08:18.208 Test: blockdev writev readv 30 x 1block ...passed 00:08:18.208 Test: blockdev writev readv block ...passed 00:08:18.208 Test: blockdev writev readv size > 128k ...passed 00:08:18.208 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:18.208 Test: blockdev comparev and writev ...passed 00:08:18.208 Test: blockdev nvme passthru rw ...passed 00:08:18.208 Test: blockdev nvme passthru vendor specific ...passed 00:08:18.208 Test: blockdev nvme admin passthru ...passed 00:08:18.208 Test: blockdev copy ...passed 00:08:18.208 Suite: bdevio tests on: Malloc2p7 00:08:18.208 Test: blockdev write read block ...passed 00:08:18.208 Test: blockdev write zeroes read block ...passed 00:08:18.208 Test: blockdev write zeroes read no split ...passed 00:08:18.208 Test: blockdev write zeroes read split ...passed 00:08:18.208 Test: blockdev write zeroes read split partial ...passed 00:08:18.208 Test: blockdev reset ...passed 00:08:18.208 Test: blockdev write read 8 blocks ...passed 00:08:18.208 Test: blockdev write read size > 128k ...passed 00:08:18.208 Test: blockdev write read invalid size ...passed 00:08:18.208 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:18.208 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:18.208 Test: blockdev write read max offset ...passed 00:08:18.208 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:18.208 Test: blockdev writev readv 8 blocks ...passed 00:08:18.208 Test: blockdev writev readv 30 x 1block ...passed 00:08:18.208 Test: blockdev writev readv block ...passed 00:08:18.208 Test: blockdev writev readv size > 128k ...passed 00:08:18.208 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:18.208 Test: blockdev comparev and writev ...passed 00:08:18.208 Test: blockdev nvme passthru rw ...passed 00:08:18.208 Test: blockdev nvme passthru vendor specific ...passed 00:08:18.208 Test: blockdev nvme admin passthru ...passed 00:08:18.208 Test: blockdev copy ...passed 00:08:18.208 Suite: bdevio tests on: Malloc2p6 00:08:18.208 Test: blockdev write read block ...passed 00:08:18.208 Test: blockdev write zeroes read block ...passed 00:08:18.208 Test: blockdev write zeroes read no split ...passed 00:08:18.208 Test: blockdev write zeroes read split ...passed 00:08:18.208 Test: blockdev write zeroes read split partial ...passed 00:08:18.208 Test: blockdev reset ...passed 00:08:18.208 Test: blockdev write read 8 blocks ...passed 00:08:18.208 Test: blockdev write read size > 128k ...passed 00:08:18.208 Test: blockdev write read invalid size ...passed 00:08:18.208 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:18.208 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:18.208 Test: blockdev write read max offset ...passed 00:08:18.208 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:18.208 Test: blockdev writev readv 8 blocks ...passed 00:08:18.208 Test: blockdev writev readv 30 x 1block ...passed 00:08:18.208 Test: blockdev writev readv block ...passed 00:08:18.208 Test: blockdev writev readv size > 128k ...passed 00:08:18.208 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:18.208 Test: blockdev comparev and writev ...passed 00:08:18.208 Test: blockdev nvme passthru rw ...passed 00:08:18.208 Test: blockdev nvme passthru vendor 
specific ...passed 00:08:18.208 Test: blockdev nvme admin passthru ...passed 00:08:18.208 Test: blockdev copy ...passed 00:08:18.208 Suite: bdevio tests on: Malloc2p5 00:08:18.208 Test: blockdev write read block ...passed 00:08:18.208 Test: blockdev write zeroes read block ...passed 00:08:18.208 Test: blockdev write zeroes read no split ...passed 00:08:18.208 Test: blockdev write zeroes read split ...passed 00:08:18.208 Test: blockdev write zeroes read split partial ...passed 00:08:18.208 Test: blockdev reset ...passed 00:08:18.208 Test: blockdev write read 8 blocks ...passed 00:08:18.208 Test: blockdev write read size > 128k ...passed 00:08:18.208 Test: blockdev write read invalid size ...passed 00:08:18.208 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:18.208 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:18.208 Test: blockdev write read max offset ...passed 00:08:18.208 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:18.208 Test: blockdev writev readv 8 blocks ...passed 00:08:18.208 Test: blockdev writev readv 30 x 1block ...passed 00:08:18.208 Test: blockdev writev readv block ...passed 00:08:18.208 Test: blockdev writev readv size > 128k ...passed 00:08:18.208 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:18.208 Test: blockdev comparev and writev ...passed 00:08:18.208 Test: blockdev nvme passthru rw ...passed 00:08:18.208 Test: blockdev nvme passthru vendor specific ...passed 00:08:18.208 Test: blockdev nvme admin passthru ...passed 00:08:18.208 Test: blockdev copy ...passed 00:08:18.208 Suite: bdevio tests on: Malloc2p4 00:08:18.208 Test: blockdev write read block ...passed 00:08:18.208 Test: blockdev write zeroes read block ...passed 00:08:18.208 Test: blockdev write zeroes read no split ...passed 00:08:18.208 Test: blockdev write zeroes read split ...passed 00:08:18.208 Test: blockdev write zeroes read split partial ...passed 00:08:18.208 Test: blockdev reset ...passed 00:08:18.208 Test: blockdev write read 8 blocks ...passed 00:08:18.208 Test: blockdev write read size > 128k ...passed 00:08:18.208 Test: blockdev write read invalid size ...passed 00:08:18.208 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:18.208 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:18.208 Test: blockdev write read max offset ...passed 00:08:18.208 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:18.208 Test: blockdev writev readv 8 blocks ...passed 00:08:18.208 Test: blockdev writev readv 30 x 1block ...passed 00:08:18.208 Test: blockdev writev readv block ...passed 00:08:18.208 Test: blockdev writev readv size > 128k ...passed 00:08:18.208 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:18.208 Test: blockdev comparev and writev ...passed 00:08:18.208 Test: blockdev nvme passthru rw ...passed 00:08:18.208 Test: blockdev nvme passthru vendor specific ...passed 00:08:18.208 Test: blockdev nvme admin passthru ...passed 00:08:18.208 Test: blockdev copy ...passed 00:08:18.208 Suite: bdevio tests on: Malloc2p3 00:08:18.208 Test: blockdev write read block ...passed 00:08:18.208 Test: blockdev write zeroes read block ...passed 00:08:18.208 Test: blockdev write zeroes read no split ...passed 00:08:18.208 Test: blockdev write zeroes read split ...passed 00:08:18.208 Test: blockdev write zeroes read split partial ...passed 00:08:18.208 Test: blockdev reset ...passed 00:08:18.208 Test: 
blockdev write read 8 blocks ...passed 00:08:18.208 Test: blockdev write read size > 128k ...passed 00:08:18.208 Test: blockdev write read invalid size ...passed 00:08:18.208 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:18.208 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:18.208 Test: blockdev write read max offset ...passed 00:08:18.208 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:18.208 Test: blockdev writev readv 8 blocks ...passed 00:08:18.208 Test: blockdev writev readv 30 x 1block ...passed 00:08:18.208 Test: blockdev writev readv block ...passed 00:08:18.208 Test: blockdev writev readv size > 128k ...passed 00:08:18.470 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:18.470 Test: blockdev comparev and writev ...passed 00:08:18.470 Test: blockdev nvme passthru rw ...passed 00:08:18.470 Test: blockdev nvme passthru vendor specific ...passed 00:08:18.470 Test: blockdev nvme admin passthru ...passed 00:08:18.470 Test: blockdev copy ...passed 00:08:18.470 Suite: bdevio tests on: Malloc2p2 00:08:18.470 Test: blockdev write read block ...passed 00:08:18.470 Test: blockdev write zeroes read block ...passed 00:08:18.470 Test: blockdev write zeroes read no split ...passed 00:08:18.470 Test: blockdev write zeroes read split ...passed 00:08:18.470 Test: blockdev write zeroes read split partial ...passed 00:08:18.470 Test: blockdev reset ...passed 00:08:18.470 Test: blockdev write read 8 blocks ...passed 00:08:18.470 Test: blockdev write read size > 128k ...passed 00:08:18.470 Test: blockdev write read invalid size ...passed 00:08:18.470 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:18.470 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:18.470 Test: blockdev write read max offset ...passed 00:08:18.470 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:18.470 Test: blockdev writev readv 8 blocks ...passed 00:08:18.470 Test: blockdev writev readv 30 x 1block ...passed 00:08:18.470 Test: blockdev writev readv block ...passed 00:08:18.470 Test: blockdev writev readv size > 128k ...passed 00:08:18.470 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:18.470 Test: blockdev comparev and writev ...passed 00:08:18.470 Test: blockdev nvme passthru rw ...passed 00:08:18.470 Test: blockdev nvme passthru vendor specific ...passed 00:08:18.470 Test: blockdev nvme admin passthru ...passed 00:08:18.470 Test: blockdev copy ...passed 00:08:18.470 Suite: bdevio tests on: Malloc2p1 00:08:18.470 Test: blockdev write read block ...passed 00:08:18.470 Test: blockdev write zeroes read block ...passed 00:08:18.470 Test: blockdev write zeroes read no split ...passed 00:08:18.470 Test: blockdev write zeroes read split ...passed 00:08:18.470 Test: blockdev write zeroes read split partial ...passed 00:08:18.470 Test: blockdev reset ...passed 00:08:18.470 Test: blockdev write read 8 blocks ...passed 00:08:18.470 Test: blockdev write read size > 128k ...passed 00:08:18.470 Test: blockdev write read invalid size ...passed 00:08:18.470 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:18.470 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:18.470 Test: blockdev write read max offset ...passed 00:08:18.470 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:18.470 Test: blockdev writev readv 8 blocks ...passed 00:08:18.470 
Test: blockdev writev readv 30 x 1block ...passed 00:08:18.470 Test: blockdev writev readv block ...passed 00:08:18.470 Test: blockdev writev readv size > 128k ...passed 00:08:18.470 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:18.470 Test: blockdev comparev and writev ...passed 00:08:18.470 Test: blockdev nvme passthru rw ...passed 00:08:18.470 Test: blockdev nvme passthru vendor specific ...passed 00:08:18.470 Test: blockdev nvme admin passthru ...passed 00:08:18.470 Test: blockdev copy ...passed 00:08:18.470 Suite: bdevio tests on: Malloc2p0 00:08:18.470 Test: blockdev write read block ...passed 00:08:18.470 Test: blockdev write zeroes read block ...passed 00:08:18.470 Test: blockdev write zeroes read no split ...passed 00:08:18.470 Test: blockdev write zeroes read split ...passed 00:08:18.470 Test: blockdev write zeroes read split partial ...passed 00:08:18.470 Test: blockdev reset ...passed 00:08:18.470 Test: blockdev write read 8 blocks ...passed 00:08:18.470 Test: blockdev write read size > 128k ...passed 00:08:18.470 Test: blockdev write read invalid size ...passed 00:08:18.470 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:18.470 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:18.470 Test: blockdev write read max offset ...passed 00:08:18.470 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:18.470 Test: blockdev writev readv 8 blocks ...passed 00:08:18.470 Test: blockdev writev readv 30 x 1block ...passed 00:08:18.470 Test: blockdev writev readv block ...passed 00:08:18.470 Test: blockdev writev readv size > 128k ...passed 00:08:18.470 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:18.470 Test: blockdev comparev and writev ...passed 00:08:18.470 Test: blockdev nvme passthru rw ...passed 00:08:18.470 Test: blockdev nvme passthru vendor specific ...passed 00:08:18.470 Test: blockdev nvme admin passthru ...passed 00:08:18.470 Test: blockdev copy ...passed 00:08:18.470 Suite: bdevio tests on: Malloc1p1 00:08:18.470 Test: blockdev write read block ...passed 00:08:18.470 Test: blockdev write zeroes read block ...passed 00:08:18.470 Test: blockdev write zeroes read no split ...passed 00:08:18.470 Test: blockdev write zeroes read split ...passed 00:08:18.470 Test: blockdev write zeroes read split partial ...passed 00:08:18.470 Test: blockdev reset ...passed 00:08:18.470 Test: blockdev write read 8 blocks ...passed 00:08:18.470 Test: blockdev write read size > 128k ...passed 00:08:18.470 Test: blockdev write read invalid size ...passed 00:08:18.470 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:18.470 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:18.470 Test: blockdev write read max offset ...passed 00:08:18.470 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:18.470 Test: blockdev writev readv 8 blocks ...passed 00:08:18.470 Test: blockdev writev readv 30 x 1block ...passed 00:08:18.470 Test: blockdev writev readv block ...passed 00:08:18.470 Test: blockdev writev readv size > 128k ...passed 00:08:18.470 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:18.470 Test: blockdev comparev and writev ...passed 00:08:18.470 Test: blockdev nvme passthru rw ...passed 00:08:18.470 Test: blockdev nvme passthru vendor specific ...passed 00:08:18.470 Test: blockdev nvme admin passthru ...passed 00:08:18.470 Test: blockdev copy ...passed 00:08:18.470 Suite: 
bdevio tests on: Malloc1p0 00:08:18.470 Test: blockdev write read block ...passed 00:08:18.470 Test: blockdev write zeroes read block ...passed 00:08:18.470 Test: blockdev write zeroes read no split ...passed 00:08:18.470 Test: blockdev write zeroes read split ...passed 00:08:18.470 Test: blockdev write zeroes read split partial ...passed 00:08:18.470 Test: blockdev reset ...passed 00:08:18.470 Test: blockdev write read 8 blocks ...passed 00:08:18.470 Test: blockdev write read size > 128k ...passed 00:08:18.470 Test: blockdev write read invalid size ...passed 00:08:18.470 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:18.470 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:18.470 Test: blockdev write read max offset ...passed 00:08:18.470 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:18.470 Test: blockdev writev readv 8 blocks ...passed 00:08:18.470 Test: blockdev writev readv 30 x 1block ...passed 00:08:18.470 Test: blockdev writev readv block ...passed 00:08:18.470 Test: blockdev writev readv size > 128k ...passed 00:08:18.470 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:18.470 Test: blockdev comparev and writev ...passed 00:08:18.470 Test: blockdev nvme passthru rw ...passed 00:08:18.470 Test: blockdev nvme passthru vendor specific ...passed 00:08:18.470 Test: blockdev nvme admin passthru ...passed 00:08:18.470 Test: blockdev copy ...passed 00:08:18.470 Suite: bdevio tests on: Malloc0 00:08:18.470 Test: blockdev write read block ...passed 00:08:18.470 Test: blockdev write zeroes read block ...passed 00:08:18.470 Test: blockdev write zeroes read no split ...passed 00:08:18.470 Test: blockdev write zeroes read split ...passed 00:08:18.470 Test: blockdev write zeroes read split partial ...passed 00:08:18.470 Test: blockdev reset ...passed 00:08:18.470 Test: blockdev write read 8 blocks ...passed 00:08:18.470 Test: blockdev write read size > 128k ...passed 00:08:18.470 Test: blockdev write read invalid size ...passed 00:08:18.470 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:18.470 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:18.470 Test: blockdev write read max offset ...passed 00:08:18.470 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:18.470 Test: blockdev writev readv 8 blocks ...passed 00:08:18.470 Test: blockdev writev readv 30 x 1block ...passed 00:08:18.470 Test: blockdev writev readv block ...passed 00:08:18.470 Test: blockdev writev readv size > 128k ...passed 00:08:18.470 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:18.470 Test: blockdev comparev and writev ...passed 00:08:18.471 Test: blockdev nvme passthru rw ...passed 00:08:18.471 Test: blockdev nvme passthru vendor specific ...passed 00:08:18.471 Test: blockdev nvme admin passthru ...passed 00:08:18.471 Test: blockdev copy ...passed 00:08:18.471 00:08:18.471 Run Summary: Type Total Ran Passed Failed Inactive 00:08:18.471 suites 16 16 n/a 0 0 00:08:18.471 tests 368 368 368 0 0 00:08:18.471 asserts 2224 2224 2224 0 n/a 00:08:18.471 00:08:18.471 Elapsed time = 0.626 seconds 00:08:18.471 0 00:08:18.471 17:20:29 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2718457 00:08:18.471 17:20:29 blockdev_general.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2718457 ']' 00:08:18.471 17:20:29 blockdev_general.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2718457 
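The bounds run above executes the same CUnit suite against each of the 16 bdevs (368 tests across 16 suites, per the run summary just above). The pattern visible in the trace is a two-step one: bdevio is started with -w so it comes up idle and waits, and tests.py perform_tests then triggers the suites over the RPC socket. A rough sketch of that sequence, assuming the paths from this workspace and the default /var/tmp/spdk.sock socket:

# Sketch of the bdevio invocation pattern used by bdev_bounds.
SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
"$SPDK_DIR/test/bdev/bdevio/bdevio" -w -s 0 --json "$SPDK_DIR/test/bdev/bdev.json" &
BDEVIO_PID=$!
# The harness uses waitforlisten; a plain poll on the RPC socket is enough for a sketch.
until [ -S /var/tmp/spdk.sock ]; do sleep 0.2; done
"$SPDK_DIR/test/bdev/bdevio/tests.py" perform_tests
kill "$BDEVIO_PID"   # the harness does the same via killprocess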
00:08:18.471 17:20:29 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:08:18.471 17:20:29 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:18.471 17:20:29 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2718457 00:08:18.471 17:20:29 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:18.471 17:20:29 blockdev_general.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:18.471 17:20:29 blockdev_general.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2718457' 00:08:18.471 killing process with pid 2718457 00:08:18.471 17:20:29 blockdev_general.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2718457 00:08:18.471 17:20:29 blockdev_general.bdev_bounds -- common/autotest_common.sh@972 -- # wait 2718457 00:08:18.732 17:20:29 blockdev_general.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:08:18.732 00:08:18.732 real 0m1.207s 00:08:18.732 user 0m3.141s 00:08:18.732 sys 0m0.348s 00:08:18.732 17:20:29 blockdev_general.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:18.732 17:20:29 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:08:18.732 ************************************ 00:08:18.732 END TEST bdev_bounds 00:08:18.732 ************************************ 00:08:18.732 17:20:29 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:08:18.732 17:20:29 blockdev_general -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:08:18.732 17:20:29 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:08:18.732 17:20:29 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:18.732 17:20:29 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:18.732 ************************************ 00:08:18.732 START TEST bdev_nbd 00:08:18.732 ************************************ 00:08:18.732 17:20:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:08:18.732 17:20:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:08:18.732 17:20:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:08:18.732 17:20:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:18.732 17:20:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:08:18.732 17:20:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:18.732 17:20:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:08:18.732 17:20:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=16 00:08:18.732 17:20:29 blockdev_general.bdev_nbd -- 
bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:08:18.732 17:20:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:18.732 17:20:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:08:18.732 17:20:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=16 00:08:18.732 17:20:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:18.732 17:20:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:08:18.732 17:20:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:18.732 17:20:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:08:18.732 17:20:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2718642 00:08:18.732 17:20:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:08:18.732 17:20:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 2718642 /var/tmp/spdk-nbd.sock 00:08:18.732 17:20:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:08:18.732 17:20:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2718642 ']' 00:08:18.733 17:20:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:18.733 17:20:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:18.733 17:20:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:18.733 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:08:18.733 17:20:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:18.733 17:20:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:08:18.733 [2024-07-15 17:20:29.959078] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:08:18.733 [2024-07-15 17:20:29.959128] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:08:18.994 [2024-07-15 17:20:30.050354] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:18.994 [2024-07-15 17:20:30.123365] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.994 [2024-07-15 17:20:30.247458] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:18.994 [2024-07-15 17:20:30.247496] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:18.994 [2024-07-15 17:20:30.247504] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:18.994 [2024-07-15 17:20:30.255466] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:18.994 [2024-07-15 17:20:30.255483] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:18.994 [2024-07-15 17:20:30.263480] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:18.994 [2024-07-15 17:20:30.263496] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:19.254 [2024-07-15 17:20:30.324167] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:19.254 [2024-07-15 17:20:30.324205] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:19.254 [2024-07-15 17:20:30.324215] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x102da00 00:08:19.254 [2024-07-15 17:20:30.324221] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:19.254 [2024-07-15 17:20:30.325354] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:19.255 [2024-07-15 17:20:30.325373] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:19.516 17:20:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:19.516 17:20:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:08:19.516 17:20:30 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:08:19.516 17:20:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:19.516 17:20:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:19.516 17:20:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:08:19.516 17:20:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:08:19.516 17:20:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:19.516 17:20:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 
'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:19.516 17:20:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:08:19.516 17:20:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:08:19.516 17:20:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:08:19.516 17:20:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:08:19.516 17:20:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:19.516 17:20:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:08:19.777 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:08:19.777 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:08:19.777 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:08:19.777 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:19.777 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:19.777 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:19.777 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:19.777 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:08:19.777 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:19.777 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:19.777 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:19.777 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:19.777 1+0 records in 00:08:19.777 1+0 records out 00:08:19.777 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271586 s, 15.1 MB/s 00:08:19.777 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:19.777 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:19.777 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:19.777 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:19.777 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:19.777 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:19.777 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:19.777 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:08:20.036 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:08:20.036 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:08:20.036 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:08:20.036 17:20:31 
blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:08:20.036 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:20.036 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:20.036 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:20.036 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:20.036 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:20.036 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:20.036 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:20.036 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:20.036 1+0 records in 00:08:20.036 1+0 records out 00:08:20.036 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000288844 s, 14.2 MB/s 00:08:20.036 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:20.036 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:20.037 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:20.037 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:20.037 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:20.037 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:20.037 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:20.037 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:08:20.296 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:08:20.296 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:08:20.296 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:08:20.296 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:08:20.296 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:20.296 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:20.296 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:20.296 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:08:20.296 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:20.296 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:20.296 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:20.296 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:20.296 1+0 records in 00:08:20.296 1+0 records out 00:08:20.296 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000287923 s, 14.2 MB/s 00:08:20.296 
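Each mapped device goes through the same check that the trace shows for nbd0 and nbd1: wait until the name appears in /proc/partitions, read a single 4 KiB block with O_DIRECT dd, and confirm the copy is exactly one block long. A condensed sketch of that check (check_nbd is a hypothetical name; the harness itself uses waitfornbd from autotest_common.sh plus an inline dd):

# Condensed sketch of the per-device verification traced above.
check_nbd() {
    local dev=$1 out=/tmp/nbdtest i
    for i in $(seq 1 20); do                      # wait for the kernel to expose the device
        grep -q -w "$dev" /proc/partitions && break
        sleep 0.1
    done
    dd if="/dev/$dev" of="$out" bs=4096 count=1 iflag=direct   # one block, page cache bypassed
    [ "$(stat -c %s "$out")" -eq 4096 ] || return 1            # copy must be exactly one block
    rm -f "$out"
}
check_nbd nbd2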
17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:20.296 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:20.296 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:20.296 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:20.296 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:20.296 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:20.296 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:20.296 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:08:20.556 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:08:20.556 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:08:20.556 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:08:20.556 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:08:20.556 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:20.556 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:20.556 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:20.556 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:08:20.556 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:20.556 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:20.556 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:20.556 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:20.556 1+0 records in 00:08:20.556 1+0 records out 00:08:20.556 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000205327 s, 19.9 MB/s 00:08:20.556 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:20.556 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:20.556 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:20.556 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:20.556 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:20.556 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:20.556 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:20.556 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:08:20.817 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:08:20.817 17:20:31 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:08:20.817 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:08:20.817 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:08:20.817 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:20.817 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:20.817 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:20.817 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:08:20.817 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:20.817 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:20.817 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:20.817 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:20.817 1+0 records in 00:08:20.817 1+0 records out 00:08:20.817 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000302335 s, 13.5 MB/s 00:08:20.817 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:20.817 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:20.817 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:20.817 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:20.817 17:20:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:20.817 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:20.817 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:20.817 17:20:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:08:21.119 17:20:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:08:21.119 17:20:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:08:21.119 17:20:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:08:21.119 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:08:21.119 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:21.119 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:21.119 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:21.119 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:08:21.119 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:21.119 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:21.119 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:21.119 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:08:21.119 1+0 records in 00:08:21.119 1+0 records out 00:08:21.119 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000303417 s, 13.5 MB/s 00:08:21.119 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:21.119 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:21.119 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:21.119 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:21.119 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:21.119 17:20:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:21.119 17:20:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:21.119 17:20:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:21.412 1+0 records in 00:08:21.412 1+0 records out 00:08:21.412 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000293192 s, 14.0 MB/s 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 
00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:21.412 1+0 records in 00:08:21.412 1+0 records out 00:08:21.412 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000397798 s, 10.3 MB/s 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:21.412 17:20:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:08:21.672 17:20:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:08:21.672 17:20:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:08:21.672 17:20:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:08:21.672 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:08:21.672 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:21.672 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:21.672 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:21.672 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:08:21.672 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:21.672 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:21.672 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:21.672 17:20:32 
blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:21.672 1+0 records in 00:08:21.672 1+0 records out 00:08:21.672 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000374934 s, 10.9 MB/s 00:08:21.672 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:21.672 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:21.672 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:21.672 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:21.672 17:20:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:21.672 17:20:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:21.672 17:20:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:21.672 17:20:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:08:21.932 17:20:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:08:21.932 17:20:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:08:21.932 17:20:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:08:21.932 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:08:21.932 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:21.932 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:21.932 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:21.932 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:08:21.932 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:21.932 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:21.932 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:21.932 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:21.932 1+0 records in 00:08:21.932 1+0 records out 00:08:21.932 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000353133 s, 11.6 MB/s 00:08:21.932 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:21.932 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:21.932 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:21.932 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:21.932 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:21.932 17:20:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:21.932 17:20:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:21.932 17:20:33 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:08:22.500 17:20:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:08:22.500 17:20:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:08:22.500 17:20:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:08:22.500 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:08:22.501 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:22.501 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:22.501 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:22.501 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:08:22.501 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:22.501 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:22.501 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:22.501 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:22.501 1+0 records in 00:08:22.501 1+0 records out 00:08:22.501 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000349464 s, 11.7 MB/s 00:08:22.501 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:22.501 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:22.501 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:22.501 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:22.501 17:20:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:22.501 17:20:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:22.501 17:20:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:22.501 17:20:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:08:23.069 17:20:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd11 00:08:23.069 17:20:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:08:23.069 17:20:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:08:23.069 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:08:23.069 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:23.069 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:23.069 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:23.069 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:08:23.069 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:23.069 17:20:34 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:23.069 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:23.069 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:23.069 1+0 records in 00:08:23.069 1+0 records out 00:08:23.069 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000358975 s, 11.4 MB/s 00:08:23.069 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:23.069 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:23.069 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:23.069 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:23.069 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:23.069 17:20:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:23.069 17:20:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:23.069 17:20:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:08:23.329 17:20:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:08:23.329 17:20:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:08:23.329 17:20:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:08:23.329 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:08:23.329 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:23.329 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:23.329 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:23.329 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:08:23.329 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:23.329 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:23.329 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:23.329 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:23.329 1+0 records in 00:08:23.329 1+0 records out 00:08:23.329 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000500172 s, 8.2 MB/s 00:08:23.329 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:23.329 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:23.329 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:23.329 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:23.329 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:23.329 17:20:34 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:23.329 17:20:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:23.329 17:20:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:08:23.589 17:20:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:08:23.589 17:20:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:08:23.589 17:20:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:08:23.589 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:08:23.589 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:23.589 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:23.589 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:23.589 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:08:23.589 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:23.589 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:23.589 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:23.589 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:23.589 1+0 records in 00:08:23.589 1+0 records out 00:08:23.589 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000477754 s, 8.6 MB/s 00:08:23.589 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:23.589 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:23.589 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:23.589 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:23.589 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:23.589 17:20:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:23.589 17:20:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:23.589 17:20:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:08:23.589 17:20:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:08:23.589 17:20:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:08:23.849 17:20:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:08:23.849 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:08:23.849 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:23.849 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:23.849 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:23.849 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w 
nbd14 /proc/partitions 00:08:23.849 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:23.849 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:23.849 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:23.849 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:23.849 1+0 records in 00:08:23.849 1+0 records out 00:08:23.849 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0005381 s, 7.6 MB/s 00:08:23.849 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:23.849 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:23.849 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:23.849 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:23.849 17:20:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:23.849 17:20:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:23.849 17:20:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:23.849 17:20:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:08:23.849 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:08:23.849 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:08:23.849 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:08:23.849 17:20:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:08:23.849 17:20:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:23.849 17:20:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:23.849 17:20:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:23.849 17:20:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:08:23.849 17:20:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:23.849 17:20:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:23.849 17:20:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:23.849 17:20:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:23.849 1+0 records in 00:08:23.849 1+0 records out 00:08:23.849 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000507271 s, 8.1 MB/s 00:08:23.849 17:20:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:24.108 17:20:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:24.108 17:20:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:24.108 17:20:35 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:24.108 17:20:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:24.108 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:24.108 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:24.108 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:24.108 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:08:24.108 { 00:08:24.108 "nbd_device": "/dev/nbd0", 00:08:24.108 "bdev_name": "Malloc0" 00:08:24.108 }, 00:08:24.108 { 00:08:24.108 "nbd_device": "/dev/nbd1", 00:08:24.108 "bdev_name": "Malloc1p0" 00:08:24.108 }, 00:08:24.108 { 00:08:24.108 "nbd_device": "/dev/nbd2", 00:08:24.108 "bdev_name": "Malloc1p1" 00:08:24.108 }, 00:08:24.108 { 00:08:24.108 "nbd_device": "/dev/nbd3", 00:08:24.108 "bdev_name": "Malloc2p0" 00:08:24.108 }, 00:08:24.108 { 00:08:24.108 "nbd_device": "/dev/nbd4", 00:08:24.108 "bdev_name": "Malloc2p1" 00:08:24.108 }, 00:08:24.108 { 00:08:24.108 "nbd_device": "/dev/nbd5", 00:08:24.108 "bdev_name": "Malloc2p2" 00:08:24.108 }, 00:08:24.108 { 00:08:24.108 "nbd_device": "/dev/nbd6", 00:08:24.108 "bdev_name": "Malloc2p3" 00:08:24.108 }, 00:08:24.108 { 00:08:24.108 "nbd_device": "/dev/nbd7", 00:08:24.108 "bdev_name": "Malloc2p4" 00:08:24.108 }, 00:08:24.108 { 00:08:24.108 "nbd_device": "/dev/nbd8", 00:08:24.108 "bdev_name": "Malloc2p5" 00:08:24.108 }, 00:08:24.108 { 00:08:24.108 "nbd_device": "/dev/nbd9", 00:08:24.108 "bdev_name": "Malloc2p6" 00:08:24.108 }, 00:08:24.108 { 00:08:24.108 "nbd_device": "/dev/nbd10", 00:08:24.108 "bdev_name": "Malloc2p7" 00:08:24.108 }, 00:08:24.108 { 00:08:24.108 "nbd_device": "/dev/nbd11", 00:08:24.108 "bdev_name": "TestPT" 00:08:24.108 }, 00:08:24.108 { 00:08:24.108 "nbd_device": "/dev/nbd12", 00:08:24.108 "bdev_name": "raid0" 00:08:24.108 }, 00:08:24.108 { 00:08:24.108 "nbd_device": "/dev/nbd13", 00:08:24.108 "bdev_name": "concat0" 00:08:24.108 }, 00:08:24.108 { 00:08:24.108 "nbd_device": "/dev/nbd14", 00:08:24.108 "bdev_name": "raid1" 00:08:24.108 }, 00:08:24.108 { 00:08:24.108 "nbd_device": "/dev/nbd15", 00:08:24.108 "bdev_name": "AIO0" 00:08:24.108 } 00:08:24.108 ]' 00:08:24.108 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:08:24.108 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:08:24.108 { 00:08:24.108 "nbd_device": "/dev/nbd0", 00:08:24.108 "bdev_name": "Malloc0" 00:08:24.108 }, 00:08:24.108 { 00:08:24.108 "nbd_device": "/dev/nbd1", 00:08:24.108 "bdev_name": "Malloc1p0" 00:08:24.108 }, 00:08:24.108 { 00:08:24.108 "nbd_device": "/dev/nbd2", 00:08:24.108 "bdev_name": "Malloc1p1" 00:08:24.108 }, 00:08:24.108 { 00:08:24.108 "nbd_device": "/dev/nbd3", 00:08:24.109 "bdev_name": "Malloc2p0" 00:08:24.109 }, 00:08:24.109 { 00:08:24.109 "nbd_device": "/dev/nbd4", 00:08:24.109 "bdev_name": "Malloc2p1" 00:08:24.109 }, 00:08:24.109 { 00:08:24.109 "nbd_device": "/dev/nbd5", 00:08:24.109 "bdev_name": "Malloc2p2" 00:08:24.109 }, 00:08:24.109 { 00:08:24.109 "nbd_device": "/dev/nbd6", 00:08:24.109 "bdev_name": "Malloc2p3" 00:08:24.109 }, 00:08:24.109 { 00:08:24.109 "nbd_device": "/dev/nbd7", 00:08:24.109 "bdev_name": "Malloc2p4" 00:08:24.109 }, 00:08:24.109 { 00:08:24.109 "nbd_device": "/dev/nbd8", 00:08:24.109 "bdev_name": "Malloc2p5" 
00:08:24.109 }, 00:08:24.109 { 00:08:24.109 "nbd_device": "/dev/nbd9", 00:08:24.109 "bdev_name": "Malloc2p6" 00:08:24.109 }, 00:08:24.109 { 00:08:24.109 "nbd_device": "/dev/nbd10", 00:08:24.109 "bdev_name": "Malloc2p7" 00:08:24.109 }, 00:08:24.109 { 00:08:24.109 "nbd_device": "/dev/nbd11", 00:08:24.109 "bdev_name": "TestPT" 00:08:24.109 }, 00:08:24.109 { 00:08:24.109 "nbd_device": "/dev/nbd12", 00:08:24.109 "bdev_name": "raid0" 00:08:24.109 }, 00:08:24.109 { 00:08:24.109 "nbd_device": "/dev/nbd13", 00:08:24.109 "bdev_name": "concat0" 00:08:24.109 }, 00:08:24.109 { 00:08:24.109 "nbd_device": "/dev/nbd14", 00:08:24.109 "bdev_name": "raid1" 00:08:24.109 }, 00:08:24.109 { 00:08:24.109 "nbd_device": "/dev/nbd15", 00:08:24.109 "bdev_name": "AIO0" 00:08:24.109 } 00:08:24.109 ]' 00:08:24.109 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:08:24.109 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:08:24.109 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:24.109 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:08:24.109 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:24.109 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:08:24.109 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:24.109 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:24.368 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:24.368 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:24.368 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:24.368 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:24.368 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:24.368 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:24.368 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:24.368 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:24.368 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:24.368 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:24.627 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:24.627 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:24.627 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:24.627 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:24.627 17:20:35 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:24.627 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:24.627 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:24.627 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:24.627 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:24.627 17:20:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:08:24.887 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:08:24.887 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:08:24.887 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:08:24.887 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:24.887 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:24.887 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:08:24.887 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:24.887 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:24.888 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:24.888 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:08:25.149 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:08:25.149 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:08:25.149 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:08:25.149 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:25.149 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:25.149 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:08:25.149 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:25.149 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:25.149 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:25.149 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:08:25.149 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:08:25.149 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:08:25.149 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:08:25.149 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:25.149 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:25.149 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:08:25.149 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:25.149 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:25.149 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 
-- # for i in "${nbd_list[@]}" 00:08:25.149 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:08:25.409 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:08:25.409 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:08:25.409 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:08:25.409 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:25.409 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:25.409 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:08:25.409 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:25.409 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:25.409 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:25.409 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:08:25.669 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:08:25.669 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:08:25.669 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:08:25.669 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:25.669 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:25.669 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:08:25.669 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:25.669 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:25.669 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:25.669 17:20:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:08:25.929 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:08:25.929 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:08:25.929 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:08:25.929 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:25.929 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:25.929 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:08:25.929 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:25.929 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:25.929 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:25.929 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:08:25.929 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:08:25.929 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- 
# waitfornbd_exit nbd8 00:08:26.190 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:08:26.190 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:26.190 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:26.190 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:08:26.190 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:26.190 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:26.190 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:26.190 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:08:26.190 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:08:26.190 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:08:26.190 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:08:26.190 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:26.190 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:26.190 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:08:26.190 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:26.190 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:26.190 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:26.190 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:08:26.449 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:08:26.449 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:08:26.449 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:08:26.449 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:26.449 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:26.449 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:08:26.449 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:26.449 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:26.449 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:26.449 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:08:26.708 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:08:26.708 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:08:26.708 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:08:26.708 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:26.708 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:26.708 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep 
-q -w nbd11 /proc/partitions 00:08:26.708 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:26.708 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:26.708 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:26.708 17:20:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:08:26.968 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:08:26.968 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:08:26.968 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:08:26.968 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:26.968 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:26.968 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:08:26.968 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:26.968 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:26.968 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:26.968 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:08:27.538 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:08:27.538 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:08:27.538 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:08:27.538 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:27.538 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:27.538 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:08:27.538 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:27.538 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:27.538 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:27.538 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:08:27.538 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:08:27.538 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:08:27.538 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:08:27.538 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:27.538 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:27.538 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:08:27.538 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:27.538 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:27.538 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:27.538 17:20:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:08:27.799 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:08:27.799 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:08:27.799 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:08:27.799 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:27.799 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:27.799 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:08:27.799 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:27.799 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:27.799 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:27.799 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:27.799 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' 
'/dev/nbd8' '/dev/nbd9') 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:28.060 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:08:28.320 /dev/nbd0 00:08:28.320 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:28.320 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:28.320 17:20:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:28.320 17:20:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:28.320 17:20:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:28.320 17:20:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:28.320 17:20:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:08:28.320 17:20:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:28.320 17:20:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:28.320 17:20:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:28.320 17:20:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:28.320 1+0 records in 00:08:28.320 1+0 records out 00:08:28.320 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000212471 s, 19.3 MB/s 00:08:28.320 17:20:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:28.320 17:20:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:28.320 17:20:39 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:28.320 17:20:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:28.320 17:20:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:28.320 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:28.320 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:28.320 17:20:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:08:28.890 /dev/nbd1 00:08:28.890 17:20:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:08:28.890 17:20:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:08:28.890 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:08:28.890 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:28.890 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:28.890 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:28.890 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:28.890 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:28.890 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:28.890 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:28.890 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:28.890 1+0 records in 00:08:28.890 1+0 records out 00:08:28.890 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000307899 s, 13.3 MB/s 00:08:28.890 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:28.890 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:28.890 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:28.890 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:28.890 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:28.890 17:20:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:28.890 17:20:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:28.890 17:20:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:08:29.150 /dev/nbd10 00:08:29.151 17:20:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:08:29.151 17:20:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:08:29.151 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:08:29.151 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:29.151 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 
)) 00:08:29.151 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:29.151 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:08:29.151 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:29.151 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:29.151 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:29.151 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:29.151 1+0 records in 00:08:29.151 1+0 records out 00:08:29.151 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000285237 s, 14.4 MB/s 00:08:29.151 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.151 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:29.151 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.151 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:29.151 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:29.151 17:20:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:29.151 17:20:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:29.151 17:20:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:08:29.412 /dev/nbd11 00:08:29.412 17:20:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:08:29.412 17:20:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:08:29.412 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:08:29.412 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:29.412 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:29.412 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:29.412 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:08:29.412 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:29.412 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:29.412 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:29.412 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:29.412 1+0 records in 00:08:29.412 1+0 records out 00:08:29.412 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00019797 s, 20.7 MB/s 00:08:29.412 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.412 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:29.412 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.412 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:29.412 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:29.412 17:20:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:29.412 17:20:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:29.412 17:20:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:08:29.672 /dev/nbd12 00:08:29.672 17:20:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd12 00:08:29.672 17:20:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:08:29.672 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:08:29.672 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:29.672 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:29.672 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:29.672 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:08:29.672 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:29.672 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:29.672 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:29.673 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:29.673 1+0 records in 00:08:29.673 1+0 records out 00:08:29.673 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269384 s, 15.2 MB/s 00:08:29.673 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.673 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:29.673 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.673 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:29.673 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:29.673 17:20:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:29.673 17:20:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:29.673 17:20:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:08:29.933 /dev/nbd13 00:08:29.933 17:20:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:08:29.933 17:20:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:08:29.933 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:08:29.933 17:20:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:29.933 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:29.933 17:20:41 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:29.933 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:08:29.933 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:29.933 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:29.933 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:29.933 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:29.933 1+0 records in 00:08:29.933 1+0 records out 00:08:29.933 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274237 s, 14.9 MB/s 00:08:29.933 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.933 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:29.933 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.933 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:29.933 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:29.933 17:20:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:29.933 17:20:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:29.933 17:20:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:08:29.933 /dev/nbd14 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd14 /proc/partitions 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:30.193 1+0 records in 00:08:30.193 1+0 records out 00:08:30.193 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000286073 s, 14.3 MB/s 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:08:30.193 /dev/nbd15 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:30.193 1+0 records in 00:08:30.193 1+0 records out 00:08:30.193 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000289227 s, 14.2 MB/s 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:30.193 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:30.454 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:30.454 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:30.454 17:20:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:30.454 17:20:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:30.454 17:20:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:08:30.454 /dev/nbd2 00:08:30.454 17:20:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:08:30.454 17:20:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:08:30.454 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:08:30.454 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:30.454 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:30.454 17:20:41 
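
[editor's note] The same readiness probe repeats above for every attached device: poll /proc/partitions for the nbd name, then confirm the device answers a single 4 KiB direct read. A condensed sketch of that traced pattern follows; the helper name matches the trace, but the body is a simplification and the scratch path is illustrative, not the workspace file the trace uses.

    # Sketch of the per-device wait traced above (assumed simplification).
    waitfornbd() {
        local nbd_name=$1 i
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$nbd_name" /proc/partitions && break   # device appeared
            sleep 0.1                                          # back-off assumed; not visible in the trace
        done
        for ((i = 1; i <= 20; i++)); do
            # Confirm the device actually serves I/O: one 4 KiB direct read into a scratch file.
            if dd if="/dev/$nbd_name" of=/tmp/nbdtest bs=4096 count=1 iflag=direct 2>/dev/null; then
                local size
                size=$(stat -c %s /tmp/nbdtest)
                rm -f /tmp/nbdtest
                [ "$size" != 0 ] && return 0                   # non-empty read => device is usable
            fi
        done
        return 1
    }
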
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:30.454 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:08:30.454 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:30.454 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:30.454 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:30.454 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:30.454 1+0 records in 00:08:30.454 1+0 records out 00:08:30.454 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000268111 s, 15.3 MB/s 00:08:30.454 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:30.454 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:30.454 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:30.454 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:30.454 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:30.454 17:20:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:30.454 17:20:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:30.454 17:20:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:08:30.714 /dev/nbd3 00:08:30.714 17:20:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:08:30.714 17:20:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:08:30.714 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:08:30.714 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:30.714 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:30.714 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:30.714 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:08:30.714 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:30.714 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:30.714 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:30.714 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:30.714 1+0 records in 00:08:30.714 1+0 records out 00:08:30.714 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000372711 s, 11.0 MB/s 00:08:30.714 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:30.714 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:30.714 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:30.714 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:30.714 17:20:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:30.714 17:20:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:30.714 17:20:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:30.714 17:20:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:08:30.974 /dev/nbd4 00:08:30.974 17:20:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:08:30.974 17:20:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:08:30.974 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:08:30.974 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:30.974 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:30.974 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:30.974 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:08:30.974 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:30.974 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:30.974 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:30.974 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:30.974 1+0 records in 00:08:30.974 1+0 records out 00:08:30.974 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000336519 s, 12.2 MB/s 00:08:30.974 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:30.974 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:30.974 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:30.974 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:30.974 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:30.974 17:20:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:30.974 17:20:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:30.974 17:20:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:08:31.234 /dev/nbd5 00:08:31.234 17:20:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:08:31.234 17:20:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:08:31.234 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:08:31.234 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:31.234 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:31.234 17:20:42 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:31.234 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:08:31.234 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:31.234 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:31.234 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:31.234 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:31.234 1+0 records in 00:08:31.234 1+0 records out 00:08:31.234 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000306173 s, 13.4 MB/s 00:08:31.234 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:31.234 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:31.234 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:31.234 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:31.234 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:31.234 17:20:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:31.234 17:20:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:31.234 17:20:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:08:31.494 /dev/nbd6 00:08:31.494 17:20:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:08:31.494 17:20:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:08:31.494 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:08:31.494 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:31.494 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:31.494 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:31.494 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:08:31.494 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:31.494 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:31.494 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:31.494 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:31.494 1+0 records in 00:08:31.494 1+0 records out 00:08:31.494 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000536806 s, 7.6 MB/s 00:08:31.494 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:31.494 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:31.494 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:31.494 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:31.494 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:31.494 17:20:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:31.494 17:20:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:31.494 17:20:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:08:31.755 /dev/nbd7 00:08:31.755 17:20:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:08:31.755 17:20:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:08:31.755 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:08:31.755 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:31.755 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:31.755 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:31.755 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:08:31.755 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:31.755 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:31.755 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:31.755 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:31.755 1+0 records in 00:08:31.755 1+0 records out 00:08:31.755 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000476278 s, 8.6 MB/s 00:08:31.755 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:31.755 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:31.755 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:31.755 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:31.755 17:20:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:31.755 17:20:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:31.755 17:20:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:31.755 17:20:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:08:31.755 /dev/nbd8 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:32.016 17:20:43 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:32.016 1+0 records in 00:08:32.016 1+0 records out 00:08:32.016 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000260273 s, 15.7 MB/s 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:08:32.016 /dev/nbd9 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:32.016 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:32.017 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:32.277 1+0 records in 00:08:32.277 1+0 records out 00:08:32.277 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000487128 s, 8.4 MB/s 00:08:32.277 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:32.277 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:32.277 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:32.277 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:32.277 17:20:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:32.277 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:32.277 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:32.277 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:32.277 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:32.277 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:32.277 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:08:32.277 { 00:08:32.277 "nbd_device": "/dev/nbd0", 00:08:32.277 "bdev_name": "Malloc0" 00:08:32.277 }, 00:08:32.277 { 00:08:32.277 "nbd_device": "/dev/nbd1", 00:08:32.277 "bdev_name": "Malloc1p0" 00:08:32.277 }, 00:08:32.277 { 00:08:32.277 "nbd_device": "/dev/nbd10", 00:08:32.277 "bdev_name": "Malloc1p1" 00:08:32.277 }, 00:08:32.277 { 00:08:32.277 "nbd_device": "/dev/nbd11", 00:08:32.277 "bdev_name": "Malloc2p0" 00:08:32.277 }, 00:08:32.277 { 00:08:32.277 "nbd_device": "/dev/nbd12", 00:08:32.277 "bdev_name": "Malloc2p1" 00:08:32.277 }, 00:08:32.277 { 00:08:32.277 "nbd_device": "/dev/nbd13", 00:08:32.277 "bdev_name": "Malloc2p2" 00:08:32.277 }, 00:08:32.277 { 00:08:32.277 "nbd_device": "/dev/nbd14", 00:08:32.277 "bdev_name": "Malloc2p3" 00:08:32.277 }, 00:08:32.277 { 00:08:32.277 "nbd_device": "/dev/nbd15", 00:08:32.278 "bdev_name": "Malloc2p4" 00:08:32.278 }, 00:08:32.278 { 00:08:32.278 "nbd_device": "/dev/nbd2", 00:08:32.278 "bdev_name": "Malloc2p5" 00:08:32.278 }, 00:08:32.278 { 00:08:32.278 "nbd_device": "/dev/nbd3", 00:08:32.278 "bdev_name": "Malloc2p6" 00:08:32.278 }, 00:08:32.278 { 00:08:32.278 "nbd_device": "/dev/nbd4", 00:08:32.278 "bdev_name": "Malloc2p7" 00:08:32.278 }, 00:08:32.278 { 00:08:32.278 "nbd_device": "/dev/nbd5", 00:08:32.278 "bdev_name": "TestPT" 00:08:32.278 }, 00:08:32.278 { 00:08:32.278 "nbd_device": "/dev/nbd6", 00:08:32.278 "bdev_name": "raid0" 00:08:32.278 }, 00:08:32.278 { 00:08:32.278 "nbd_device": "/dev/nbd7", 00:08:32.278 "bdev_name": "concat0" 00:08:32.278 }, 00:08:32.278 { 00:08:32.278 "nbd_device": "/dev/nbd8", 00:08:32.278 "bdev_name": "raid1" 00:08:32.278 }, 00:08:32.278 { 00:08:32.278 "nbd_device": "/dev/nbd9", 00:08:32.278 "bdev_name": "AIO0" 00:08:32.278 } 00:08:32.278 ]' 00:08:32.278 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:32.278 { 00:08:32.278 "nbd_device": "/dev/nbd0", 00:08:32.278 "bdev_name": "Malloc0" 00:08:32.278 }, 00:08:32.278 { 00:08:32.278 "nbd_device": "/dev/nbd1", 00:08:32.278 "bdev_name": "Malloc1p0" 00:08:32.278 }, 00:08:32.278 { 00:08:32.278 "nbd_device": "/dev/nbd10", 00:08:32.278 "bdev_name": "Malloc1p1" 00:08:32.278 }, 00:08:32.278 { 00:08:32.278 "nbd_device": "/dev/nbd11", 00:08:32.278 "bdev_name": "Malloc2p0" 00:08:32.278 }, 00:08:32.278 { 00:08:32.278 "nbd_device": "/dev/nbd12", 00:08:32.278 "bdev_name": "Malloc2p1" 00:08:32.278 }, 00:08:32.278 { 00:08:32.278 "nbd_device": "/dev/nbd13", 00:08:32.278 "bdev_name": "Malloc2p2" 00:08:32.278 }, 00:08:32.278 { 00:08:32.278 "nbd_device": "/dev/nbd14", 00:08:32.278 "bdev_name": "Malloc2p3" 00:08:32.278 }, 
00:08:32.278 { 00:08:32.278 "nbd_device": "/dev/nbd15", 00:08:32.278 "bdev_name": "Malloc2p4" 00:08:32.278 }, 00:08:32.278 { 00:08:32.278 "nbd_device": "/dev/nbd2", 00:08:32.278 "bdev_name": "Malloc2p5" 00:08:32.278 }, 00:08:32.278 { 00:08:32.278 "nbd_device": "/dev/nbd3", 00:08:32.278 "bdev_name": "Malloc2p6" 00:08:32.278 }, 00:08:32.278 { 00:08:32.278 "nbd_device": "/dev/nbd4", 00:08:32.278 "bdev_name": "Malloc2p7" 00:08:32.278 }, 00:08:32.278 { 00:08:32.278 "nbd_device": "/dev/nbd5", 00:08:32.278 "bdev_name": "TestPT" 00:08:32.278 }, 00:08:32.278 { 00:08:32.278 "nbd_device": "/dev/nbd6", 00:08:32.278 "bdev_name": "raid0" 00:08:32.278 }, 00:08:32.278 { 00:08:32.278 "nbd_device": "/dev/nbd7", 00:08:32.278 "bdev_name": "concat0" 00:08:32.278 }, 00:08:32.278 { 00:08:32.278 "nbd_device": "/dev/nbd8", 00:08:32.278 "bdev_name": "raid1" 00:08:32.278 }, 00:08:32.278 { 00:08:32.278 "nbd_device": "/dev/nbd9", 00:08:32.278 "bdev_name": "AIO0" 00:08:32.278 } 00:08:32.278 ]' 00:08:32.278 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:32.278 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:08:32.278 /dev/nbd1 00:08:32.278 /dev/nbd10 00:08:32.278 /dev/nbd11 00:08:32.278 /dev/nbd12 00:08:32.278 /dev/nbd13 00:08:32.278 /dev/nbd14 00:08:32.278 /dev/nbd15 00:08:32.278 /dev/nbd2 00:08:32.278 /dev/nbd3 00:08:32.278 /dev/nbd4 00:08:32.278 /dev/nbd5 00:08:32.278 /dev/nbd6 00:08:32.278 /dev/nbd7 00:08:32.278 /dev/nbd8 00:08:32.278 /dev/nbd9' 00:08:32.278 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:08:32.278 /dev/nbd1 00:08:32.278 /dev/nbd10 00:08:32.278 /dev/nbd11 00:08:32.278 /dev/nbd12 00:08:32.278 /dev/nbd13 00:08:32.278 /dev/nbd14 00:08:32.278 /dev/nbd15 00:08:32.278 /dev/nbd2 00:08:32.278 /dev/nbd3 00:08:32.278 /dev/nbd4 00:08:32.278 /dev/nbd5 00:08:32.278 /dev/nbd6 00:08:32.278 /dev/nbd7 00:08:32.278 /dev/nbd8 00:08:32.278 /dev/nbd9' 00:08:32.278 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:32.278 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:08:32.278 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:08:32.278 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:08:32.278 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:08:32.278 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:08:32.278 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:32.278 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:32.278 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:08:32.278 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:08:32.278 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:08:32.278 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:08:32.537 256+0 records in 00:08:32.537 256+0 records out 00:08:32.537 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0120179 s, 87.3 MB/s 00:08:32.537 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:32.537 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:08:32.537 256+0 records in 00:08:32.537 256+0 records out 00:08:32.537 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0889381 s, 11.8 MB/s 00:08:32.537 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:32.537 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:08:32.537 256+0 records in 00:08:32.537 256+0 records out 00:08:32.537 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0998011 s, 10.5 MB/s 00:08:32.537 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:32.537 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:08:32.797 256+0 records in 00:08:32.797 256+0 records out 00:08:32.797 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0950372 s, 11.0 MB/s 00:08:32.797 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:32.797 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:08:32.797 256+0 records in 00:08:32.797 256+0 records out 00:08:32.797 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0897778 s, 11.7 MB/s 00:08:32.797 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:32.797 17:20:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:08:32.797 256+0 records in 00:08:32.797 256+0 records out 00:08:32.797 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0847053 s, 12.4 MB/s 00:08:32.797 17:20:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:32.797 17:20:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:08:33.057 256+0 records in 00:08:33.057 256+0 records out 00:08:33.057 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0859696 s, 12.2 MB/s 00:08:33.057 17:20:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:33.057 17:20:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:08:33.057 256+0 records in 00:08:33.057 256+0 records out 00:08:33.057 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0854332 s, 12.3 MB/s 00:08:33.057 17:20:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:33.057 17:20:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:08:33.057 256+0 records in 00:08:33.057 256+0 records out 00:08:33.057 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0851342 s, 12.3 MB/s 00:08:33.057 17:20:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:33.057 17:20:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:08:33.317 256+0 records in 00:08:33.317 256+0 records out 00:08:33.317 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0852063 s, 12.3 MB/s 00:08:33.317 17:20:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:33.317 17:20:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:08:33.318 256+0 records in 00:08:33.318 256+0 records out 00:08:33.318 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0871469 s, 12.0 MB/s 00:08:33.318 17:20:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:33.318 17:20:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:08:33.318 256+0 records in 00:08:33.318 256+0 records out 00:08:33.318 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0879568 s, 11.9 MB/s 00:08:33.318 17:20:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:33.318 17:20:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:08:33.578 256+0 records in 00:08:33.578 256+0 records out 00:08:33.578 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0872427 s, 12.0 MB/s 00:08:33.578 17:20:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:33.578 17:20:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:08:33.578 256+0 records in 00:08:33.578 256+0 records out 00:08:33.578 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0894787 s, 11.7 MB/s 00:08:33.578 17:20:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:33.578 17:20:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:08:33.838 256+0 records in 00:08:33.838 256+0 records out 00:08:33.838 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0891189 s, 11.8 MB/s 00:08:33.838 17:20:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:33.838 17:20:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:08:33.838 256+0 records in 00:08:33.838 256+0 records out 00:08:33.838 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0967899 s, 10.8 MB/s 00:08:33.838 17:20:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:33.838 17:20:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:08:33.838 256+0 records in 00:08:33.838 256+0 records out 00:08:33.838 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0852633 s, 12.3 MB/s 00:08:33.838 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:08:33.838 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:33.838 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:33.838 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:08:33.838 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:08:33.838 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:08:33.838 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:08:33.838 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:33.838 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:08:33.838 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:33.838 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:08:33.838 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:33.838 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:08:33.838 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:33.838 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:08:33.838 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:33.838 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:08:33.838 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:33.838 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:08:33.838 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:33.838 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:08:33.838 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:33.838 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:08:33.838 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:33.838 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:34.100 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:34.361 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:34.361 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:34.361 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:34.361 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:34.361 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:34.361 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:34.361 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:34.361 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:34.361 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:34.361 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:34.361 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:34.361 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:34.361 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:34.361 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:34.361 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:08:34.621 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:08:34.621 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:08:34.621 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:08:34.621 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:34.622 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:34.622 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:08:34.622 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:34.622 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:34.622 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:34.622 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:08:34.882 17:20:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:08:34.882 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:08:34.882 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:08:34.882 17:20:46 
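
[editor's note] For reference, the count check traced a little earlier (nbd_common.sh@61-66 and @95-96) asks the target what it actually exported and insists the list matches the sixteen devices that were attached. A condensed sketch of that step, under the same RPC socket path shown in the trace:

    # Sketch of the device-count check (assumed simplification).
    rpc_py="scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
    nbd_disks_json=$($rpc_py nbd_get_disks)                          # JSON array of {nbd_device, bdev_name}
    nbd_disks_name=$(echo "$nbd_disks_json" | jq -r '.[] | .nbd_device')
    count=$(echo "$nbd_disks_name" | grep -c /dev/nbd)
    [ "$count" -eq 16 ] || exit 1                                    # every bdev must have come up
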
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:34.882 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:34.882 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:08:34.882 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:34.882 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:34.882 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:34.882 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:08:35.172 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:08:35.172 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:08:35.172 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:08:35.172 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:35.172 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:35.172 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:08:35.172 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:35.172 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:35.172 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:35.172 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:08:35.172 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:08:35.172 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:08:35.172 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:08:35.172 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:35.172 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:35.172 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:08:35.172 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:35.172 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:35.172 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:35.172 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:08:35.479 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:08:35.479 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:08:35.479 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:08:35.479 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:35.479 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:35.479 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:08:35.479 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:35.479 17:20:46 
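
[editor's note] The write/verify pass traced just before the teardown (nbd_common.sh@70-85) fills a 1 MiB random pattern file, writes it to every attached device with direct I/O, then compares the first 1 MiB of each device back against the file. Condensed, with an illustrative scratch path instead of the workspace nbdrandtest file:

    # Sketch of the write/verify pass (assumed simplification of nbd_dd_data_verify).
    tmp_file=/tmp/nbdrandtest                               # trace uses spdk/test/bdev/nbdrandtest
    dd if=/dev/urandom of="$tmp_file" bs=4096 count=256     # 1 MiB random pattern
    for dev in /dev/nbd{0..15}; do                          # same sixteen devices as above, any order
        dd if="$tmp_file" of="$dev" bs=4096 count=256 oflag=direct
    done
    for dev in /dev/nbd{0..15}; do
        cmp -b -n 1M "$tmp_file" "$dev"                     # first 1 MiB must read back identically
    done
    rm "$tmp_file"
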
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:35.479 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:35.479 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:08:35.739 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:08:35.739 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:08:35.739 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:08:35.739 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:35.739 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:35.739 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:08:35.739 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:35.739 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:35.739 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:35.739 17:20:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:08:35.739 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:08:35.739 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:08:35.739 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:08:35.739 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:35.739 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:35.739 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:08:36.000 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:36.000 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:36.000 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:36.000 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:08:36.000 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:08:36.000 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:08:36.000 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:08:36.000 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:36.000 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:36.000 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:08:36.000 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:36.000 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:36.000 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:36.000 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:08:36.260 17:20:47 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:08:36.260 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:08:36.260 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:08:36.260 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:36.260 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:36.260 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:08:36.260 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:36.260 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:36.260 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:36.260 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:08:36.520 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:08:36.520 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:08:36.520 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:08:36.520 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:36.520 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:36.520 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:08:36.520 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:36.520 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:36.520 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:36.520 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:08:36.781 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:08:36.781 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:08:36.781 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:08:36.781 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:36.781 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:36.781 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:08:36.781 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:36.781 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:36.781 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:36.781 17:20:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:08:37.042 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:08:37.042 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:08:37.042 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:08:37.042 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:37.042 17:20:48 
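
The per-device teardown traced above repeats one pattern: nbd_stop_disk is issued over the RPC socket, then the script polls /proc/partitions until the kernel drops the device node. A minimal bash sketch of that pattern, based only on what is visible here (the 20-iteration budget from the (( i <= 20 )) guard; the sleep interval between checks is not shown and is an assumption, and the _sketch suffixes mark these as illustrations rather than the nbd_common.sh originals):

    #!/usr/bin/env bash
    rpc_sock=/var/tmp/spdk-nbd.sock
    rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py

    waitfornbd_exit_sketch() {
        local nbd_name=$1
        local i
        for ((i = 1; i <= 20; i++)); do
            # Stop waiting as soon as the name is gone from /proc/partitions.
            grep -q -w "$nbd_name" /proc/partitions || break
            sleep 0.1   # assumption: a short back-off between checks
        done
        return 0
    }

    nbd_stop_disks_sketch() {
        local i
        for i in "$@"; do
            "$rpc_py" -s "$rpc_sock" nbd_stop_disk "$i"
            waitfornbd_exit_sketch "$(basename "$i")"
        done
    }

    # usage mirroring the trace:
    # nbd_stop_disks_sketch /dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11
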
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:37.042 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:08:37.042 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:37.042 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:37.042 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:37.042 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:08:37.302 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:08:37.302 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:08:37.302 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:08:37.302 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:37.302 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:37.302 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:08:37.302 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:37.302 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:37.302 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:37.302 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:08:37.563 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:08:37.563 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:08:37.563 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:08:37.563 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:37.563 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:37.563 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:08:37.563 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:37.563 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:37.563 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:37.563 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:37.563 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:37.563 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:37.563 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:37.563 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:37.823 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:37.823 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:08:37.823 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:37.823 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:08:37.823 17:20:48 
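
Folded into the trace here is nbd_get_count: the script asks the target for its active nbd exports and counts how many /dev/nbd entries come back, which resolves to zero just below because every disk has been stopped. A hedged sketch of that check, using the nbd_get_disks RPC and the jq expression visible above (the || true mirrors the "+ true" line, since grep -c exits non-zero when it counts zero matches):

    nbd_get_count_sketch() {
        local rpc_server=$1
        local rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
        local nbd_disks_json nbd_disks_name count

        nbd_disks_json=$("$rpc_py" -s "$rpc_server" nbd_get_disks)       # '[]' in this run
        nbd_disks_name=$(echo "$nbd_disks_json" | jq -r '.[] | .nbd_device')
        count=$(echo "$nbd_disks_name" | grep -c /dev/nbd || true)
        echo "$count"
    }

    # The caller then asserts the result, roughly:
    # [ "$(nbd_get_count_sketch /var/tmp/spdk-nbd.sock)" -eq 0 ] || exit 1
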
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:08:37.823 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:08:37.823 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:08:37.823 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:08:37.823 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:08:37.823 17:20:48 blockdev_general.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:37.823 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:37.823 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:37.823 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:08:37.823 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:08:37.823 17:20:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:08:38.392 malloc_lvol_verify 00:08:38.392 17:20:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:08:38.962 40f48d43-713e-4f2a-8279-81bb675441c3 00:08:38.962 17:20:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:08:39.222 38804edb-7501-474d-bcec-b261d53d1e0b 00:08:39.482 17:20:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:08:40.053 /dev/nbd0 00:08:40.053 17:20:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:08:40.053 mke2fs 1.46.5 (30-Dec-2021) 00:08:40.053 Discarding device blocks: 0/4096 done 00:08:40.053 Creating filesystem with 4096 1k blocks and 1024 inodes 00:08:40.053 00:08:40.053 Allocating group tables: 0/1 done 00:08:40.053 Writing inode tables: 0/1 done 00:08:40.053 Creating journal (1024 blocks): done 00:08:40.053 Writing superblocks and filesystem accounting information: 0/1 done 00:08:40.053 00:08:40.053 17:20:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:08:40.054 17:20:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:08:40.054 17:20:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:40.054 17:20:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:08:40.054 17:20:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:40.054 17:20:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:08:40.054 17:20:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:40.054 17:20:51 
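
The nbd_with_lvol_verify step that begins above builds a small logical-volume stack and sanity-checks it with a real filesystem: a malloc bdev (the 16 and 512 arguments visible above), an lvstore named lvs on it, a size-4 lvol (the mke2fs output of 4096 1k blocks matches a 4 MiB volume), exported as /dev/nbd0 and formatted with ext4. A hedged, simplified sketch of that flow; the traced helper actually tears down through nbd_stop_disks and the wait loop sketched earlier, which this version collapses into a single stop call:

    nbd_with_lvol_verify_sketch() {
        local rpc_server=$1 nbd=$2
        local rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
        local mkfs_ret

        "$rpc_py" -s "$rpc_server" bdev_malloc_create -b malloc_lvol_verify 16 512
        "$rpc_py" -s "$rpc_server" bdev_lvol_create_lvstore malloc_lvol_verify lvs
        "$rpc_py" -s "$rpc_server" bdev_lvol_create lvol 4 -l lvs
        "$rpc_py" -s "$rpc_server" nbd_start_disk lvs/lvol "$nbd"

        # If an ext4 filesystem can be created end to end, the lvol stack works.
        mkfs.ext4 "$nbd"
        mkfs_ret=$?

        "$rpc_py" -s "$rpc_server" nbd_stop_disk "$nbd"
        return $mkfs_ret
    }

    # e.g. nbd_with_lvol_verify_sketch /var/tmp/spdk-nbd.sock /dev/nbd0
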
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:40.054 17:20:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:40.054 17:20:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:40.054 17:20:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:40.054 17:20:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:40.054 17:20:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:40.054 17:20:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:40.054 17:20:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:40.054 17:20:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:40.054 17:20:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:08:40.054 17:20:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:08:40.054 17:20:51 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2718642 00:08:40.054 17:20:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2718642 ']' 00:08:40.054 17:20:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 2718642 00:08:40.054 17:20:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:08:40.054 17:20:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:40.054 17:20:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2718642 00:08:40.313 17:20:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:40.314 17:20:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:40.314 17:20:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2718642' 00:08:40.314 killing process with pid 2718642 00:08:40.314 17:20:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2718642 00:08:40.314 17:20:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@972 -- # wait 2718642 00:08:40.314 17:20:51 blockdev_general.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:08:40.314 00:08:40.314 real 0m21.635s 00:08:40.314 user 0m31.250s 00:08:40.314 sys 0m8.436s 00:08:40.314 17:20:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:40.314 17:20:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:08:40.314 ************************************ 00:08:40.314 END TEST bdev_nbd 00:08:40.314 ************************************ 00:08:40.314 17:20:51 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:08:40.314 17:20:51 blockdev_general -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:08:40.314 17:20:51 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = nvme ']' 00:08:40.314 17:20:51 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = gpt ']' 00:08:40.314 17:20:51 blockdev_general -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:08:40.314 17:20:51 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:08:40.314 17:20:51 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:40.314 17:20:51 blockdev_general -- common/autotest_common.sh@10 -- 
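
The killprocess 2718642 call traced above shuts down the bdev_nbd target before the fio suite starts. The visible pattern is generic: require a PID, confirm the process is still alive with kill -0, look up its command name (reactor_0 here, so the sudo special case is never taken), then kill and reap it. A hedged sketch that keeps only the branch exercised in this run and omits the sudo-wrapper handling:

    killprocess_sketch() {
        local pid=$1
        [ -n "$pid" ] || return 1          # a PID must be supplied
        kill -0 "$pid" || return 1         # bail out if it is already gone
        echo "killing process with pid $pid"
        kill "$pid"
        # wait reaps the child; it only succeeds for processes this shell started.
        wait "$pid" 2> /dev/null || true
    }
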
# set +x 00:08:40.314 ************************************ 00:08:40.314 START TEST bdev_fio 00:08:40.314 ************************************ 00:08:40.314 17:20:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:08:40.314 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:08:40.314 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:08:40.314 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:08:40.314 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:08:40.314 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:08:40.314 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc0]' 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc0 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # 
echo '[job_Malloc1p0]' 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p0 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p1]' 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p1 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p0]' 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p0 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p1]' 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p1 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p2]' 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p2 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p3]' 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p3 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p4]' 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p4 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p5]' 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p5 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p6]' 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p6 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p7]' 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p7 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_TestPT]' 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=TestPT 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid0]' 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid0 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # 
for b in "${bdevs_name[@]}" 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_concat0]' 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=concat0 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid1]' 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid1 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_AIO0]' 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=AIO0 00:08:40.576 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:08:40.577 17:20:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:40.577 17:20:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:40.577 17:20:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:40.577 17:20:51 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:08:40.577 ************************************ 00:08:40.577 START TEST bdev_fio_rw_verify 00:08:40.577 ************************************ 00:08:40.577 17:20:51 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:40.577 17:20:51 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:40.577 17:20:51 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:08:40.577 17:20:51 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:08:40.577 17:20:51 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:08:40.577 17:20:51 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:40.577 17:20:51 
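
The echoes above assemble the fio job file for the verify pass: fio_config_gen writes the global options into bdev.fio (their exact contents are not shown in this trace), appends serialize_overlap=1 once the fio binary reports a 3.x version, and the loop then adds one [job_<bdev>] section with a matching filename= line per bdev. A hedged sketch of that assembly; the [global] keys below are assumptions standing in for the unshown template, the bdev list is abridged, and the grep-based version check is an equivalent of the [[ fio-3.35 == *fio-3* ]] test above:

    fio_config=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio
    bdevs_name="Malloc0 Malloc1p0 Malloc1p1 TestPT raid0 concat0 raid1 AIO0"  # abridged

    {
        echo '[global]'             # assumed verify-workload defaults
        echo 'verify=crc32c'        # assumption
        echo 'verify_backlog=1024'  # assumption

        # Only appended when fio reports a 3.x version, as checked above.
        if /usr/src/fio/fio --version | grep -q '^fio-3'; then
            echo 'serialize_overlap=1'
        fi

        # One job section per bdev, exactly as echoed in the trace.
        for b in $bdevs_name; do
            echo "[job_$b]"
            echo "filename=$b"
        done
    } > "$fio_config"
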
blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:08:40.577 17:20:51 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:08:40.577 17:20:51 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:08:40.577 17:20:51 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:40.577 17:20:51 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:08:40.577 17:20:51 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:08:40.577 17:20:51 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:08:40.577 17:20:51 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:08:40.577 17:20:51 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:08:40.577 17:20:51 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:40.577 17:20:51 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:08:40.577 17:20:51 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:08:40.577 17:20:51 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:08:40.577 17:20:51 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:08:40.577 17:20:51 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:08:40.577 17:20:51 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:40.838 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.838 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.838 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.838 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.838 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.838 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.838 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.838 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.838 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 
4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.838 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.838 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.838 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.838 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.838 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.838 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.838 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.838 fio-3.35 00:08:40.838 Starting 16 threads 00:08:53.064 00:08:53.065 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=2723199: Mon Jul 15 17:21:02 2024 00:08:53.065 read: IOPS=108k, BW=420MiB/s (440MB/s)(4201MiB/10001msec) 00:08:53.065 slat (usec): min=2, max=403, avg=28.26, stdev=17.96 00:08:53.065 clat (usec): min=6, max=3506, avg=242.87, stdev=133.66 00:08:53.065 lat (usec): min=12, max=3546, avg=271.14, stdev=141.45 00:08:53.065 clat percentiles (usec): 00:08:53.065 | 50.000th=[ 233], 99.000th=[ 594], 99.900th=[ 758], 99.990th=[ 930], 00:08:53.065 | 99.999th=[ 1172] 00:08:53.065 write: IOPS=168k, BW=655MiB/s (687MB/s)(6472MiB/9883msec); 0 zone resets 00:08:53.065 slat (usec): min=3, max=1031, avg=41.45, stdev=19.91 00:08:53.065 clat (usec): min=5, max=2104, avg=296.45, stdev=155.19 00:08:53.065 lat (usec): min=19, max=2118, avg=337.90, stdev=163.92 00:08:53.065 clat percentiles (usec): 00:08:53.065 | 50.000th=[ 281], 99.000th=[ 725], 99.900th=[ 922], 99.990th=[ 1123], 00:08:53.065 | 99.999th=[ 1369] 00:08:53.065 bw ( KiB/s): min=496080, max=791584, per=98.69%, avg=661755.53, stdev=5624.84, samples=304 00:08:53.065 iops : min=124020, max=197896, avg=165438.74, stdev=1406.21, samples=304 00:08:53.065 lat (usec) : 10=0.01%, 20=0.13%, 50=2.41%, 100=9.21%, 250=35.97% 00:08:53.065 lat (usec) : 500=44.39%, 750=7.37%, 1000=0.48% 00:08:53.065 lat (msec) : 2=0.03%, 4=0.01% 00:08:53.065 cpu : usr=99.36%, sys=0.29%, ctx=549, majf=0, minf=2408 00:08:53.065 IO depths : 1=12.4%, 2=24.8%, 4=50.3%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:08:53.065 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:53.065 complete : 0=0.0%, 4=89.1%, 8=10.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:53.065 issued rwts: total=1075512,1656801,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:53.065 latency : target=0, window=0, percentile=100.00%, depth=8 00:08:53.065 00:08:53.065 Run status group 0 (all jobs): 00:08:53.065 READ: bw=420MiB/s (440MB/s), 420MiB/s-420MiB/s (440MB/s-440MB/s), io=4201MiB (4405MB), run=10001-10001msec 00:08:53.065 WRITE: bw=655MiB/s (687MB/s), 655MiB/s-655MiB/s (687MB/s-687MB/s), io=6472MiB (6786MB), run=9883-9883msec 00:08:53.065 00:08:53.065 real 0m11.416s 00:08:53.065 user 2m48.394s 00:08:53.065 sys 0m1.863s 00:08:53.065 17:21:03 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:53.065 17:21:03 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:08:53.065 ************************************ 
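
The run summarized above was launched through fio_bdev: the script uses ldd to see whether the spdk_bdev fio plugin was linked against an ASan runtime so that library can be preloaded first, then preloads the plugin itself and hands fio the generated job file plus the SPDK JSON config. A hedged sketch of that launcher, using the paths printed in the trace; in this run no ASan library is found, so asan_lib stays empty and only the plugin ends up in LD_PRELOAD:

    plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev
    asan_lib=""
    for sanitizer in libasan libclang_rt.asan; do
        asan_lib=$(ldd "$plugin" | grep "$sanitizer" | awk '{print $3}')
        [ -n "$asan_lib" ] && break
    done

    LD_PRELOAD="$asan_lib $plugin" /usr/src/fio/fio \
        --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
        /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio \
        --verify_state_save=0 \
        --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json \
        --spdk_mem=0 \
        --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output

The headline numbers above are internally consistent: 4201 MiB read in 10001 msec is about 420 MiB/s (roughly 440 MB/s), 6472 MiB written in 9883 msec is about 655 MiB/s (roughly 687 MB/s), and the 1,075,512 read and 1,656,801 write completions over those windows give the reported ~108k and ~168k IOPS.
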
00:08:53.065 END TEST bdev_fio_rw_verify 00:08:53.065 ************************************ 00:08:53.065 17:21:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:08:53.065 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:08:53.065 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:53.065 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:08:53.065 17:21:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:53.065 17:21:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:08:53.065 17:21:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:08:53.065 17:21:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:08:53.065 17:21:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:08:53.065 17:21:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:08:53.065 17:21:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:08:53.065 17:21:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:08:53.065 17:21:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:53.065 17:21:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:08:53.065 17:21:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:08:53.065 17:21:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:08:53.065 17:21:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:08:53.065 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:08:53.066 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "783c3a2b-3787-4c0e-ae55-7ff98877cdd7"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "783c3a2b-3787-4c0e-ae55-7ff98877cdd7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "dc95ede7-9d91-53ee-af3b-7462c144040e"' ' ],' ' "product_name": "Split 
Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "dc95ede7-9d91-53ee-af3b-7462c144040e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "85d1209c-bb8c-546f-965c-641acdae5880"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "85d1209c-bb8c-546f-965c-641acdae5880",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "795144fb-92e6-5477-8203-8053ef758c82"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "795144fb-92e6-5477-8203-8053ef758c82",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "e184d899-974f-5879-a68f-675d999bbc1f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e184d899-974f-5879-a68f-675d999bbc1f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": 
false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "149c8888-c44d-54de-8af3-ac288439ea7d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "149c8888-c44d-54de-8af3-ac288439ea7d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "ff218920-1d8c-5d68-ba2a-c1eb61dad054"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "ff218920-1d8c-5d68-ba2a-c1eb61dad054",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "af5246d3-6b33-5225-913f-6bef82defa51"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "af5246d3-6b33-5225-913f-6bef82defa51",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "f30397eb-85f7-5b53-954e-370ceba49321"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f30397eb-85f7-5b53-954e-370ceba49321",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' 
"supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "52cb3836-d4c6-5071-ba8f-84d59c7e2f25"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "52cb3836-d4c6-5071-ba8f-84d59c7e2f25",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "3090cb68-a85b-53d8-8e8c-0673ad50a73e"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3090cb68-a85b-53d8-8e8c-0673ad50a73e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "0b63feb1-8776-55c5-9b20-f1653ddf0d46"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0b63feb1-8776-55c5-9b20-f1653ddf0d46",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' 
}' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "a91821db-734c-4c67-8369-f9abbcd20eea"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "a91821db-734c-4c67-8369-f9abbcd20eea",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "a91821db-734c-4c67-8369-f9abbcd20eea",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "bd4d7d55-dc2b-4446-9f17-4485d5e61322",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "e18e59c0-90d8-4eca-821d-797e7bdb60ff",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "2ea8aec2-9b67-40ef-8585-6bbd3f800281"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "2ea8aec2-9b67-40ef-8585-6bbd3f800281",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "2ea8aec2-9b67-40ef-8585-6bbd3f800281",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "0a632413-39e0-4a71-bcd7-5b0ef74a076c",' ' "is_configured": true,' ' "data_offset": 0,' 
' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "86bfda2e-b6f7-4bab-beb3-2eaea3f1bd21",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "c00f0d03-575e-498c-9995-5f5627569b01"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c00f0d03-575e-498c-9995-5f5627569b01",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "c00f0d03-575e-498c-9995-5f5627569b01",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "e196ea9f-6a05-4ad8-b3fe-de8ad218f6f1",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "27b52a6e-0b30-4809-b587-1a4fed8098c7",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "6e28aca0-7f11-4811-b299-0b3569dec791"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "6e28aca0-7f11-4811-b299-0b3569dec791",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:08:53.066 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n Malloc0 00:08:53.066 Malloc1p0 00:08:53.066 Malloc1p1 00:08:53.066 Malloc2p0 00:08:53.066 Malloc2p1 00:08:53.066 Malloc2p2 00:08:53.066 Malloc2p3 00:08:53.066 Malloc2p4 00:08:53.066 Malloc2p5 00:08:53.066 Malloc2p6 00:08:53.066 Malloc2p7 00:08:53.066 TestPT 00:08:53.066 raid0 00:08:53.066 concat0 ]] 00:08:53.066 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # 
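
The JSON dump above is what the unmap filter runs against: each bdev object is fed to jq, and only bdevs whose supported_io_types.unmap is true survive, which is why raid1 and AIO0 (both reporting "unmap": false above) are missing from the Malloc0 through concat0 list checked at the end. A hedged sketch of the same filter driven from a live bdev_get_bdevs RPC instead of the captured JSON; since that RPC returns an array, the expression gains a leading .[]:

    # List bdevs that can take the trim/unmap workload, mirroring the jq filter
    # in the trace but querying the running target directly.
    rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py

    "$rpc_py" bdev_get_bdevs \
        | jq -r '.[] | select(.supported_io_types.unmap == true) | .name'

    # Expected here: Malloc0, the Malloc1p*/Malloc2p* splits, TestPT, raid0 and
    # concat0; raid1 and AIO0 are dropped by the filter.
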
jq -r 'select(.supported_io_types.unmap == true) | .name' 00:08:53.067 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "783c3a2b-3787-4c0e-ae55-7ff98877cdd7"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "783c3a2b-3787-4c0e-ae55-7ff98877cdd7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "dc95ede7-9d91-53ee-af3b-7462c144040e"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "dc95ede7-9d91-53ee-af3b-7462c144040e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "85d1209c-bb8c-546f-965c-641acdae5880"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "85d1209c-bb8c-546f-965c-641acdae5880",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "795144fb-92e6-5477-8203-8053ef758c82"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "795144fb-92e6-5477-8203-8053ef758c82",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' 
"zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "e184d899-974f-5879-a68f-675d999bbc1f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e184d899-974f-5879-a68f-675d999bbc1f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "149c8888-c44d-54de-8af3-ac288439ea7d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "149c8888-c44d-54de-8af3-ac288439ea7d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "ff218920-1d8c-5d68-ba2a-c1eb61dad054"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "ff218920-1d8c-5d68-ba2a-c1eb61dad054",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": 
[' ' "af5246d3-6b33-5225-913f-6bef82defa51"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "af5246d3-6b33-5225-913f-6bef82defa51",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "f30397eb-85f7-5b53-954e-370ceba49321"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f30397eb-85f7-5b53-954e-370ceba49321",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "52cb3836-d4c6-5071-ba8f-84d59c7e2f25"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "52cb3836-d4c6-5071-ba8f-84d59c7e2f25",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "3090cb68-a85b-53d8-8e8c-0673ad50a73e"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3090cb68-a85b-53d8-8e8c-0673ad50a73e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' 
"zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "0b63feb1-8776-55c5-9b20-f1653ddf0d46"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0b63feb1-8776-55c5-9b20-f1653ddf0d46",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "a91821db-734c-4c67-8369-f9abbcd20eea"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "a91821db-734c-4c67-8369-f9abbcd20eea",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "a91821db-734c-4c67-8369-f9abbcd20eea",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "bd4d7d55-dc2b-4446-9f17-4485d5e61322",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "e18e59c0-90d8-4eca-821d-797e7bdb60ff",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "2ea8aec2-9b67-40ef-8585-6bbd3f800281"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "2ea8aec2-9b67-40ef-8585-6bbd3f800281",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 
0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "2ea8aec2-9b67-40ef-8585-6bbd3f800281",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "0a632413-39e0-4a71-bcd7-5b0ef74a076c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "86bfda2e-b6f7-4bab-beb3-2eaea3f1bd21",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "c00f0d03-575e-498c-9995-5f5627569b01"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c00f0d03-575e-498c-9995-5f5627569b01",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "c00f0d03-575e-498c-9995-5f5627569b01",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "e196ea9f-6a05-4ad8-b3fe-de8ad218f6f1",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "27b52a6e-0b30-4809-b587-1a4fed8098c7",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "6e28aca0-7f11-4811-b299-0b3569dec791"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": 
"6e28aca0-7f11-4811-b299-0b3569dec791",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:08:53.067 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:53.067 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc0]' 00:08:53.067 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc0 00:08:53.067 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:53.067 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p0]' 00:08:53.067 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p0 00:08:53.067 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p1]' 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p1 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p0]' 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p0 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p1]' 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p1 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p2]' 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p2 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p3]' 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p3 
00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p4]' 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p4 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p5]' 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p5 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p6]' 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p6 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p7]' 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p7 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_TestPT]' 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=TestPT 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_raid0]' 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=raid0 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_concat0]' 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=concat0 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:53.068 17:21:03 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:08:53.068 ************************************ 00:08:53.068 START TEST bdev_fio_trim 00:08:53.068 ************************************ 00:08:53.068 17:21:03 blockdev_general.bdev_fio.bdev_fio_trim 
-- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:53.068 17:21:03 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:53.068 17:21:03 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:08:53.068 17:21:03 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:08:53.068 17:21:03 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:08:53.068 17:21:03 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:53.068 17:21:03 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:08:53.068 17:21:03 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:08:53.068 17:21:03 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:08:53.068 17:21:03 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:53.068 17:21:03 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:08:53.068 17:21:03 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:08:53.068 17:21:03 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:08:53.068 17:21:03 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:08:53.068 17:21:03 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:08:53.068 17:21:03 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:53.068 17:21:03 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:08:53.068 17:21:03 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:08:53.068 17:21:03 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:08:53.068 17:21:03 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:08:53.068 17:21:03 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:08:53.068 17:21:03 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k 
--runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:53.068 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:53.068 job_Malloc1p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:53.068 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:53.068 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:53.068 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:53.068 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:53.068 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:53.068 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:53.068 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:53.068 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:53.068 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:53.068 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:53.068 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:53.068 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:53.068 fio-3.35 00:08:53.068 Starting 14 threads 00:09:05.306 00:09:05.306 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=2725357: Mon Jul 15 17:21:14 2024 00:09:05.306 write: IOPS=159k, BW=622MiB/s (652MB/s)(6220MiB/10001msec); 0 zone resets 00:09:05.306 slat (usec): min=2, max=3568, avg=30.16, stdev=15.72 00:09:05.306 clat (usec): min=22, max=4022, avg=229.79, stdev=99.05 00:09:05.306 lat (usec): min=30, max=4062, avg=259.95, stdev=104.75 00:09:05.306 clat percentiles (usec): 00:09:05.306 | 50.000th=[ 217], 99.000th=[ 545], 99.900th=[ 725], 99.990th=[ 881], 00:09:05.306 | 99.999th=[ 1012] 00:09:05.306 bw ( KiB/s): min=433632, max=745184, per=100.00%, avg=641122.21, stdev=8508.30, samples=266 00:09:05.306 iops : min=108408, max=186296, avg=160280.42, stdev=2127.07, samples=266 00:09:05.306 trim: IOPS=159k, BW=622MiB/s (652MB/s)(6220MiB/10001msec); 0 zone resets 00:09:05.306 slat (usec): min=3, max=398, avg=19.73, stdev= 9.70 00:09:05.306 clat (usec): min=3, max=4062, avg=251.04, stdev=108.10 00:09:05.306 lat (usec): min=10, max=4090, avg=270.77, stdev=112.56 00:09:05.306 clat percentiles (usec): 00:09:05.306 | 50.000th=[ 239], 99.000th=[ 603], 99.900th=[ 799], 99.990th=[ 963], 00:09:05.306 | 99.999th=[ 1106] 00:09:05.306 bw ( KiB/s): min=433632, max=745184, per=100.00%, avg=641122.63, stdev=8508.47, samples=266 00:09:05.306 iops : min=108408, max=186296, avg=160280.53, stdev=2127.12, samples=266 
00:09:05.306 lat (usec) : 4=0.01%, 10=0.07%, 20=0.18%, 50=0.83%, 100=4.12% 00:09:05.306 lat (usec) : 250=54.14%, 500=38.43%, 750=2.09%, 1000=0.13% 00:09:05.306 lat (msec) : 2=0.01%, 4=0.01%, 10=0.01% 00:09:05.306 cpu : usr=99.62%, sys=0.00%, ctx=634, majf=0, minf=1036 00:09:05.306 IO depths : 1=12.3%, 2=24.7%, 4=50.0%, 8=13.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:05.306 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:05.306 complete : 0=0.0%, 4=89.1%, 8=10.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:05.306 issued rwts: total=0,1592396,1592398,0 short=0,0,0,0 dropped=0,0,0,0 00:09:05.306 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:05.306 00:09:05.306 Run status group 0 (all jobs): 00:09:05.306 WRITE: bw=622MiB/s (652MB/s), 622MiB/s-622MiB/s (652MB/s-652MB/s), io=6220MiB (6522MB), run=10001-10001msec 00:09:05.306 TRIM: bw=622MiB/s (652MB/s), 622MiB/s-622MiB/s (652MB/s-652MB/s), io=6220MiB (6522MB), run=10001-10001msec 00:09:05.306 00:09:05.306 real 0m11.370s 00:09:05.306 user 2m29.335s 00:09:05.306 sys 0m0.936s 00:09:05.306 17:21:14 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:05.306 17:21:14 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:09:05.306 ************************************ 00:09:05.306 END TEST bdev_fio_trim 00:09:05.306 ************************************ 00:09:05.306 17:21:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:09:05.306 17:21:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:09:05.306 17:21:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:05.306 17:21:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:09:05.306 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:05.306 17:21:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:09:05.306 00:09:05.306 real 0m23.137s 00:09:05.306 user 5m17.936s 00:09:05.306 sys 0m2.967s 00:09:05.306 17:21:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:05.306 17:21:14 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:05.306 ************************************ 00:09:05.306 END TEST bdev_fio 00:09:05.306 ************************************ 00:09:05.306 17:21:14 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:05.306 17:21:14 blockdev_general -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:09:05.306 17:21:14 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:09:05.306 17:21:14 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:09:05.306 17:21:14 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:05.306 17:21:14 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:05.306 ************************************ 00:09:05.306 START TEST bdev_verify 00:09:05.306 ************************************ 00:09:05.306 17:21:14 blockdev_general.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 
4096 -w verify -t 5 -C -m 0x3 '' 00:09:05.306 [2024-07-15 17:21:14.879094] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:09:05.306 [2024-07-15 17:21:14.879145] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2727654 ] 00:09:05.306 [2024-07-15 17:21:14.968865] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:05.306 [2024-07-15 17:21:15.033016] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:05.306 [2024-07-15 17:21:15.033114] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:05.306 [2024-07-15 17:21:15.149739] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:05.306 [2024-07-15 17:21:15.149780] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:05.306 [2024-07-15 17:21:15.149788] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:05.306 [2024-07-15 17:21:15.157744] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:05.306 [2024-07-15 17:21:15.157762] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:05.306 [2024-07-15 17:21:15.165761] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:05.306 [2024-07-15 17:21:15.165777] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:05.306 [2024-07-15 17:21:15.226633] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:05.306 [2024-07-15 17:21:15.226672] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:05.306 [2024-07-15 17:21:15.226683] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2185990 00:09:05.306 [2024-07-15 17:21:15.226689] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:05.306 [2024-07-15 17:21:15.227965] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:05.306 [2024-07-15 17:21:15.227987] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:05.306 Running I/O for 5 seconds... 
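(bdev_verify drives the same bdev.json layout with the bdevperf example app instead of fio; the notices just above show it re-creating the passthru bdev TestPT on Malloc3 while the JSON config loads, then starting the 5 second run. Spelled out as a standalone command, with paths exactly as used in this workspace and the short flags carrying their usual bdevperf meanings (-q queue depth, -o I/O size in bytes, -w workload, -t run time in seconds, -m core mask):

  # 128 outstanding I/Os per job, 4 KiB I/Os, 'verify' workload (write a
  # pattern, read it back and compare), 5 seconds, cores 0 and 1 (mask 0x3).
  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  "$SPDK/build/examples/bdevperf" --json "$SPDK/test/bdev/bdev.json" \
      -q 128 -o 4096 -w verify -t 5 -C -m 0x3
)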
00:09:09.590 00:09:09.590 Latency(us) 00:09:09.590 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:09.590 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:09.590 Verification LBA range: start 0x0 length 0x1000 00:09:09.590 Malloc0 : 5.06 1467.99 5.73 0.00 0.00 87003.02 401.72 327478.35 00:09:09.590 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:09.590 Verification LBA range: start 0x1000 length 0x1000 00:09:09.590 Malloc0 : 5.16 1240.79 4.85 0.00 0.00 102944.12 510.42 354902.65 00:09:09.590 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:09.590 Verification LBA range: start 0x0 length 0x800 00:09:09.590 Malloc1p0 : 5.06 759.04 2.96 0.00 0.00 167679.06 2243.35 169385.35 00:09:09.590 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:09.590 Verification LBA range: start 0x800 length 0x800 00:09:09.590 Malloc1p0 : 5.16 644.94 2.52 0.00 0.00 197323.91 2697.06 192776.66 00:09:09.590 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:09.590 Verification LBA range: start 0x0 length 0x800 00:09:09.590 Malloc1p1 : 5.18 765.82 2.99 0.00 0.00 165742.79 2066.90 162125.98 00:09:09.590 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:09.590 Verification LBA range: start 0x800 length 0x800 00:09:09.590 Malloc1p1 : 5.16 644.68 2.52 0.00 0.00 196786.42 2495.41 189550.28 00:09:09.590 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:09.590 Verification LBA range: start 0x0 length 0x200 00:09:09.590 Malloc2p0 : 5.18 765.58 2.99 0.00 0.00 165385.34 1940.87 158899.59 00:09:09.590 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:09.590 Verification LBA range: start 0x200 length 0x200 00:09:09.590 Malloc2p0 : 5.16 644.41 2.52 0.00 0.00 196318.66 2394.58 187937.08 00:09:09.590 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:09.590 Verification LBA range: start 0x0 length 0x200 00:09:09.590 Malloc2p1 : 5.18 765.32 2.99 0.00 0.00 165073.96 1890.46 158899.59 00:09:09.590 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:09.590 Verification LBA range: start 0x200 length 0x200 00:09:09.590 Malloc2p1 : 5.17 644.15 2.52 0.00 0.00 195914.76 2356.78 186323.89 00:09:09.590 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:09.590 Verification LBA range: start 0x0 length 0x200 00:09:09.590 Malloc2p2 : 5.19 765.07 2.99 0.00 0.00 164776.49 1877.86 159706.19 00:09:09.590 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:09.590 Verification LBA range: start 0x200 length 0x200 00:09:09.590 Malloc2p2 : 5.17 643.88 2.52 0.00 0.00 195559.28 2369.38 185517.29 00:09:09.590 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:09.590 Verification LBA range: start 0x0 length 0x200 00:09:09.590 Malloc2p3 : 5.19 764.82 2.99 0.00 0.00 164473.60 1928.27 159706.19 00:09:09.590 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:09.590 Verification LBA range: start 0x200 length 0x200 00:09:09.590 Malloc2p3 : 5.17 643.62 2.51 0.00 0.00 195194.19 2482.81 184710.70 00:09:09.590 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:09.590 Verification LBA range: start 0x0 length 0x200 00:09:09.590 Malloc2p4 : 5.19 764.57 2.99 0.00 0.00 164202.29 2003.89 
158899.59 00:09:09.590 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:09.590 Verification LBA range: start 0x200 length 0x200 00:09:09.590 Malloc2p4 : 5.17 643.36 2.51 0.00 0.00 194751.46 2558.42 179064.52 00:09:09.590 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:09.590 Verification LBA range: start 0x0 length 0x200 00:09:09.590 Malloc2p5 : 5.19 764.32 2.99 0.00 0.00 163863.60 2117.32 154866.61 00:09:09.590 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:09.590 Verification LBA range: start 0x200 length 0x200 00:09:09.590 Malloc2p5 : 5.17 643.10 2.51 0.00 0.00 194231.51 2659.25 174224.94 00:09:09.590 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:09.590 Verification LBA range: start 0x0 length 0x200 00:09:09.590 Malloc2p6 : 5.19 764.08 2.98 0.00 0.00 163471.58 2205.54 150027.03 00:09:09.590 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:09.590 Verification LBA range: start 0x200 length 0x200 00:09:09.590 Malloc2p6 : 5.18 642.84 2.51 0.00 0.00 193664.45 2545.82 169385.35 00:09:09.590 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:09.590 Verification LBA range: start 0x0 length 0x200 00:09:09.590 Malloc2p7 : 5.19 763.83 2.98 0.00 0.00 163044.74 2117.32 144380.85 00:09:09.590 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:09.590 Verification LBA range: start 0x200 length 0x200 00:09:09.590 Malloc2p7 : 5.23 660.81 2.58 0.00 0.00 187944.28 2293.76 164545.77 00:09:09.590 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:09.590 Verification LBA range: start 0x0 length 0x1000 00:09:09.590 TestPT : 5.22 760.41 2.97 0.00 0.00 163220.92 9931.22 144380.85 00:09:09.590 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:09.590 Verification LBA range: start 0x1000 length 0x1000 00:09:09.591 TestPT : 5.22 637.85 2.49 0.00 0.00 194210.87 31658.93 165352.37 00:09:09.591 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:09.591 Verification LBA range: start 0x0 length 0x2000 00:09:09.591 raid0 : 5.20 763.32 2.98 0.00 0.00 162351.99 1903.06 137928.07 00:09:09.591 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:09.591 Verification LBA range: start 0x2000 length 0x2000 00:09:09.591 raid0 : 5.23 660.56 2.58 0.00 0.00 187088.78 2533.22 153253.42 00:09:09.591 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:09.591 Verification LBA range: start 0x0 length 0x2000 00:09:09.591 concat0 : 5.20 763.08 2.98 0.00 0.00 162089.11 2029.10 137121.48 00:09:09.591 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:09.591 Verification LBA range: start 0x2000 length 0x2000 00:09:09.591 concat0 : 5.23 660.30 2.58 0.00 0.00 186794.77 2407.19 149220.43 00:09:09.591 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:09.591 Verification LBA range: start 0x0 length 0x1000 00:09:09.591 raid1 : 5.22 783.94 3.06 0.00 0.00 157471.49 2142.52 147607.24 00:09:09.591 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:09.591 Verification LBA range: start 0x1000 length 0x1000 00:09:09.591 raid1 : 5.24 660.04 2.58 0.00 0.00 186480.92 3075.15 158093.00 00:09:09.591 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:09.591 Verification LBA range: start 0x0 
length 0x4e2 00:09:09.591 AIO0 : 5.23 783.74 3.06 0.00 0.00 157238.72 882.22 155673.21 00:09:09.591 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:09.591 Verification LBA range: start 0x4e2 length 0x4e2 00:09:09.591 AIO0 : 5.24 659.85 2.58 0.00 0.00 186054.06 1127.98 168578.76 00:09:09.591 =================================================================================================================== 00:09:09.591 Total : 23940.11 93.52 0.00 0.00 167516.73 401.72 354902.65 00:09:09.852 00:09:09.852 real 0m6.143s 00:09:09.852 user 0m11.606s 00:09:09.852 sys 0m0.256s 00:09:09.852 17:21:20 blockdev_general.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:09.852 17:21:20 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:09:09.852 ************************************ 00:09:09.852 END TEST bdev_verify 00:09:09.852 ************************************ 00:09:09.852 17:21:21 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:09.852 17:21:21 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:09:09.852 17:21:21 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:09:09.852 17:21:21 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:09.852 17:21:21 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:09.852 ************************************ 00:09:09.852 START TEST bdev_verify_big_io 00:09:09.852 ************************************ 00:09:09.852 17:21:21 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:09:09.852 [2024-07-15 17:21:21.101932] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
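(bdev_verify_big_io is the same verify pass with the I/O size raised from 4 KiB to 64 KiB (-o 65536). At that size several of the configured bdevs are too small to hold 128 outstanding requests, so bdevperf clamps the per-bdev queue depth; the warnings a few entries below show the clamps, and they track the bdev sizes from the dump earlier -- each Malloc2pX split is 8192 blocks x 512 B = 4 MiB, room for only 64 distinct 64 KiB I/Os and clamped to a depth of 32, while the 5000 x 2048 B AIO0 is clamped to 78. The exact clamping rule is internal to bdevperf. Equivalent standalone invocation:

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  # Identical to the bdev_verify run above except for the 64 KiB I/O size.
  "$SPDK/build/examples/bdevperf" --json "$SPDK/test/bdev/bdev.json" \
      -q 128 -o 65536 -w verify -t 5 -C -m 0x3
)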
00:09:09.852 [2024-07-15 17:21:21.101983] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2728803 ] 00:09:10.113 [2024-07-15 17:21:21.191008] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:10.113 [2024-07-15 17:21:21.262806] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:10.113 [2024-07-15 17:21:21.262819] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:10.113 [2024-07-15 17:21:21.382599] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:10.113 [2024-07-15 17:21:21.382637] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:10.113 [2024-07-15 17:21:21.382646] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:10.113 [2024-07-15 17:21:21.390611] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:10.113 [2024-07-15 17:21:21.390628] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:10.113 [2024-07-15 17:21:21.398622] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:10.113 [2024-07-15 17:21:21.398637] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:10.373 [2024-07-15 17:21:21.459510] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:10.373 [2024-07-15 17:21:21.459546] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:10.373 [2024-07-15 17:21:21.459556] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c36990 00:09:10.373 [2024-07-15 17:21:21.459563] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:10.373 [2024-07-15 17:21:21.460827] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:10.373 [2024-07-15 17:21:21.460845] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:10.373 [2024-07-15 17:21:21.597868] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:09:10.373 [2024-07-15 17:21:21.598577] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:09:10.373 [2024-07-15 17:21:21.599794] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:09:10.373 [2024-07-15 17:21:21.600668] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). 
Queue depth is limited to 32 00:09:10.374 [2024-07-15 17:21:21.601892] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:09:10.374 [2024-07-15 17:21:21.602754] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:09:10.374 [2024-07-15 17:21:21.604005] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:09:10.374 [2024-07-15 17:21:21.605247] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:09:10.374 [2024-07-15 17:21:21.606126] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:09:10.374 [2024-07-15 17:21:21.607376] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:09:10.374 [2024-07-15 17:21:21.608251] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:09:10.374 [2024-07-15 17:21:21.609467] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:09:10.374 [2024-07-15 17:21:21.610196] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:09:10.374 [2024-07-15 17:21:21.611201] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:09:10.374 [2024-07-15 17:21:21.611882] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:09:10.374 [2024-07-15 17:21:21.612900] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). 
Queue depth is limited to 32 00:09:10.374 [2024-07-15 17:21:21.628875] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:09:10.374 [2024-07-15 17:21:21.630188] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:09:10.374 Running I/O for 5 seconds... 00:09:18.514 00:09:18.514 Latency(us) 00:09:18.514 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:18.514 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x0 length 0x100 00:09:18.514 Malloc0 : 5.75 133.63 8.35 0.00 0.00 937659.45 743.58 2490771.30 00:09:18.514 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x100 length 0x100 00:09:18.514 Malloc0 : 5.90 130.09 8.13 0.00 0.00 961903.09 901.12 2632732.36 00:09:18.514 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x0 length 0x80 00:09:18.514 Malloc1p0 : 6.28 77.67 4.85 0.00 0.00 1468114.16 1890.46 2903748.92 00:09:18.514 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x80 length 0x80 00:09:18.514 Malloc1p0 : 7.07 31.68 1.98 0.00 0.00 3542424.88 1449.35 5807497.85 00:09:18.514 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x0 length 0x80 00:09:18.514 Malloc1p1 : 6.53 36.73 2.30 0.00 0.00 3096702.36 1216.20 5420331.32 00:09:18.514 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x80 length 0x80 00:09:18.514 Malloc1p1 : 7.07 31.67 1.98 0.00 0.00 3407800.59 1518.67 5549386.83 00:09:18.514 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x0 length 0x20 00:09:18.514 Malloc2p0 : 6.29 22.91 1.43 0.00 0.00 1219011.23 557.69 2129415.88 00:09:18.514 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x20 length 0x20 00:09:18.514 Malloc2p0 : 6.39 22.52 1.41 0.00 0.00 1239292.51 579.74 2193943.63 00:09:18.514 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x0 length 0x20 00:09:18.514 Malloc2p1 : 6.29 22.91 1.43 0.00 0.00 1207632.44 485.22 2090699.22 00:09:18.514 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x20 length 0x20 00:09:18.514 Malloc2p1 : 6.40 22.51 1.41 0.00 0.00 1227135.97 586.04 2168132.53 00:09:18.514 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x0 length 0x20 00:09:18.514 Malloc2p2 : 6.29 22.90 1.43 0.00 0.00 1196195.68 485.22 2064888.12 00:09:18.514 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x20 length 0x20 00:09:18.514 Malloc2p2 : 6.40 22.50 1.41 0.00 0.00 1214535.06 595.50 2129415.88 00:09:18.514 Job: Malloc2p3 (Core Mask 0x1, workload: 
verify, depth: 32, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x0 length 0x20 00:09:18.514 Malloc2p3 : 6.29 22.90 1.43 0.00 0.00 1184309.99 494.67 2039077.02 00:09:18.514 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x20 length 0x20 00:09:18.514 Malloc2p3 : 6.40 22.49 1.41 0.00 0.00 1202209.31 579.74 2103604.78 00:09:18.514 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x0 length 0x20 00:09:18.514 Malloc2p4 : 6.39 25.03 1.56 0.00 0.00 1085455.90 494.67 2013265.92 00:09:18.514 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x20 length 0x20 00:09:18.514 Malloc2p4 : 6.40 22.49 1.41 0.00 0.00 1188413.23 579.74 2077793.67 00:09:18.514 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x0 length 0x20 00:09:18.514 Malloc2p5 : 6.39 25.03 1.56 0.00 0.00 1075138.39 494.67 1987454.82 00:09:18.514 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x20 length 0x20 00:09:18.514 Malloc2p5 : 6.40 22.48 1.41 0.00 0.00 1175450.90 582.89 2039077.02 00:09:18.514 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x0 length 0x20 00:09:18.514 Malloc2p6 : 6.39 25.02 1.56 0.00 0.00 1064502.89 482.07 1948738.17 00:09:18.514 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x20 length 0x20 00:09:18.514 Malloc2p6 : 6.41 22.48 1.40 0.00 0.00 1162563.93 586.04 2013265.92 00:09:18.514 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x0 length 0x20 00:09:18.514 Malloc2p7 : 6.40 25.01 1.56 0.00 0.00 1053857.25 497.82 1922927.06 00:09:18.514 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x20 length 0x20 00:09:18.514 Malloc2p7 : 6.52 24.52 1.53 0.00 0.00 1060977.65 570.29 1974549.27 00:09:18.514 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x0 length 0x100 00:09:18.514 TestPT : 6.91 35.02 2.19 0.00 0.00 2846999.66 83886.08 3897476.33 00:09:18.514 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x100 length 0x100 00:09:18.514 TestPT : 6.77 35.44 2.22 0.00 0.00 2818088.33 117763.15 3897476.33 00:09:18.514 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x0 length 0x200 00:09:18.514 raid0 : 6.79 42.40 2.65 0.00 0.00 2307828.26 1285.51 4723431.58 00:09:18.514 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x200 length 0x200 00:09:18.514 raid0 : 7.08 40.70 2.54 0.00 0.00 2346757.66 1550.18 4671809.38 00:09:18.514 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x0 length 0x200 00:09:18.514 concat0 : 6.96 45.98 2.87 0.00 0.00 2061786.42 1298.12 4516942.77 00:09:18.514 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x200 length 0x200 00:09:18.514 concat0 : 7.08 47.47 2.97 0.00 
0.00 1995975.11 1562.78 4465320.57 00:09:18.514 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x0 length 0x100 00:09:18.514 raid1 : 6.96 59.76 3.73 0.00 0.00 1559052.44 1663.61 4310453.96 00:09:18.514 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x100 length 0x100 00:09:18.514 raid1 : 7.10 54.12 3.38 0.00 0.00 1691448.17 2029.10 4258831.75 00:09:18.514 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x0 length 0x4e 00:09:18.514 AIO0 : 7.07 73.54 4.60 0.00 0.00 755130.03 598.65 3006993.33 00:09:18.514 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536) 00:09:18.514 Verification LBA range: start 0x4e length 0x4e 00:09:18.514 AIO0 : 7.14 75.30 4.71 0.00 0.00 716677.16 715.22 3006993.33 00:09:18.514 =================================================================================================================== 00:09:18.514 Total : 1324.90 82.81 0.00 0.00 1539633.24 482.07 5807497.85 00:09:18.514 00:09:18.514 real 0m8.037s 00:09:18.514 user 0m15.397s 00:09:18.514 sys 0m0.273s 00:09:18.514 17:21:29 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:18.514 17:21:29 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:09:18.514 ************************************ 00:09:18.514 END TEST bdev_verify_big_io 00:09:18.514 ************************************ 00:09:18.514 17:21:29 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:18.514 17:21:29 blockdev_general -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:18.515 17:21:29 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:18.515 17:21:29 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:18.515 17:21:29 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:18.515 ************************************ 00:09:18.515 START TEST bdev_write_zeroes 00:09:18.515 ************************************ 00:09:18.515 17:21:29 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:18.515 [2024-07-15 17:21:29.215690] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
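The bdev_write_zeroes stage that starts here reuses the bdevperf invocation pattern seen throughout this run. As a minimal sketch (paths taken from this workspace; the flag readings follow from the result lines later in this log, where -q shows up as the queue depth, -o as the I/O size and -t as the run time), the same workload could be launched by hand:
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
$SPDK/build/examples/bdevperf \
    --json $SPDK/test/bdev/bdev.json \
    -q 128 -o 4096 -w write_zeroes -t 1
# Expect output of the same shape as below: one Job line per bdev at depth 128,
# 4096-byte I/Os, and a 1-second run.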
00:09:18.515 [2024-07-15 17:21:29.215769] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2730132 ] 00:09:18.515 [2024-07-15 17:21:29.304016] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:18.515 [2024-07-15 17:21:29.380375] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:18.515 [2024-07-15 17:21:29.499864] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:18.515 [2024-07-15 17:21:29.499902] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:18.515 [2024-07-15 17:21:29.499910] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:18.515 [2024-07-15 17:21:29.507870] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:18.515 [2024-07-15 17:21:29.507888] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:18.515 [2024-07-15 17:21:29.515895] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:18.515 [2024-07-15 17:21:29.515911] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:18.515 [2024-07-15 17:21:29.576662] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:18.515 [2024-07-15 17:21:29.576700] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:18.515 [2024-07-15 17:21:29.576715] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb7b0f0 00:09:18.515 [2024-07-15 17:21:29.576721] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:18.515 [2024-07-15 17:21:29.577856] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:18.515 [2024-07-15 17:21:29.577875] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:18.515 Running I/O for 1 seconds... 
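In the per-bdev table that follows, the MiB/s column is simply IOPS multiplied by the 4096-byte I/O size; a quick check against the Malloc0 row, using the numbers reported below:
# 5966.63 IOPS * 4096 B / 1048576 = 23.3071... MiB/s, i.e. the 23.31 shown for Malloc0 below.
echo "scale=4; 5966.63 * 4096 / 1048576" | bc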
00:09:19.897 00:09:19.897 Latency(us) 00:09:19.897 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:19.897 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:19.897 Malloc0 : 1.03 5966.63 23.31 0.00 0.00 21446.04 516.73 36095.21 00:09:19.897 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:19.897 Malloc1p0 : 1.03 5959.25 23.28 0.00 0.00 21439.18 787.69 35288.62 00:09:19.897 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:19.897 Malloc1p1 : 1.03 5951.91 23.25 0.00 0.00 21421.21 775.09 34482.02 00:09:19.897 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:19.897 Malloc2p0 : 1.03 5944.55 23.22 0.00 0.00 21409.52 765.64 33675.42 00:09:19.897 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:19.897 Malloc2p1 : 1.03 5937.28 23.19 0.00 0.00 21399.46 759.34 33070.47 00:09:19.897 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:19.898 Malloc2p2 : 1.04 5930.00 23.16 0.00 0.00 21383.84 765.64 32263.88 00:09:19.898 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:19.898 Malloc2p3 : 1.04 5922.72 23.14 0.00 0.00 21369.23 756.18 31457.28 00:09:19.898 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:19.898 Malloc2p4 : 1.04 5915.47 23.11 0.00 0.00 21358.84 762.49 30852.33 00:09:19.898 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:19.898 Malloc2p5 : 1.04 5908.27 23.08 0.00 0.00 21343.02 737.28 30045.74 00:09:19.898 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:19.898 Malloc2p6 : 1.04 5901.03 23.05 0.00 0.00 21328.73 746.73 29239.14 00:09:19.898 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:19.898 Malloc2p7 : 1.04 5893.83 23.02 0.00 0.00 21317.51 746.73 28634.19 00:09:19.898 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:19.898 TestPT : 1.05 5956.18 23.27 0.00 0.00 21057.96 790.84 27827.59 00:09:19.898 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:19.898 raid0 : 1.05 5947.90 23.23 0.00 0.00 21037.47 1443.05 26416.05 00:09:19.898 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:19.898 concat0 : 1.06 5939.82 23.20 0.00 0.00 20997.66 1417.85 24903.68 00:09:19.898 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:19.898 raid1 : 1.06 5929.64 23.16 0.00 0.00 20952.32 2230.74 23088.84 00:09:19.898 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:19.898 AIO0 : 1.06 5923.47 23.14 0.00 0.00 20886.78 781.39 23088.84 00:09:19.898 =================================================================================================================== 00:09:19.898 Total : 94927.94 370.81 0.00 0.00 21257.53 516.73 36095.21 00:09:19.898 00:09:19.898 real 0m1.861s 00:09:19.898 user 0m1.590s 00:09:19.898 sys 0m0.223s 00:09:19.898 17:21:31 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:19.898 17:21:31 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:09:19.898 ************************************ 00:09:19.898 END TEST bdev_write_zeroes 00:09:19.898 ************************************ 00:09:19.898 17:21:31 blockdev_general 
-- common/autotest_common.sh@1142 -- # return 0 00:09:19.898 17:21:31 blockdev_general -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:19.898 17:21:31 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:19.898 17:21:31 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:19.898 17:21:31 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:19.898 ************************************ 00:09:19.898 START TEST bdev_json_nonenclosed 00:09:19.898 ************************************ 00:09:19.898 17:21:31 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:19.898 [2024-07-15 17:21:31.165465] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:09:19.898 [2024-07-15 17:21:31.165518] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2730459 ] 00:09:20.158 [2024-07-15 17:21:31.254590] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:20.158 [2024-07-15 17:21:31.331435] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:20.158 [2024-07-15 17:21:31.331492] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
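The nonenclosed case deliberately hands bdevperf a config whose top level is not wrapped in {}; the harness counts the resulting non-zero exit (es=234 below) as a pass. The real contents of nonenclosed.json are not reproduced in this log, so the following is only an illustrative stand-in:
# Illustrative stand-in for nonenclosed.json (actual file contents not shown in this log).
cat > /tmp/nonenclosed.json <<'EOF'
"subsystems": []
EOF
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
$SPDK/build/examples/bdevperf --json /tmp/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 \
    || echo "expected failure: configuration not enclosed in {}"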
00:09:20.158 [2024-07-15 17:21:31.331504] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:09:20.158 [2024-07-15 17:21:31.331510] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:20.158 00:09:20.158 real 0m0.287s 00:09:20.158 user 0m0.182s 00:09:20.158 sys 0m0.103s 00:09:20.158 17:21:31 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:09:20.158 17:21:31 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:20.158 17:21:31 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:09:20.158 ************************************ 00:09:20.158 END TEST bdev_json_nonenclosed 00:09:20.158 ************************************ 00:09:20.158 17:21:31 blockdev_general -- common/autotest_common.sh@1142 -- # return 234 00:09:20.158 17:21:31 blockdev_general -- bdev/blockdev.sh@782 -- # true 00:09:20.158 17:21:31 blockdev_general -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:20.158 17:21:31 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:20.158 17:21:31 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:20.158 17:21:31 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:20.419 ************************************ 00:09:20.419 START TEST bdev_json_nonarray 00:09:20.419 ************************************ 00:09:20.419 17:21:31 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:20.419 [2024-07-15 17:21:31.518775] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:09:20.419 [2024-07-15 17:21:31.518826] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2730502 ] 00:09:20.419 [2024-07-15 17:21:31.609106] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:20.419 [2024-07-15 17:21:31.683871] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:20.419 [2024-07-15 17:21:31.683933] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
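For contrast with the two failure cases, the error messages imply the shape json_config does accept: a top-level object enclosed in {} whose 'subsystems' member is an array. A minimal sketch of that shape (the subsystem entries themselves are illustrative, not copied from the bdev.json used elsewhere in this run):
# Smallest configuration shape consistent with the two errors above
# (an enclosing {} and a "subsystems" array); entries are illustrative.
cat > /tmp/minimal.json <<'EOF'
{
  "subsystems": [
    {
      "subsystem": "bdev",
      "config": []
    }
  ]
}
EOF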
00:09:20.419 [2024-07-15 17:21:31.683945] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:09:20.419 [2024-07-15 17:21:31.683952] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:20.679 00:09:20.679 real 0m0.275s 00:09:20.679 user 0m0.173s 00:09:20.679 sys 0m0.099s 00:09:20.679 17:21:31 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:09:20.679 17:21:31 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:20.679 17:21:31 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:09:20.679 ************************************ 00:09:20.679 END TEST bdev_json_nonarray 00:09:20.679 ************************************ 00:09:20.679 17:21:31 blockdev_general -- common/autotest_common.sh@1142 -- # return 234 00:09:20.679 17:21:31 blockdev_general -- bdev/blockdev.sh@785 -- # true 00:09:20.679 17:21:31 blockdev_general -- bdev/blockdev.sh@787 -- # [[ bdev == bdev ]] 00:09:20.679 17:21:31 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qos qos_test_suite '' 00:09:20.679 17:21:31 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:20.679 17:21:31 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:20.679 17:21:31 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:20.679 ************************************ 00:09:20.679 START TEST bdev_qos 00:09:20.679 ************************************ 00:09:20.679 17:21:31 blockdev_general.bdev_qos -- common/autotest_common.sh@1123 -- # qos_test_suite '' 00:09:20.679 17:21:31 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # QOS_PID=2730654 00:09:20.679 17:21:31 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # echo 'Process qos testing pid: 2730654' 00:09:20.679 Process qos testing pid: 2730654 00:09:20.679 17:21:31 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT 00:09:20.679 17:21:31 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:09:20.679 17:21:31 blockdev_general.bdev_qos -- bdev/blockdev.sh@449 -- # waitforlisten 2730654 00:09:20.679 17:21:31 blockdev_general.bdev_qos -- common/autotest_common.sh@829 -- # '[' -z 2730654 ']' 00:09:20.679 17:21:31 blockdev_general.bdev_qos -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:20.679 17:21:31 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:20.679 17:21:31 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:20.679 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:20.679 17:21:31 blockdev_general.bdev_qos -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:20.679 17:21:31 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:20.679 [2024-07-15 17:21:31.875276] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
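The qos suite that starts here drives bdevperf in RPC mode: the -z flag makes it wait for commands, after which the harness creates the two targets and starts traffic with bdevperf.py, as the following lines of this log show. A condensed sketch of that sequence (rpc_cmd in the log is assumed to be a thin wrapper around scripts/rpc.py):
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
$SPDK/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' &
# wait for the RPC socket (waitforlisten in the log), then create the two targets:
$SPDK/scripts/rpc.py bdev_malloc_create -b Malloc_0 128 512   # 128 MB / 512 B blocks = 262144 blocks, matching bdev_get_bdevs below
$SPDK/scripts/rpc.py bdev_null_create Null_1 128 512
$SPDK/examples/bdev/bdevperf/bdevperf.py perform_tests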
00:09:20.679 [2024-07-15 17:21:31.875326] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2730654 ] 00:09:20.679 [2024-07-15 17:21:31.955137] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:20.941 [2024-07-15 17:21:32.054337] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:21.202 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:21.202 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@862 -- # return 0 00:09:21.202 17:21:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:09:21.202 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:21.202 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:21.202 Malloc_0 00:09:21.202 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:21.202 17:21:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # waitforbdev Malloc_0 00:09:21.202 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_0 00:09:21.202 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:21.202 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:09:21.202 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:21.202 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:21.202 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:21.202 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:21.202 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:21.202 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:21.202 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:09:21.202 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:21.202 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:21.202 [ 00:09:21.202 { 00:09:21.202 "name": "Malloc_0", 00:09:21.202 "aliases": [ 00:09:21.202 "32c3262b-f9ff-42e4-8867-ca040accdf2d" 00:09:21.202 ], 00:09:21.202 "product_name": "Malloc disk", 00:09:21.202 "block_size": 512, 00:09:21.202 "num_blocks": 262144, 00:09:21.202 "uuid": "32c3262b-f9ff-42e4-8867-ca040accdf2d", 00:09:21.202 "assigned_rate_limits": { 00:09:21.202 "rw_ios_per_sec": 0, 00:09:21.202 "rw_mbytes_per_sec": 0, 00:09:21.202 "r_mbytes_per_sec": 0, 00:09:21.203 "w_mbytes_per_sec": 0 00:09:21.203 }, 00:09:21.203 "claimed": false, 00:09:21.203 "zoned": false, 00:09:21.203 "supported_io_types": { 00:09:21.203 "read": true, 00:09:21.203 "write": true, 00:09:21.203 "unmap": true, 00:09:21.203 "flush": true, 00:09:21.203 "reset": true, 00:09:21.203 "nvme_admin": false, 00:09:21.203 "nvme_io": false, 00:09:21.203 "nvme_io_md": false, 00:09:21.203 "write_zeroes": true, 00:09:21.203 "zcopy": true, 00:09:21.203 "get_zone_info": false, 00:09:21.203 "zone_management": false, 00:09:21.203 "zone_append": false, 00:09:21.203 "compare": false, 
00:09:21.203 "compare_and_write": false, 00:09:21.203 "abort": true, 00:09:21.203 "seek_hole": false, 00:09:21.203 "seek_data": false, 00:09:21.203 "copy": true, 00:09:21.203 "nvme_iov_md": false 00:09:21.203 }, 00:09:21.203 "memory_domains": [ 00:09:21.203 { 00:09:21.203 "dma_device_id": "system", 00:09:21.203 "dma_device_type": 1 00:09:21.203 }, 00:09:21.203 { 00:09:21.203 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:21.203 "dma_device_type": 2 00:09:21.203 } 00:09:21.203 ], 00:09:21.203 "driver_specific": {} 00:09:21.203 } 00:09:21.203 ] 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # rpc_cmd bdev_null_create Null_1 128 512 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:21.203 Null_1 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@454 -- # waitforbdev Null_1 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Null_1 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:21.203 [ 00:09:21.203 { 00:09:21.203 "name": "Null_1", 00:09:21.203 "aliases": [ 00:09:21.203 "0c624afc-7ff8-4486-adbc-2f095dc7594f" 00:09:21.203 ], 00:09:21.203 "product_name": "Null disk", 00:09:21.203 "block_size": 512, 00:09:21.203 "num_blocks": 262144, 00:09:21.203 "uuid": "0c624afc-7ff8-4486-adbc-2f095dc7594f", 00:09:21.203 "assigned_rate_limits": { 00:09:21.203 "rw_ios_per_sec": 0, 00:09:21.203 "rw_mbytes_per_sec": 0, 00:09:21.203 "r_mbytes_per_sec": 0, 00:09:21.203 "w_mbytes_per_sec": 0 00:09:21.203 }, 00:09:21.203 "claimed": false, 00:09:21.203 "zoned": false, 00:09:21.203 "supported_io_types": { 00:09:21.203 "read": true, 00:09:21.203 "write": true, 00:09:21.203 "unmap": false, 00:09:21.203 "flush": false, 00:09:21.203 "reset": true, 00:09:21.203 "nvme_admin": false, 00:09:21.203 "nvme_io": false, 00:09:21.203 "nvme_io_md": false, 00:09:21.203 "write_zeroes": true, 00:09:21.203 "zcopy": false, 00:09:21.203 "get_zone_info": false, 00:09:21.203 "zone_management": false, 00:09:21.203 "zone_append": false, 00:09:21.203 
"compare": false, 00:09:21.203 "compare_and_write": false, 00:09:21.203 "abort": true, 00:09:21.203 "seek_hole": false, 00:09:21.203 "seek_data": false, 00:09:21.203 "copy": false, 00:09:21.203 "nvme_iov_md": false 00:09:21.203 }, 00:09:21.203 "driver_specific": {} 00:09:21.203 } 00:09:21.203 ] 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@457 -- # qos_function_test 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_iops_limit=1000 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local qos_lower_bw_limit=2 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local io_result=0 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local iops_limit=0 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@414 -- # local bw_limit=0 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # get_io_result IOPS Malloc_0 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:21.203 17:21:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:21.463 17:21:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:21.463 17:21:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:09:21.463 Running I/O for 60 seconds... 
00:09:26.746 17:21:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 58668.07 234672.30 0.00 0.00 235520.00 0.00 0.00 ' 00:09:26.746 17:21:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:09:26.746 17:21:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:09:26.746 17:21:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # iostat_result=58668.07 00:09:26.746 17:21:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 58668 00:09:26.746 17:21:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # io_result=58668 00:09:26.746 17:21:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # iops_limit=14000 00:09:26.746 17:21:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@419 -- # '[' 14000 -gt 1000 ']' 00:09:26.746 17:21:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 14000 Malloc_0 00:09:26.746 17:21:37 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:26.746 17:21:37 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:26.746 17:21:37 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:26.746 17:21:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@423 -- # run_test bdev_qos_iops run_qos_test 14000 IOPS Malloc_0 00:09:26.746 17:21:37 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:26.746 17:21:37 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:26.746 17:21:37 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:26.746 ************************************ 00:09:26.746 START TEST bdev_qos_iops 00:09:26.746 ************************************ 00:09:26.746 17:21:37 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1123 -- # run_qos_test 14000 IOPS Malloc_0 00:09:26.746 17:21:37 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_limit=14000 00:09:26.746 17:21:37 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@390 -- # local qos_result=0 00:09:26.746 17:21:37 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # get_io_result IOPS Malloc_0 00:09:26.746 17:21:37 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:09:26.746 17:21:37 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:26.746 17:21:37 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:26.746 17:21:37 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:26.746 17:21:37 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:26.746 17:21:37 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # tail -1 00:09:32.041 17:21:42 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 14001.40 56005.58 0.00 0.00 57400.00 0.00 0.00 ' 00:09:32.041 17:21:42 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:09:32.041 17:21:42 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:09:32.041 17:21:42 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # iostat_result=14001.40 00:09:32.041 17:21:42 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@385 -- # echo 14001 00:09:32.041 17:21:42 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # qos_result=14001 00:09:32.041 17:21:42 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@393 -- # '[' IOPS = BANDWIDTH ']' 00:09:32.041 17:21:42 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # lower_limit=12600 00:09:32.041 17:21:42 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@397 -- # upper_limit=15400 00:09:32.041 17:21:42 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 14001 -lt 12600 ']' 00:09:32.041 17:21:42 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 14001 -gt 15400 ']' 00:09:32.041 00:09:32.041 real 0m5.251s 00:09:32.041 user 0m0.110s 00:09:32.041 sys 0m0.043s 00:09:32.041 17:21:42 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:32.041 17:21:42 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:09:32.041 ************************************ 00:09:32.041 END TEST bdev_qos_iops 00:09:32.041 ************************************ 00:09:32.041 17:21:43 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:09:32.041 17:21:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # get_io_result BANDWIDTH Null_1 00:09:32.041 17:21:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:09:32.041 17:21:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:09:32.041 17:21:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:32.041 17:21:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:32.041 17:21:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Null_1 00:09:32.041 17:21:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:09:37.325 17:21:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 20197.14 80788.57 0.00 0.00 81920.00 0.00 0.00 ' 00:09:37.325 17:21:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:09:37.325 17:21:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:37.325 17:21:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:09:37.325 17:21:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # iostat_result=81920.00 00:09:37.325 17:21:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 81920 00:09:37.325 17:21:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=81920 00:09:37.325 17:21:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # bw_limit=8 00:09:37.325 17:21:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@429 -- # '[' 8 -lt 2 ']' 00:09:37.325 17:21:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 8 Null_1 00:09:37.325 17:21:48 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:37.325 17:21:48 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:37.325 17:21:48 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:37.325 17:21:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@433 -- # run_test bdev_qos_bw run_qos_test 8 BANDWIDTH Null_1 00:09:37.325 17:21:48 blockdev_general.bdev_qos -- 
common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:37.325 17:21:48 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:37.325 17:21:48 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:37.325 ************************************ 00:09:37.325 START TEST bdev_qos_bw 00:09:37.325 ************************************ 00:09:37.325 17:21:48 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1123 -- # run_qos_test 8 BANDWIDTH Null_1 00:09:37.325 17:21:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_limit=8 00:09:37.325 17:21:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:09:37.325 17:21:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Null_1 00:09:37.325 17:21:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:09:37.325 17:21:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:09:37.325 17:21:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:37.325 17:21:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:37.325 17:21:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # grep Null_1 00:09:37.325 17:21:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # tail -1 00:09:42.671 17:21:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 2046.38 8185.52 0.00 0.00 8452.00 0.00 0.00 ' 00:09:42.671 17:21:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:09:42.671 17:21:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:42.671 17:21:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:09:42.671 17:21:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # iostat_result=8452.00 00:09:42.671 17:21:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@385 -- # echo 8452 00:09:42.671 17:21:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # qos_result=8452 00:09:42.671 17:21:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:42.671 17:21:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@394 -- # qos_limit=8192 00:09:42.671 17:21:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # lower_limit=7372 00:09:42.671 17:21:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@397 -- # upper_limit=9011 00:09:42.671 17:21:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 8452 -lt 7372 ']' 00:09:42.671 17:21:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 8452 -gt 9011 ']' 00:09:42.671 00:09:42.671 real 0m5.299s 00:09:42.671 user 0m0.108s 00:09:42.671 sys 0m0.048s 00:09:42.671 17:21:53 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:42.671 17:21:53 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:09:42.671 ************************************ 00:09:42.671 END TEST bdev_qos_bw 00:09:42.671 ************************************ 00:09:42.671 17:21:53 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # 
return 0 00:09:42.671 17:21:53 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:09:42.671 17:21:53 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:42.671 17:21:53 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:42.671 17:21:53 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:42.671 17:21:53 blockdev_general.bdev_qos -- bdev/blockdev.sh@437 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:09:42.671 17:21:53 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:42.671 17:21:53 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:42.671 17:21:53 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:42.671 ************************************ 00:09:42.671 START TEST bdev_qos_ro_bw 00:09:42.671 ************************************ 00:09:42.671 17:21:53 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1123 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:09:42.671 17:21:53 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_limit=2 00:09:42.671 17:21:53 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:09:42.671 17:21:53 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Malloc_0 00:09:42.671 17:21:53 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:09:42.671 17:21:53 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:42.671 17:21:53 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:42.671 17:21:53 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:42.671 17:21:53 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:42.671 17:21:53 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # tail -1 00:09:47.958 17:21:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 512.53 2050.13 0.00 0.00 2064.00 0.00 0.00 ' 00:09:47.958 17:21:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:09:47.958 17:21:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:47.958 17:21:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:09:47.958 17:21:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # iostat_result=2064.00 00:09:47.958 17:21:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@385 -- # echo 2064 00:09:47.958 17:21:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # qos_result=2064 00:09:47.958 17:21:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:47.958 17:21:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@394 -- # qos_limit=2048 00:09:47.958 17:21:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # lower_limit=1843 00:09:47.958 17:21:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@397 -- # upper_limit=2252 00:09:47.958 17:21:58 blockdev_general.bdev_qos.bdev_qos_ro_bw 
-- bdev/blockdev.sh@400 -- # '[' 2064 -lt 1843 ']' 00:09:47.958 17:21:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2064 -gt 2252 ']' 00:09:47.958 00:09:47.958 real 0m5.187s 00:09:47.958 user 0m0.115s 00:09:47.958 sys 0m0.039s 00:09:47.958 17:21:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:47.958 17:21:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:09:47.958 ************************************ 00:09:47.958 END TEST bdev_qos_ro_bw 00:09:47.958 ************************************ 00:09:47.958 17:21:59 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:09:47.958 17:21:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:09:47.958 17:21:59 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:47.958 17:21:59 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:48.530 17:21:59 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:48.530 17:21:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # rpc_cmd bdev_null_delete Null_1 00:09:48.530 17:21:59 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:48.530 17:21:59 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:48.530 00:09:48.530 Latency(us) 00:09:48.530 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:48.530 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:09:48.530 Malloc_0 : 26.85 19498.29 76.17 0.00 0.00 13005.74 2268.55 503316.48 00:09:48.530 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:09:48.530 Null_1 : 27.00 19796.59 77.33 0.00 0.00 12898.77 920.02 154866.61 00:09:48.530 =================================================================================================================== 00:09:48.530 Total : 39294.88 153.50 0.00 0.00 12951.71 920.02 503316.48 00:09:48.530 0 00:09:48.530 17:21:59 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:48.530 17:21:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # killprocess 2730654 00:09:48.530 17:21:59 blockdev_general.bdev_qos -- common/autotest_common.sh@948 -- # '[' -z 2730654 ']' 00:09:48.530 17:21:59 blockdev_general.bdev_qos -- common/autotest_common.sh@952 -- # kill -0 2730654 00:09:48.530 17:21:59 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # uname 00:09:48.530 17:21:59 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:48.530 17:21:59 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2730654 00:09:48.530 17:21:59 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:09:48.530 17:21:59 blockdev_general.bdev_qos -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:09:48.530 17:21:59 blockdev_general.bdev_qos -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2730654' 00:09:48.530 killing process with pid 2730654 00:09:48.530 17:21:59 blockdev_general.bdev_qos -- common/autotest_common.sh@967 -- # kill 2730654 00:09:48.530 Received shutdown signal, test time was about 27.066474 seconds 00:09:48.530 00:09:48.530 Latency(us) 00:09:48.530 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:48.530 
=================================================================================================================== 00:09:48.530 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:09:48.530 17:21:59 blockdev_general.bdev_qos -- common/autotest_common.sh@972 -- # wait 2730654 00:09:48.792 17:21:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@462 -- # trap - SIGINT SIGTERM EXIT 00:09:48.792 00:09:48.792 real 0m28.073s 00:09:48.792 user 0m28.957s 00:09:48.792 sys 0m0.755s 00:09:48.792 17:21:59 blockdev_general.bdev_qos -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:48.792 17:21:59 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:48.792 ************************************ 00:09:48.792 END TEST bdev_qos 00:09:48.792 ************************************ 00:09:48.792 17:21:59 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:48.792 17:21:59 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:09:48.792 17:21:59 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:48.792 17:21:59 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:48.792 17:21:59 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:48.792 ************************************ 00:09:48.792 START TEST bdev_qd_sampling 00:09:48.792 ************************************ 00:09:48.792 17:21:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1123 -- # qd_sampling_test_suite '' 00:09:48.792 17:21:59 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@538 -- # QD_DEV=Malloc_QD 00:09:48.792 17:21:59 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # QD_PID=2735325 00:09:48.792 17:21:59 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # echo 'Process bdev QD sampling period testing pid: 2735325' 00:09:48.792 Process bdev QD sampling period testing pid: 2735325 00:09:48.792 17:21:59 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:09:48.792 17:21:59 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@544 -- # waitforlisten 2735325 00:09:48.792 17:21:59 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:09:48.792 17:21:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@829 -- # '[' -z 2735325 ']' 00:09:48.792 17:21:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:48.792 17:21:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:48.792 17:21:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:48.792 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:48.792 17:21:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:48.792 17:21:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:48.792 [2024-07-15 17:22:00.076215] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
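Looking back over the bandwidth stages of the qos suite that just finished: the same pattern was applied to throughput, with an 8 MB/s read/write cap on Null_1 (derived from roughly 80 MB/s of unthrottled traffic) and a 2 MB/s read-only cap on Malloc_0 verified against the 1843-2252 window (+/-10% of the limit expressed in kB, 2048). A condensed sketch of those two settings and the check:
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
$SPDK/scripts/rpc.py bdev_set_qos_limit --rw_mbytes_per_sec 8 Null_1
$SPDK/scripts/rpc.py bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0
# throughput is then read back from column 6 of iostat.py output, as in the log,
# and compared against the limit in kB (e.g. 2048 for 2 MB):
$SPDK/scripts/iostat.py -d -i 1 -t 5 | grep Malloc_0 | tail -1 | awk '{print $6}'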
00:09:48.792 [2024-07-15 17:22:00.076353] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2735325 ] 00:09:49.053 [2024-07-15 17:22:00.220299] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:49.053 [2024-07-15 17:22:00.315958] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:49.053 [2024-07-15 17:22:00.316039] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:49.996 17:22:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:49.996 17:22:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@862 -- # return 0 00:09:49.996 17:22:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:09:49.996 17:22:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:49.996 17:22:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:49.996 Malloc_QD 00:09:49.996 17:22:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:49.996 17:22:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@547 -- # waitforbdev Malloc_QD 00:09:49.996 17:22:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_QD 00:09:49.996 17:22:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:49.996 17:22:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local i 00:09:49.996 17:22:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:49.996 17:22:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:49.996 17:22:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:49.996 17:22:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:49.996 17:22:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:49.996 17:22:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:49.996 17:22:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:09:49.996 17:22:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:49.996 17:22:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:50.257 [ 00:09:50.257 { 00:09:50.257 "name": "Malloc_QD", 00:09:50.257 "aliases": [ 00:09:50.257 "11bc362b-5dd2-4218-9b03-d986a6c5aef4" 00:09:50.257 ], 00:09:50.257 "product_name": "Malloc disk", 00:09:50.257 "block_size": 512, 00:09:50.257 "num_blocks": 262144, 00:09:50.257 "uuid": "11bc362b-5dd2-4218-9b03-d986a6c5aef4", 00:09:50.257 "assigned_rate_limits": { 00:09:50.257 "rw_ios_per_sec": 0, 00:09:50.257 "rw_mbytes_per_sec": 0, 00:09:50.257 "r_mbytes_per_sec": 0, 00:09:50.257 "w_mbytes_per_sec": 0 00:09:50.257 }, 00:09:50.257 "claimed": false, 00:09:50.257 "zoned": false, 00:09:50.257 "supported_io_types": { 00:09:50.257 "read": true, 00:09:50.257 "write": true, 00:09:50.257 "unmap": true, 00:09:50.257 "flush": true, 00:09:50.257 "reset": true, 00:09:50.257 "nvme_admin": false, 00:09:50.257 
"nvme_io": false, 00:09:50.257 "nvme_io_md": false, 00:09:50.257 "write_zeroes": true, 00:09:50.257 "zcopy": true, 00:09:50.257 "get_zone_info": false, 00:09:50.257 "zone_management": false, 00:09:50.257 "zone_append": false, 00:09:50.257 "compare": false, 00:09:50.257 "compare_and_write": false, 00:09:50.257 "abort": true, 00:09:50.257 "seek_hole": false, 00:09:50.257 "seek_data": false, 00:09:50.257 "copy": true, 00:09:50.257 "nvme_iov_md": false 00:09:50.257 }, 00:09:50.257 "memory_domains": [ 00:09:50.257 { 00:09:50.257 "dma_device_id": "system", 00:09:50.257 "dma_device_type": 1 00:09:50.257 }, 00:09:50.257 { 00:09:50.257 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:50.257 "dma_device_type": 2 00:09:50.257 } 00:09:50.257 ], 00:09:50.257 "driver_specific": {} 00:09:50.257 } 00:09:50.257 ] 00:09:50.257 17:22:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:50.257 17:22:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@905 -- # return 0 00:09:50.257 17:22:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # sleep 2 00:09:50.257 17:22:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:09:50.257 Running I/O for 5 seconds... 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@551 -- # qd_sampling_function_test Malloc_QD 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local bdev_name=Malloc_QD 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local sampling_period=10 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@521 -- # local iostats 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@523 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # iostats='{ 00:09:52.167 "tick_rate": 2600000000, 00:09:52.167 "ticks": 12627822620913531, 00:09:52.167 "bdevs": [ 00:09:52.167 { 00:09:52.167 "name": "Malloc_QD", 00:09:52.167 "bytes_read": 1088467456, 00:09:52.167 "num_read_ops": 265732, 00:09:52.167 "bytes_written": 0, 00:09:52.167 "num_write_ops": 0, 00:09:52.167 "bytes_unmapped": 0, 00:09:52.167 "num_unmap_ops": 0, 00:09:52.167 "bytes_copied": 0, 00:09:52.167 "num_copy_ops": 0, 00:09:52.167 "read_latency_ticks": 2551753843508, 00:09:52.167 "max_read_latency_ticks": 11607150, 00:09:52.167 "min_read_latency_ticks": 298006, 00:09:52.167 "write_latency_ticks": 0, 00:09:52.167 "max_write_latency_ticks": 0, 00:09:52.167 "min_write_latency_ticks": 0, 00:09:52.167 "unmap_latency_ticks": 0, 00:09:52.167 "max_unmap_latency_ticks": 0, 00:09:52.167 
"min_unmap_latency_ticks": 0, 00:09:52.167 "copy_latency_ticks": 0, 00:09:52.167 "max_copy_latency_ticks": 0, 00:09:52.167 "min_copy_latency_ticks": 0, 00:09:52.167 "io_error": {}, 00:09:52.167 "queue_depth_polling_period": 10, 00:09:52.167 "queue_depth": 512, 00:09:52.167 "io_time": 40, 00:09:52.167 "weighted_io_time": 25600 00:09:52.167 } 00:09:52.167 ] 00:09:52.167 }' 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # qd_sampling_period=10 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 == null ']' 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 -ne 10 ']' 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:52.167 00:09:52.167 Latency(us) 00:09:52.167 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:52.167 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:09:52.167 Malloc_QD : 2.00 73465.59 286.97 0.00 0.00 3476.86 1039.75 3755.72 00:09:52.167 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:09:52.167 Malloc_QD : 2.00 64950.30 253.71 0.00 0.00 3932.43 1001.94 4486.70 00:09:52.167 =================================================================================================================== 00:09:52.167 Total : 138415.88 540.69 0.00 0.00 3690.73 1001.94 4486.70 00:09:52.167 0 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # killprocess 2735325 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@948 -- # '[' -z 2735325 ']' 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@952 -- # kill -0 2735325 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # uname 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:52.167 17:22:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2735325 00:09:52.428 17:22:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:52.428 17:22:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:52.428 17:22:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2735325' 00:09:52.428 killing process with pid 2735325 00:09:52.428 17:22:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@967 -- # kill 2735325 00:09:52.428 Received shutdown signal, test time was about 2.078104 seconds 00:09:52.428 00:09:52.428 Latency(us) 00:09:52.428 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:52.428 =================================================================================================================== 00:09:52.428 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:09:52.428 17:22:03 
blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@972 -- # wait 2735325 00:09:52.428 17:22:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@555 -- # trap - SIGINT SIGTERM EXIT 00:09:52.428 00:09:52.428 real 0m3.679s 00:09:52.428 user 0m7.469s 00:09:52.428 sys 0m0.436s 00:09:52.428 17:22:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:52.428 17:22:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:52.428 ************************************ 00:09:52.428 END TEST bdev_qd_sampling 00:09:52.428 ************************************ 00:09:52.428 17:22:03 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:52.428 17:22:03 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_error error_test_suite '' 00:09:52.428 17:22:03 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:52.428 17:22:03 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:52.428 17:22:03 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:52.689 ************************************ 00:09:52.689 START TEST bdev_error 00:09:52.689 ************************************ 00:09:52.689 17:22:03 blockdev_general.bdev_error -- common/autotest_common.sh@1123 -- # error_test_suite '' 00:09:52.689 17:22:03 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_1=Dev_1 00:09:52.689 17:22:03 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # DEV_2=Dev_2 00:09:52.689 17:22:03 blockdev_general.bdev_error -- bdev/blockdev.sh@468 -- # ERR_DEV=EE_Dev_1 00:09:52.689 17:22:03 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # ERR_PID=2735990 00:09:52.689 17:22:03 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # echo 'Process error testing pid: 2735990' 00:09:52.689 Process error testing pid: 2735990 00:09:52.689 17:22:03 blockdev_general.bdev_error -- bdev/blockdev.sh@474 -- # waitforlisten 2735990 00:09:52.689 17:22:03 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:09:52.689 17:22:03 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 2735990 ']' 00:09:52.689 17:22:03 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:52.689 17:22:03 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:52.689 17:22:03 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:52.689 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:52.689 17:22:03 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:52.689 17:22:03 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:52.689 [2024-07-15 17:22:03.796071] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
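The bdev_qd_sampling run that finishes here drives an already-running bdevperf instance purely over RPC. A minimal sketch of that sequence, reconstructed from the rpc_cmd calls visible in the trace; it assumes, as elsewhere in this log, that rpc_cmd wraps scripts/rpc.py against /var/tmp/spdk.sock and that bdevperf was started beforehand with -z so it waits for RPC:

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  RPC="$SPDK/scripts/rpc.py"                        # default socket: /var/tmp/spdk.sock

  $RPC bdev_malloc_create -b Malloc_QD 128 512      # same arguments as in the trace
  $RPC bdev_set_qd_sampling_period Malloc_QD 10     # enable QD sampling, period value 10
  "$SPDK"/examples/bdev/bdevperf/bdevperf.py perform_tests &   # kick off the randread job
  sleep 2                                           # let the sampler run while I/O is active

  # The test only checks that the period reads back as the value that was set.
  period=$($RPC bdev_get_iostat -b Malloc_QD | jq -r '.bdevs[0].queue_depth_polling_period')
  [ "$period" -ne 10 ] && echo "qd sampling period mismatch" && exit 1

  $RPC bdev_malloc_delete Malloc_QD                 # teardown before killing bdevperf
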
00:09:52.689 [2024-07-15 17:22:03.796130] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2735990 ] 00:09:52.689 [2024-07-15 17:22:03.877080] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:52.689 [2024-07-15 17:22:03.977660] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:53.629 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:53.629 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:09:53.629 17:22:04 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:09:53.629 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:53.629 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:53.629 Dev_1 00:09:53.629 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:53.629 17:22:04 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # waitforbdev Dev_1 00:09:53.629 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:09:53.629 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:53.629 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:09:53.629 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:53.629 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:53.629 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:53.629 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:53.629 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:53.629 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:53.629 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:09:53.629 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:53.629 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:53.629 [ 00:09:53.629 { 00:09:53.629 "name": "Dev_1", 00:09:53.629 "aliases": [ 00:09:53.629 "e08c62c4-3e99-4417-8dc9-0134d79a4df1" 00:09:53.629 ], 00:09:53.629 "product_name": "Malloc disk", 00:09:53.629 "block_size": 512, 00:09:53.629 "num_blocks": 262144, 00:09:53.629 "uuid": "e08c62c4-3e99-4417-8dc9-0134d79a4df1", 00:09:53.629 "assigned_rate_limits": { 00:09:53.629 "rw_ios_per_sec": 0, 00:09:53.629 "rw_mbytes_per_sec": 0, 00:09:53.629 "r_mbytes_per_sec": 0, 00:09:53.629 "w_mbytes_per_sec": 0 00:09:53.629 }, 00:09:53.629 "claimed": false, 00:09:53.629 "zoned": false, 00:09:53.629 "supported_io_types": { 00:09:53.629 "read": true, 00:09:53.629 "write": true, 00:09:53.629 "unmap": true, 00:09:53.629 "flush": true, 00:09:53.629 "reset": true, 00:09:53.629 "nvme_admin": false, 00:09:53.629 "nvme_io": false, 00:09:53.629 "nvme_io_md": false, 00:09:53.629 "write_zeroes": true, 00:09:53.629 "zcopy": true, 00:09:53.629 "get_zone_info": false, 00:09:53.630 "zone_management": false, 00:09:53.630 "zone_append": false, 00:09:53.630 
"compare": false, 00:09:53.630 "compare_and_write": false, 00:09:53.630 "abort": true, 00:09:53.630 "seek_hole": false, 00:09:53.630 "seek_data": false, 00:09:53.630 "copy": true, 00:09:53.630 "nvme_iov_md": false 00:09:53.630 }, 00:09:53.630 "memory_domains": [ 00:09:53.630 { 00:09:53.630 "dma_device_id": "system", 00:09:53.630 "dma_device_type": 1 00:09:53.630 }, 00:09:53.630 { 00:09:53.630 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:53.630 "dma_device_type": 2 00:09:53.630 } 00:09:53.630 ], 00:09:53.630 "driver_specific": {} 00:09:53.630 } 00:09:53.630 ] 00:09:53.630 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:53.630 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:09:53.630 17:22:04 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_error_create Dev_1 00:09:53.630 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:53.630 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:53.630 true 00:09:53.630 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:53.630 17:22:04 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:09:53.630 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:53.630 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:53.630 Dev_2 00:09:53.630 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:53.630 17:22:04 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # waitforbdev Dev_2 00:09:53.630 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:09:53.630 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:53.630 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:09:53.630 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:53.630 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:53.630 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:53.630 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:53.630 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:53.630 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:53.630 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:09:53.630 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:53.630 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:53.630 [ 00:09:53.630 { 00:09:53.630 "name": "Dev_2", 00:09:53.630 "aliases": [ 00:09:53.630 "8b7b7c2c-d4a2-467a-a960-e0a921cae2b2" 00:09:53.630 ], 00:09:53.630 "product_name": "Malloc disk", 00:09:53.630 "block_size": 512, 00:09:53.630 "num_blocks": 262144, 00:09:53.630 "uuid": "8b7b7c2c-d4a2-467a-a960-e0a921cae2b2", 00:09:53.630 "assigned_rate_limits": { 00:09:53.630 "rw_ios_per_sec": 0, 00:09:53.630 "rw_mbytes_per_sec": 0, 00:09:53.630 "r_mbytes_per_sec": 0, 00:09:53.630 "w_mbytes_per_sec": 0 00:09:53.630 }, 00:09:53.630 "claimed": false, 
00:09:53.630 "zoned": false, 00:09:53.630 "supported_io_types": { 00:09:53.630 "read": true, 00:09:53.630 "write": true, 00:09:53.630 "unmap": true, 00:09:53.630 "flush": true, 00:09:53.630 "reset": true, 00:09:53.630 "nvme_admin": false, 00:09:53.630 "nvme_io": false, 00:09:53.630 "nvme_io_md": false, 00:09:53.630 "write_zeroes": true, 00:09:53.630 "zcopy": true, 00:09:53.630 "get_zone_info": false, 00:09:53.630 "zone_management": false, 00:09:53.630 "zone_append": false, 00:09:53.630 "compare": false, 00:09:53.630 "compare_and_write": false, 00:09:53.630 "abort": true, 00:09:53.630 "seek_hole": false, 00:09:53.630 "seek_data": false, 00:09:53.630 "copy": true, 00:09:53.630 "nvme_iov_md": false 00:09:53.630 }, 00:09:53.630 "memory_domains": [ 00:09:53.630 { 00:09:53.630 "dma_device_id": "system", 00:09:53.630 "dma_device_type": 1 00:09:53.630 }, 00:09:53.630 { 00:09:53.630 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:53.630 "dma_device_type": 2 00:09:53.630 } 00:09:53.630 ], 00:09:53.630 "driver_specific": {} 00:09:53.630 } 00:09:53.630 ] 00:09:53.630 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:53.630 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:09:53.630 17:22:04 blockdev_general.bdev_error -- bdev/blockdev.sh@481 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:09:53.630 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:53.630 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:53.630 17:22:04 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:53.630 17:22:04 blockdev_general.bdev_error -- bdev/blockdev.sh@484 -- # sleep 1 00:09:53.630 17:22:04 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:09:53.630 Running I/O for 5 seconds... 00:09:54.572 17:22:05 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # kill -0 2735990 00:09:54.572 17:22:05 blockdev_general.bdev_error -- bdev/blockdev.sh@488 -- # echo 'Process is existed as continue on error is set. Pid: 2735990' 00:09:54.572 Process is existed as continue on error is set. 
Pid: 2735990 00:09:54.572 17:22:05 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:09:54.572 17:22:05 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:54.572 17:22:05 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:54.572 17:22:05 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:54.572 17:22:05 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # rpc_cmd bdev_malloc_delete Dev_1 00:09:54.572 17:22:05 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:54.572 17:22:05 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:54.572 17:22:05 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:54.572 17:22:05 blockdev_general.bdev_error -- bdev/blockdev.sh@497 -- # sleep 5 00:09:54.832 Timeout while waiting for response: 00:09:54.832 00:09:54.832 00:09:59.032 00:09:59.032 Latency(us) 00:09:59.032 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:59.032 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:09:59.032 EE_Dev_1 : 0.91 35080.62 137.03 5.51 0.00 452.41 151.24 734.13 00:09:59.032 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:09:59.032 Dev_2 : 5.00 75464.71 294.78 0.00 0.00 208.38 70.89 18854.20 00:09:59.032 =================================================================================================================== 00:09:59.032 Total : 110545.33 431.82 5.51 0.00 227.36 70.89 18854.20 00:09:59.603 17:22:10 blockdev_general.bdev_error -- bdev/blockdev.sh@499 -- # killprocess 2735990 00:09:59.603 17:22:10 blockdev_general.bdev_error -- common/autotest_common.sh@948 -- # '[' -z 2735990 ']' 00:09:59.603 17:22:10 blockdev_general.bdev_error -- common/autotest_common.sh@952 -- # kill -0 2735990 00:09:59.603 17:22:10 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # uname 00:09:59.603 17:22:10 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:59.603 17:22:10 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2735990 00:09:59.603 17:22:10 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:09:59.603 17:22:10 blockdev_general.bdev_error -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:09:59.603 17:22:10 blockdev_general.bdev_error -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2735990' 00:09:59.603 killing process with pid 2735990 00:09:59.603 17:22:10 blockdev_general.bdev_error -- common/autotest_common.sh@967 -- # kill 2735990 00:09:59.603 Received shutdown signal, test time was about 5.000000 seconds 00:09:59.604 00:09:59.604 Latency(us) 00:09:59.604 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:59.604 =================================================================================================================== 00:09:59.604 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:09:59.604 17:22:10 blockdev_general.bdev_error -- common/autotest_common.sh@972 -- # wait 2735990 00:09:59.865 17:22:11 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # ERR_PID=2737179 00:09:59.865 17:22:11 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # echo 'Process error testing pid: 2737179' 00:09:59.865 Process error testing pid: 2737179 00:09:59.865 17:22:11 
blockdev_general.bdev_error -- bdev/blockdev.sh@505 -- # waitforlisten 2737179 00:09:59.865 17:22:11 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:09:59.865 17:22:11 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 2737179 ']' 00:09:59.865 17:22:11 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:59.865 17:22:11 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:59.865 17:22:11 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:59.865 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:59.865 17:22:11 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:59.865 17:22:11 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:59.865 [2024-07-15 17:22:11.145937] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:09:59.865 [2024-07-15 17:22:11.146005] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2737179 ] 00:10:00.126 [2024-07-15 17:22:11.227719] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:00.126 [2024-07-15 17:22:11.327992] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:01.081 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:01.081 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:10:01.081 17:22:12 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:10:01.081 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:01.081 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:01.081 Dev_1 00:10:01.081 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:01.081 17:22:12 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # waitforbdev Dev_1 00:10:01.081 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:10:01.081 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:01.081 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:01.081 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:01.081 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:01.081 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:01.081 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:01.081 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:01.081 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:01.081 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:10:01.081 17:22:12 
blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:01.081 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:01.081 [ 00:10:01.081 { 00:10:01.081 "name": "Dev_1", 00:10:01.081 "aliases": [ 00:10:01.081 "2de908c2-0ba8-4f1b-88de-cfd21442224f" 00:10:01.081 ], 00:10:01.081 "product_name": "Malloc disk", 00:10:01.081 "block_size": 512, 00:10:01.081 "num_blocks": 262144, 00:10:01.081 "uuid": "2de908c2-0ba8-4f1b-88de-cfd21442224f", 00:10:01.081 "assigned_rate_limits": { 00:10:01.081 "rw_ios_per_sec": 0, 00:10:01.081 "rw_mbytes_per_sec": 0, 00:10:01.081 "r_mbytes_per_sec": 0, 00:10:01.081 "w_mbytes_per_sec": 0 00:10:01.081 }, 00:10:01.081 "claimed": false, 00:10:01.081 "zoned": false, 00:10:01.081 "supported_io_types": { 00:10:01.081 "read": true, 00:10:01.081 "write": true, 00:10:01.081 "unmap": true, 00:10:01.081 "flush": true, 00:10:01.081 "reset": true, 00:10:01.081 "nvme_admin": false, 00:10:01.081 "nvme_io": false, 00:10:01.081 "nvme_io_md": false, 00:10:01.081 "write_zeroes": true, 00:10:01.081 "zcopy": true, 00:10:01.081 "get_zone_info": false, 00:10:01.081 "zone_management": false, 00:10:01.081 "zone_append": false, 00:10:01.081 "compare": false, 00:10:01.081 "compare_and_write": false, 00:10:01.081 "abort": true, 00:10:01.081 "seek_hole": false, 00:10:01.081 "seek_data": false, 00:10:01.081 "copy": true, 00:10:01.081 "nvme_iov_md": false 00:10:01.081 }, 00:10:01.081 "memory_domains": [ 00:10:01.081 { 00:10:01.081 "dma_device_id": "system", 00:10:01.081 "dma_device_type": 1 00:10:01.081 }, 00:10:01.081 { 00:10:01.081 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:01.081 "dma_device_type": 2 00:10:01.081 } 00:10:01.081 ], 00:10:01.081 "driver_specific": {} 00:10:01.081 } 00:10:01.081 ] 00:10:01.081 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:01.081 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:01.081 17:22:12 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_error_create Dev_1 00:10:01.081 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:01.081 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:01.081 true 00:10:01.081 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:01.081 17:22:12 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:10:01.081 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:01.081 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:01.081 Dev_2 00:10:01.081 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:01.082 17:22:12 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # waitforbdev Dev_2 00:10:01.082 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:10:01.082 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:01.082 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:01.082 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:01.082 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:01.082 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@902 
-- # rpc_cmd bdev_wait_for_examine 00:10:01.082 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:01.082 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:01.082 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:01.082 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:10:01.082 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:01.082 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:01.082 [ 00:10:01.082 { 00:10:01.082 "name": "Dev_2", 00:10:01.082 "aliases": [ 00:10:01.082 "39cab6bd-f2f6-4759-a430-ebd7838d5c2c" 00:10:01.082 ], 00:10:01.082 "product_name": "Malloc disk", 00:10:01.082 "block_size": 512, 00:10:01.082 "num_blocks": 262144, 00:10:01.082 "uuid": "39cab6bd-f2f6-4759-a430-ebd7838d5c2c", 00:10:01.082 "assigned_rate_limits": { 00:10:01.082 "rw_ios_per_sec": 0, 00:10:01.082 "rw_mbytes_per_sec": 0, 00:10:01.082 "r_mbytes_per_sec": 0, 00:10:01.082 "w_mbytes_per_sec": 0 00:10:01.082 }, 00:10:01.082 "claimed": false, 00:10:01.082 "zoned": false, 00:10:01.082 "supported_io_types": { 00:10:01.082 "read": true, 00:10:01.082 "write": true, 00:10:01.082 "unmap": true, 00:10:01.082 "flush": true, 00:10:01.082 "reset": true, 00:10:01.082 "nvme_admin": false, 00:10:01.082 "nvme_io": false, 00:10:01.082 "nvme_io_md": false, 00:10:01.082 "write_zeroes": true, 00:10:01.082 "zcopy": true, 00:10:01.082 "get_zone_info": false, 00:10:01.082 "zone_management": false, 00:10:01.082 "zone_append": false, 00:10:01.082 "compare": false, 00:10:01.082 "compare_and_write": false, 00:10:01.082 "abort": true, 00:10:01.082 "seek_hole": false, 00:10:01.082 "seek_data": false, 00:10:01.082 "copy": true, 00:10:01.082 "nvme_iov_md": false 00:10:01.082 }, 00:10:01.082 "memory_domains": [ 00:10:01.082 { 00:10:01.082 "dma_device_id": "system", 00:10:01.082 "dma_device_type": 1 00:10:01.082 }, 00:10:01.082 { 00:10:01.082 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:01.082 "dma_device_type": 2 00:10:01.082 } 00:10:01.082 ], 00:10:01.082 "driver_specific": {} 00:10:01.082 } 00:10:01.082 ] 00:10:01.082 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:01.082 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:01.082 17:22:12 blockdev_general.bdev_error -- bdev/blockdev.sh@512 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:10:01.082 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:01.082 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:01.082 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:01.082 17:22:12 blockdev_general.bdev_error -- bdev/blockdev.sh@515 -- # NOT wait 2737179 00:10:01.082 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@648 -- # local es=0 00:10:01.082 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # valid_exec_arg wait 2737179 00:10:01.082 17:22:12 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:10:01.082 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@636 -- # local arg=wait 00:10:01.082 17:22:12 blockdev_general.bdev_error -- 
common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:01.082 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # type -t wait 00:10:01.082 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:01.082 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # wait 2737179 00:10:01.082 Running I/O for 5 seconds... 00:10:01.082 task offset: 117216 on job bdev=EE_Dev_1 fails 00:10:01.082 00:10:01.082 Latency(us) 00:10:01.082 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:01.082 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:01.082 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:10:01.082 EE_Dev_1 : 0.00 27027.03 105.57 6142.51 0.00 400.95 156.75 718.38 00:10:01.082 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:01.082 Dev_2 : 0.00 17003.19 66.42 0.00 0.00 691.15 153.60 1272.91 00:10:01.082 =================================================================================================================== 00:10:01.082 Total : 44030.22 171.99 6142.51 0.00 558.35 153.60 1272.91 00:10:01.082 [2024-07-15 17:22:12.287152] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:01.082 request: 00:10:01.082 { 00:10:01.082 "method": "perform_tests", 00:10:01.082 "req_id": 1 00:10:01.082 } 00:10:01.082 Got JSON-RPC error response 00:10:01.082 response: 00:10:01.082 { 00:10:01.082 "code": -32603, 00:10:01.082 "message": "bdevperf failed with error Operation not permitted" 00:10:01.082 } 00:10:01.400 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # es=255 00:10:01.400 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:01.400 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # es=127 00:10:01.400 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # case "$es" in 00:10:01.400 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@668 -- # es=1 00:10:01.400 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:01.400 00:10:01.400 real 0m8.758s 00:10:01.400 user 0m9.114s 00:10:01.400 sys 0m0.754s 00:10:01.400 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:01.400 17:22:12 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:01.400 ************************************ 00:10:01.400 END TEST bdev_error 00:10:01.400 ************************************ 00:10:01.400 17:22:12 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:01.400 17:22:12 blockdev_general -- bdev/blockdev.sh@791 -- # run_test bdev_stat stat_test_suite '' 00:10:01.400 17:22:12 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:01.400 17:22:12 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:01.400 17:22:12 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:01.400 ************************************ 00:10:01.400 START TEST bdev_stat 00:10:01.400 ************************************ 00:10:01.400 17:22:12 blockdev_general.bdev_stat -- common/autotest_common.sh@1123 -- # stat_test_suite '' 00:10:01.400 17:22:12 blockdev_general.bdev_stat -- bdev/blockdev.sh@592 -- # STAT_DEV=Malloc_STAT 00:10:01.400 17:22:12 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # STAT_PID=2737409 00:10:01.400 
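Both bdev_error runs above (pids 2735990 and 2737179) follow the same shape: start bdevperf idle with -z, stack an error-injection bdev on a malloc bdev, arm it to fail the next five I/Os, then start the job over RPC. A condensed sketch using only commands that appear in the trace; the first run passes -f '' and the script then reports 'continue on error is set', while the second run omits it and ends with the -32603 'Operation not permitted' JSON-RPC error shown above:

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  RPC="$SPDK/scripts/rpc.py"

  # First run: -f '' present; the second run drops -f and aborts on the injected failures.
  "$SPDK"/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' &

  $RPC bdev_malloc_create -b Dev_1 128 512
  $RPC bdev_error_create Dev_1                    # error bdev shows up as EE_Dev_1 in this trace
  $RPC bdev_malloc_create -b Dev_2 128 512
  $RPC bdev_error_inject_error EE_Dev_1 all failure -n 5   # fail the next 5 I/Os of any type

  "$SPDK"/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests   # start the 5 s randread job

  $RPC bdev_error_delete EE_Dev_1                 # first run: stop injecting mid-test ...
  $RPC bdev_malloc_delete Dev_1                   # ... and delete the base bdev under load
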
17:22:12 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # echo 'Process Bdev IO statistics testing pid: 2737409' 00:10:01.400 Process Bdev IO statistics testing pid: 2737409 00:10:01.400 17:22:12 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:10:01.400 17:22:12 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:10:01.400 17:22:12 blockdev_general.bdev_stat -- bdev/blockdev.sh@599 -- # waitforlisten 2737409 00:10:01.400 17:22:12 blockdev_general.bdev_stat -- common/autotest_common.sh@829 -- # '[' -z 2737409 ']' 00:10:01.400 17:22:12 blockdev_general.bdev_stat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:01.400 17:22:12 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:01.400 17:22:12 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:01.400 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:01.400 17:22:12 blockdev_general.bdev_stat -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:01.400 17:22:12 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:01.400 [2024-07-15 17:22:12.624948] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:10:01.400 [2024-07-15 17:22:12.625006] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2737409 ] 00:10:01.661 [2024-07-15 17:22:12.716548] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:01.661 [2024-07-15 17:22:12.810766] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:01.661 [2024-07-15 17:22:12.810834] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:02.232 17:22:13 blockdev_general.bdev_stat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:02.232 17:22:13 blockdev_general.bdev_stat -- common/autotest_common.sh@862 -- # return 0 00:10:02.232 17:22:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:10:02.232 17:22:13 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:02.232 17:22:13 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:02.492 Malloc_STAT 00:10:02.492 17:22:13 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:02.492 17:22:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@602 -- # waitforbdev Malloc_STAT 00:10:02.492 17:22:13 blockdev_general.bdev_stat -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_STAT 00:10:02.492 17:22:13 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:02.492 17:22:13 blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local i 00:10:02.492 17:22:13 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:02.492 17:22:13 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:02.492 17:22:13 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 
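The bdev_stat test starting here runs bdevperf on two reactors (-m 0x3, matching the 'Total cores available: 2' line) so that per-channel statistics can be compared against the bdev totals. A hedged sketch of the setup, copying the bdevperf options and the readiness check (bdev_wait_for_examine plus bdev_get_bdevs with the 2000 timeout) straight from the trace:

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  RPC="$SPDK/scripts/rpc.py"

  # Options copied from the trace: 256 deep, 4096-byte random reads for 10 s on cores 0-1.
  "$SPDK"/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' &

  $RPC bdev_malloc_create -b Malloc_STAT 128 512

  # waitforbdev pattern used for every bdev in this log.
  $RPC bdev_wait_for_examine
  $RPC bdev_get_bdevs -b Malloc_STAT -t 2000
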
00:10:02.492 17:22:13 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:02.492 17:22:13 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:02.492 17:22:13 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:02.492 17:22:13 blockdev_general.bdev_stat -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:10:02.492 17:22:13 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:02.492 17:22:13 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:02.492 [ 00:10:02.492 { 00:10:02.492 "name": "Malloc_STAT", 00:10:02.492 "aliases": [ 00:10:02.492 "68111de6-286b-4acd-90c2-9933e13609e5" 00:10:02.492 ], 00:10:02.492 "product_name": "Malloc disk", 00:10:02.492 "block_size": 512, 00:10:02.492 "num_blocks": 262144, 00:10:02.492 "uuid": "68111de6-286b-4acd-90c2-9933e13609e5", 00:10:02.492 "assigned_rate_limits": { 00:10:02.492 "rw_ios_per_sec": 0, 00:10:02.492 "rw_mbytes_per_sec": 0, 00:10:02.492 "r_mbytes_per_sec": 0, 00:10:02.492 "w_mbytes_per_sec": 0 00:10:02.492 }, 00:10:02.492 "claimed": false, 00:10:02.492 "zoned": false, 00:10:02.492 "supported_io_types": { 00:10:02.492 "read": true, 00:10:02.492 "write": true, 00:10:02.492 "unmap": true, 00:10:02.492 "flush": true, 00:10:02.492 "reset": true, 00:10:02.492 "nvme_admin": false, 00:10:02.492 "nvme_io": false, 00:10:02.492 "nvme_io_md": false, 00:10:02.492 "write_zeroes": true, 00:10:02.492 "zcopy": true, 00:10:02.492 "get_zone_info": false, 00:10:02.492 "zone_management": false, 00:10:02.492 "zone_append": false, 00:10:02.492 "compare": false, 00:10:02.492 "compare_and_write": false, 00:10:02.492 "abort": true, 00:10:02.492 "seek_hole": false, 00:10:02.492 "seek_data": false, 00:10:02.492 "copy": true, 00:10:02.492 "nvme_iov_md": false 00:10:02.492 }, 00:10:02.492 "memory_domains": [ 00:10:02.492 { 00:10:02.492 "dma_device_id": "system", 00:10:02.492 "dma_device_type": 1 00:10:02.492 }, 00:10:02.492 { 00:10:02.492 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:02.492 "dma_device_type": 2 00:10:02.492 } 00:10:02.492 ], 00:10:02.492 "driver_specific": {} 00:10:02.492 } 00:10:02.492 ] 00:10:02.492 17:22:13 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:02.492 17:22:13 blockdev_general.bdev_stat -- common/autotest_common.sh@905 -- # return 0 00:10:02.492 17:22:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # sleep 2 00:10:02.492 17:22:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:02.492 Running I/O for 10 seconds... 
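The stat_function_test that follows compares three bdev_get_iostat snapshots: a bdev total, the per-channel counters (-c), and a later bdev total; the summed per-channel read count has to land between the two totals (276480 between 266756 and 292868 in this run). A sketch of that check with the jq paths from the trace; the rpc.py path is the one used throughout this log:

  RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py

  io_count1=$($RPC bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')

  per_ch=$($RPC bdev_get_iostat -b Malloc_STAT -c)               # per-channel counters
  ch1=$(echo "$per_ch" | jq -r '.channels[0].num_read_ops')
  ch2=$(echo "$per_ch" | jq -r '.channels[1].num_read_ops')
  all=$((ch1 + ch2))

  io_count2=$($RPC bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')

  # Mirrors the '[' ... -lt ... ']' / '[' ... -gt ... ']' checks in the trace below.
  [ "$all" -lt "$io_count1" ] && echo "per-channel sum below first total" && exit 1
  [ "$all" -gt "$io_count2" ] && echo "per-channel sum above second total" && exit 1

  $RPC bdev_malloc_delete Malloc_STAT
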
00:10:04.404 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@606 -- # stat_function_test Malloc_STAT 00:10:04.404 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local bdev_name=Malloc_STAT 00:10:04.404 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local iostats 00:10:04.404 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count1 00:10:04.404 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local io_count2 00:10:04.404 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local iostats_per_channel 00:10:04.404 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel1 00:10:04.404 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel2 00:10:04.404 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@566 -- # local io_count_per_channel_all=0 00:10:04.404 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:04.404 17:22:15 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:04.404 17:22:15 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:04.404 17:22:15 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:04.404 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # iostats='{ 00:10:04.404 "tick_rate": 2600000000, 00:10:04.404 "ticks": 12627854387286263, 00:10:04.404 "bdevs": [ 00:10:04.404 { 00:10:04.404 "name": "Malloc_STAT", 00:10:04.404 "bytes_read": 1092661760, 00:10:04.404 "num_read_ops": 266756, 00:10:04.404 "bytes_written": 0, 00:10:04.404 "num_write_ops": 0, 00:10:04.404 "bytes_unmapped": 0, 00:10:04.404 "num_unmap_ops": 0, 00:10:04.404 "bytes_copied": 0, 00:10:04.404 "num_copy_ops": 0, 00:10:04.404 "read_latency_ticks": 2564109276250, 00:10:04.404 "max_read_latency_ticks": 13009220, 00:10:04.404 "min_read_latency_ticks": 264902, 00:10:04.404 "write_latency_ticks": 0, 00:10:04.404 "max_write_latency_ticks": 0, 00:10:04.404 "min_write_latency_ticks": 0, 00:10:04.404 "unmap_latency_ticks": 0, 00:10:04.404 "max_unmap_latency_ticks": 0, 00:10:04.404 "min_unmap_latency_ticks": 0, 00:10:04.404 "copy_latency_ticks": 0, 00:10:04.404 "max_copy_latency_ticks": 0, 00:10:04.404 "min_copy_latency_ticks": 0, 00:10:04.404 "io_error": {} 00:10:04.404 } 00:10:04.404 ] 00:10:04.404 }' 00:10:04.404 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # jq -r '.bdevs[0].num_read_ops' 00:10:04.404 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # io_count1=266756 00:10:04.405 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:10:04.405 17:22:15 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:04.405 17:22:15 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:04.405 17:22:15 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:04.405 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # iostats_per_channel='{ 00:10:04.405 "tick_rate": 2600000000, 00:10:04.405 "ticks": 12627854569112957, 00:10:04.405 "name": "Malloc_STAT", 00:10:04.405 "channels": [ 00:10:04.405 { 00:10:04.405 "thread_id": 2, 00:10:04.405 "bytes_read": 600834048, 00:10:04.405 "num_read_ops": 146688, 00:10:04.405 "bytes_written": 0, 00:10:04.405 "num_write_ops": 0, 00:10:04.405 "bytes_unmapped": 0, 00:10:04.405 "num_unmap_ops": 
0, 00:10:04.405 "bytes_copied": 0, 00:10:04.405 "num_copy_ops": 0, 00:10:04.405 "read_latency_ticks": 1328361139336, 00:10:04.405 "max_read_latency_ticks": 9645696, 00:10:04.405 "min_read_latency_ticks": 6995418, 00:10:04.405 "write_latency_ticks": 0, 00:10:04.405 "max_write_latency_ticks": 0, 00:10:04.405 "min_write_latency_ticks": 0, 00:10:04.405 "unmap_latency_ticks": 0, 00:10:04.405 "max_unmap_latency_ticks": 0, 00:10:04.405 "min_unmap_latency_ticks": 0, 00:10:04.405 "copy_latency_ticks": 0, 00:10:04.405 "max_copy_latency_ticks": 0, 00:10:04.405 "min_copy_latency_ticks": 0 00:10:04.405 }, 00:10:04.405 { 00:10:04.405 "thread_id": 3, 00:10:04.405 "bytes_read": 531628032, 00:10:04.405 "num_read_ops": 129792, 00:10:04.405 "bytes_written": 0, 00:10:04.405 "num_write_ops": 0, 00:10:04.405 "bytes_unmapped": 0, 00:10:04.405 "num_unmap_ops": 0, 00:10:04.405 "bytes_copied": 0, 00:10:04.405 "num_copy_ops": 0, 00:10:04.405 "read_latency_ticks": 1329295279616, 00:10:04.405 "max_read_latency_ticks": 13009220, 00:10:04.405 "min_read_latency_ticks": 7720446, 00:10:04.405 "write_latency_ticks": 0, 00:10:04.405 "max_write_latency_ticks": 0, 00:10:04.405 "min_write_latency_ticks": 0, 00:10:04.405 "unmap_latency_ticks": 0, 00:10:04.405 "max_unmap_latency_ticks": 0, 00:10:04.405 "min_unmap_latency_ticks": 0, 00:10:04.405 "copy_latency_ticks": 0, 00:10:04.405 "max_copy_latency_ticks": 0, 00:10:04.405 "min_copy_latency_ticks": 0 00:10:04.405 } 00:10:04.405 ] 00:10:04.405 }' 00:10:04.405 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # jq -r '.channels[0].num_read_ops' 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # io_count_per_channel1=146688 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel_all=146688 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # jq -r '.channels[1].num_read_ops' 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel2=129792 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@575 -- # io_count_per_channel_all=276480 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # iostats='{ 00:10:04.667 "tick_rate": 2600000000, 00:10:04.667 "ticks": 12627854878150033, 00:10:04.667 "bdevs": [ 00:10:04.667 { 00:10:04.667 "name": "Malloc_STAT", 00:10:04.667 "bytes_read": 1199616512, 00:10:04.667 "num_read_ops": 292868, 00:10:04.667 "bytes_written": 0, 00:10:04.667 "num_write_ops": 0, 00:10:04.667 "bytes_unmapped": 0, 00:10:04.667 "num_unmap_ops": 0, 00:10:04.667 "bytes_copied": 0, 00:10:04.667 "num_copy_ops": 0, 00:10:04.667 "read_latency_ticks": 2815387853176, 00:10:04.667 "max_read_latency_ticks": 13009220, 00:10:04.667 "min_read_latency_ticks": 264902, 00:10:04.667 "write_latency_ticks": 0, 00:10:04.667 "max_write_latency_ticks": 0, 00:10:04.667 "min_write_latency_ticks": 0, 00:10:04.667 "unmap_latency_ticks": 0, 00:10:04.667 "max_unmap_latency_ticks": 0, 00:10:04.667 "min_unmap_latency_ticks": 0, 00:10:04.667 "copy_latency_ticks": 0, 00:10:04.667 "max_copy_latency_ticks": 0, 00:10:04.667 
"min_copy_latency_ticks": 0, 00:10:04.667 "io_error": {} 00:10:04.667 } 00:10:04.667 ] 00:10:04.667 }' 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # jq -r '.bdevs[0].num_read_ops' 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # io_count2=292868 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 276480 -lt 266756 ']' 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 276480 -gt 292868 ']' 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:04.667 00:10:04.667 Latency(us) 00:10:04.667 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:04.667 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:10:04.667 Malloc_STAT : 2.19 73383.42 286.65 0.00 0.00 3481.53 1008.25 3730.51 00:10:04.667 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:04.667 Malloc_STAT : 2.20 64909.84 253.55 0.00 0.00 3935.33 1033.45 5016.02 00:10:04.667 =================================================================================================================== 00:10:04.667 Total : 138293.27 540.21 0.00 0.00 3694.66 1008.25 5016.02 00:10:04.667 0 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # killprocess 2737409 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- common/autotest_common.sh@948 -- # '[' -z 2737409 ']' 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- common/autotest_common.sh@952 -- # kill -0 2737409 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # uname 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2737409 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2737409' 00:10:04.667 killing process with pid 2737409 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- common/autotest_common.sh@967 -- # kill 2737409 00:10:04.667 Received shutdown signal, test time was about 2.277729 seconds 00:10:04.667 00:10:04.667 Latency(us) 00:10:04.667 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:04.667 =================================================================================================================== 00:10:04.667 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:04.667 17:22:15 blockdev_general.bdev_stat -- common/autotest_common.sh@972 -- # wait 2737409 00:10:04.927 17:22:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@610 -- # trap - SIGINT SIGTERM EXIT 00:10:04.927 00:10:04.927 real 0m3.521s 00:10:04.927 user 0m7.115s 00:10:04.927 sys 0m0.379s 00:10:04.927 17:22:16 blockdev_general.bdev_stat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:04.927 17:22:16 
blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:04.927 ************************************ 00:10:04.927 END TEST bdev_stat 00:10:04.927 ************************************ 00:10:04.927 17:22:16 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:04.927 17:22:16 blockdev_general -- bdev/blockdev.sh@794 -- # [[ bdev == gpt ]] 00:10:04.927 17:22:16 blockdev_general -- bdev/blockdev.sh@798 -- # [[ bdev == crypto_sw ]] 00:10:04.927 17:22:16 blockdev_general -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:10:04.927 17:22:16 blockdev_general -- bdev/blockdev.sh@811 -- # cleanup 00:10:04.927 17:22:16 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:10:04.927 17:22:16 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:10:04.927 17:22:16 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:10:04.927 17:22:16 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:10:04.927 17:22:16 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:10:04.927 17:22:16 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:10:04.927 00:10:04.927 real 1m50.299s 00:10:04.927 user 7m16.860s 00:10:04.927 sys 0m16.478s 00:10:04.927 17:22:16 blockdev_general -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:04.927 17:22:16 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:04.927 ************************************ 00:10:04.927 END TEST blockdev_general 00:10:04.927 ************************************ 00:10:04.927 17:22:16 -- common/autotest_common.sh@1142 -- # return 0 00:10:04.927 17:22:16 -- spdk/autotest.sh@190 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:10:04.927 17:22:16 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:04.927 17:22:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:04.927 17:22:16 -- common/autotest_common.sh@10 -- # set +x 00:10:05.188 ************************************ 00:10:05.188 START TEST bdev_raid 00:10:05.188 ************************************ 00:10:05.188 17:22:16 bdev_raid -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:10:05.188 * Looking for test storage... 
00:10:05.188 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:10:05.188 17:22:16 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:10:05.188 17:22:16 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:10:05.188 17:22:16 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:10:05.188 17:22:16 bdev_raid -- bdev/bdev_raid.sh@851 -- # mkdir -p /raidtest 00:10:05.188 17:22:16 bdev_raid -- bdev/bdev_raid.sh@852 -- # trap 'cleanup; exit 1' EXIT 00:10:05.188 17:22:16 bdev_raid -- bdev/bdev_raid.sh@854 -- # base_blocklen=512 00:10:05.188 17:22:16 bdev_raid -- bdev/bdev_raid.sh@856 -- # uname -s 00:10:05.188 17:22:16 bdev_raid -- bdev/bdev_raid.sh@856 -- # '[' Linux = Linux ']' 00:10:05.188 17:22:16 bdev_raid -- bdev/bdev_raid.sh@856 -- # modprobe -n nbd 00:10:05.188 17:22:16 bdev_raid -- bdev/bdev_raid.sh@857 -- # has_nbd=true 00:10:05.188 17:22:16 bdev_raid -- bdev/bdev_raid.sh@858 -- # modprobe nbd 00:10:05.188 17:22:16 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:10:05.188 17:22:16 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:05.188 17:22:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:05.188 17:22:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:05.188 ************************************ 00:10:05.188 START TEST raid_function_test_raid0 00:10:05.188 ************************************ 00:10:05.188 17:22:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1123 -- # raid_function_test raid0 00:10:05.188 17:22:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:10:05.188 17:22:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:10:05.188 17:22:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:10:05.188 17:22:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=2738115 00:10:05.188 17:22:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 2738115' 00:10:05.188 Process raid pid: 2738115 00:10:05.188 17:22:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 2738115 /var/tmp/spdk-raid.sock 00:10:05.188 17:22:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:05.188 17:22:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@829 -- # '[' -z 2738115 ']' 00:10:05.188 17:22:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:05.188 17:22:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:05.188 17:22:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:05.188 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
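raid_function_test_raid0 runs against a bare bdev_svc app on its own RPC socket rather than against bdevperf. The rpcs.txt it pipes in is not echoed to the log, so the two bdev_malloc_create calls and the bdev_raid_create line below are assumptions inferred from the 'Base_1/Base_2 is claimed' and 'blockcnt 131072, blocklen 512' messages that follow; the remaining commands are taken from the trace:

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  RPC="$SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock"   # dedicated raid-test socket

  # App under test: a bare bdev service with bdev_raid debug logging.
  "$SPDK"/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &

  # Assumed configuration (not shown in the log): two base bdevs striped into raid0.
  $RPC bdev_malloc_create -b Base_1 32 512
  $RPC bdev_malloc_create -b Base_2 32 512
  $RPC bdev_raid_create -n raid -r raid0 -z 64 -b "Base_1 Base_2"

  # Visible in the trace: confirm the raid bdev is online, then export it over NBD.
  $RPC bdev_raid_get_bdevs online | jq -r '.[0]["name"] | select(.)'
  $RPC nbd_start_disk raid /dev/nbd0
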
00:10:05.188 17:22:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:05.188 17:22:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:10:05.188 [2024-07-15 17:22:16.461757] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:10:05.188 [2024-07-15 17:22:16.461815] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:05.449 [2024-07-15 17:22:16.556741] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:05.449 [2024-07-15 17:22:16.652288] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:05.449 [2024-07-15 17:22:16.714792] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:05.449 [2024-07-15 17:22:16.714825] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:06.390 17:22:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:06.390 17:22:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@862 -- # return 0 00:10:06.390 17:22:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:10:06.390 17:22:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:10:06.390 17:22:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:06.390 17:22:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 -- # cat 00:10:06.390 17:22:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:10:06.390 [2024-07-15 17:22:17.549191] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:06.390 [2024-07-15 17:22:17.550319] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:06.390 [2024-07-15 17:22:17.550379] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16c7820 00:10:06.390 [2024-07-15 17:22:17.550385] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:06.390 [2024-07-15 17:22:17.550607] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16cb170 00:10:06.390 [2024-07-15 17:22:17.550708] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16c7820 00:10:06.390 [2024-07-15 17:22:17.550749] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x16c7820 00:10:06.390 [2024-07-15 17:22:17.550839] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:06.390 Base_1 00:10:06.390 Base_2 00:10:06.390 17:22:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:06.390 17:22:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:10:06.390 17:22:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:10:06.650 17:22:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:10:06.650 17:22:17 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:10:06.650 17:22:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:10:06.650 17:22:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:06.651 17:22:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:10:06.651 17:22:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:06.651 17:22:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:10:06.651 17:22:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:10:06.651 17:22:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:10:06.651 17:22:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:06.651 17:22:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:06.651 17:22:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:10:07.221 [2024-07-15 17:22:18.315148] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16caf80 00:10:07.221 /dev/nbd0 00:10:07.221 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:07.221 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:07.221 17:22:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:10:07.221 17:22:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # local i 00:10:07.221 17:22:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:07.221 17:22:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:07.221 17:22:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:10:07.221 17:22:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # break 00:10:07.221 17:22:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:07.221 17:22:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:07.221 17:22:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:07.221 1+0 records in 00:10:07.221 1+0 records out 00:10:07.221 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000310763 s, 13.2 MB/s 00:10:07.222 17:22:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:07.222 17:22:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # size=4096 00:10:07.222 17:22:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:07.222 17:22:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:07.222 17:22:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # return 0 00:10:07.222 17:22:18 
bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:07.222 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:07.222 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:07.222 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:07.222 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:07.482 { 00:10:07.482 "nbd_device": "/dev/nbd0", 00:10:07.482 "bdev_name": "raid" 00:10:07.482 } 00:10:07.482 ]' 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:07.482 { 00:10:07.482 "nbd_device": "/dev/nbd0", 00:10:07.482 "bdev_name": "raid" 00:10:07.482 } 00:10:07.482 ]' 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:10:07.482 
17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:10:07.482 4096+0 records in 00:10:07.482 4096+0 records out 00:10:07.482 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0269659 s, 77.8 MB/s 00:10:07.482 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:10:07.743 4096+0 records in 00:10:07.743 4096+0 records out 00:10:07.743 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.229768 s, 9.1 MB/s 00:10:07.743 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:10:07.743 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:07.743 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:10:07.743 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:07.743 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:10:07.743 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:10:07.743 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:10:07.743 128+0 records in 00:10:07.743 128+0 records out 00:10:07.743 65536 bytes (66 kB, 64 KiB) copied, 0.000378688 s, 173 MB/s 00:10:07.743 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:10:07.743 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:07.743 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:07.743 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:07.743 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:07.743 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:10:07.743 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:10:07.743 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:10:07.743 2035+0 records in 00:10:07.743 2035+0 records out 00:10:07.743 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.00483077 s, 216 MB/s 00:10:07.743 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:10:07.743 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:07.743 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:07.743 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:07.743 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:07.743 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:10:07.743 17:22:18 bdev_raid.raid_function_test_raid0 -- 
bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:10:07.743 17:22:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:10:07.743 456+0 records in 00:10:07.743 456+0 records out 00:10:07.743 233472 bytes (233 kB, 228 KiB) copied, 0.00114064 s, 205 MB/s 00:10:07.743 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:10:07.743 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:07.743 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:07.743 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:07.743 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:07.743 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:10:07.743 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:10:07.743 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:07.743 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:10:07.743 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:07.743 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:10:07.743 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:07.743 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:10:08.003 [2024-07-15 17:22:19.234541] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:08.003 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:08.003 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:08.003 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:08.003 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:08.003 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:08.003 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:08.003 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:10:08.003 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:10:08.003 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:08.003 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:08.003 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:08.264 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:08.264 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:08.264 17:22:19 bdev_raid.raid_function_test_raid0 -- 
bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:08.264 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:08.264 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:10:08.264 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:08.264 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:10:08.264 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 00:10:08.264 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:10:08.264 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:10:08.264 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:10:08.264 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 2738115 00:10:08.264 17:22:19 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@948 -- # '[' -z 2738115 ']' 00:10:08.264 17:22:19 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@952 -- # kill -0 2738115 00:10:08.264 17:22:19 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # uname 00:10:08.264 17:22:19 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:08.264 17:22:19 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2738115 00:10:08.525 17:22:19 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:08.525 17:22:19 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:08.525 17:22:19 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2738115' 00:10:08.525 killing process with pid 2738115 00:10:08.525 17:22:19 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@967 -- # kill 2738115 00:10:08.525 [2024-07-15 17:22:19.579805] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:08.525 [2024-07-15 17:22:19.579873] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:08.525 [2024-07-15 17:22:19.579908] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:08.525 [2024-07-15 17:22:19.579917] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16c7820 name raid, state offline 00:10:08.525 17:22:19 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@972 -- # wait 2738115 00:10:08.525 [2024-07-15 17:22:19.595889] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:08.525 17:22:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:10:08.525 00:10:08.525 real 0m3.335s 00:10:08.525 user 0m4.713s 00:10:08.525 sys 0m0.969s 00:10:08.525 17:22:19 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:08.525 17:22:19 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:10:08.525 ************************************ 00:10:08.525 END TEST raid_function_test_raid0 00:10:08.525 ************************************ 00:10:08.525 17:22:19 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:08.525 17:22:19 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_function_test_concat raid_function_test concat 00:10:08.525 17:22:19 bdev_raid -- 
common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:08.525 17:22:19 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:08.525 17:22:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:08.525 ************************************ 00:10:08.525 START TEST raid_function_test_concat 00:10:08.525 ************************************ 00:10:08.525 17:22:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1123 -- # raid_function_test concat 00:10:08.525 17:22:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:10:08.525 17:22:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:10:08.525 17:22:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:10:08.525 17:22:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=2738794 00:10:08.525 17:22:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 2738794' 00:10:08.525 Process raid pid: 2738794 00:10:08.525 17:22:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 2738794 /var/tmp/spdk-raid.sock 00:10:08.525 17:22:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:08.525 17:22:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@829 -- # '[' -z 2738794 ']' 00:10:08.525 17:22:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:08.525 17:22:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:08.525 17:22:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:08.525 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:08.525 17:22:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:08.525 17:22:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:10:08.785 [2024-07-15 17:22:19.866637] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:10:08.786 [2024-07-15 17:22:19.866688] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:08.786 [2024-07-15 17:22:19.958729] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:08.786 [2024-07-15 17:22:20.030800] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:08.786 [2024-07-15 17:22:20.074128] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:08.786 [2024-07-15 17:22:20.074152] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:09.726 17:22:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:09.726 17:22:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@862 -- # return 0 00:10:09.726 17:22:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:10:09.726 17:22:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:10:09.726 17:22:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:09.726 17:22:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:10:09.726 17:22:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:10:09.726 [2024-07-15 17:22:20.916050] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:09.726 [2024-07-15 17:22:20.917054] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:09.726 [2024-07-15 17:22:20.917094] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x240f820 00:10:09.726 [2024-07-15 17:22:20.917100] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:09.727 [2024-07-15 17:22:20.917282] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24131b0 00:10:09.727 [2024-07-15 17:22:20.917372] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x240f820 00:10:09.727 [2024-07-15 17:22:20.917378] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x240f820 00:10:09.727 [2024-07-15 17:22:20.917453] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:09.727 Base_1 00:10:09.727 Base_2 00:10:09.727 17:22:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:09.727 17:22:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:10:09.727 17:22:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:10:09.987 17:22:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:10:09.987 17:22:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:10:09.987 17:22:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:10:09.987 17:22:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:10:09.987 17:22:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:10:09.987 17:22:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:09.987 17:22:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:10:09.987 17:22:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:10:09.987 17:22:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:10:09.987 17:22:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:09.987 17:22:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:09.987 17:22:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:10:10.558 [2024-07-15 17:22:21.649915] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2412fc0 00:10:10.558 /dev/nbd0 00:10:10.558 17:22:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:10.558 17:22:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:10.558 17:22:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:10:10.558 17:22:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # local i 00:10:10.558 17:22:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:10.558 17:22:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:10.558 17:22:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:10:10.558 17:22:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # break 00:10:10.558 17:22:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:10.558 17:22:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:10.558 17:22:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:10.558 1+0 records in 00:10:10.558 1+0 records out 00:10:10.558 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000276858 s, 14.8 MB/s 00:10:10.558 17:22:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:10.558 17:22:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # size=4096 00:10:10.558 17:22:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:10.558 17:22:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:10.558 17:22:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # return 0 00:10:10.558 17:22:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:10.558 17:22:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:10.558 17:22:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:10.558 
17:22:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:10.558 17:22:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:10.818 { 00:10:10.818 "nbd_device": "/dev/nbd0", 00:10:10.818 "bdev_name": "raid" 00:10:10.818 } 00:10:10.818 ]' 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:10.818 { 00:10:10.818 "nbd_device": "/dev/nbd0", 00:10:10.818 "bdev_name": "raid" 00:10:10.818 } 00:10:10.818 ]' 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:10:10.818 17:22:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd 
if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:10:10.818 4096+0 records in 00:10:10.818 4096+0 records out 00:10:10.818 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0256519 s, 81.8 MB/s 00:10:10.818 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:10:11.079 4096+0 records in 00:10:11.079 4096+0 records out 00:10:11.079 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.169723 s, 12.4 MB/s 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:10:11.079 128+0 records in 00:10:11.079 128+0 records out 00:10:11.079 65536 bytes (66 kB, 64 KiB) copied, 0.000379987 s, 172 MB/s 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:10:11.079 2035+0 records in 00:10:11.079 2035+0 records out 00:10:11.079 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.00480566 s, 217 MB/s 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:10:11.079 456+0 records in 00:10:11.079 456+0 
records out 00:10:11.079 233472 bytes (233 kB, 228 KiB) copied, 0.00114777 s, 203 MB/s 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:11.079 17:22:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:10:11.340 17:22:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:11.340 [2024-07-15 17:22:22.452612] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:11.340 17:22:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:11.340 17:22:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:11.340 17:22:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:11.340 17:22:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:11.340 17:22:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:11.340 17:22:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:10:11.340 17:22:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:10:11.340 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:11.340 17:22:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:11.340 17:22:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:11.601 17:22:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:11.601 17:22:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:11.601 17:22:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:11.601 17:22:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:11.601 17:22:22 bdev_raid.raid_function_test_concat -- 
bdev/nbd_common.sh@65 -- # echo '' 00:10:11.601 17:22:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:11.601 17:22:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # true 00:10:11.601 17:22:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:10:11.601 17:22:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:10:11.601 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:10:11.601 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:10:11.601 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 2738794 00:10:11.601 17:22:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@948 -- # '[' -z 2738794 ']' 00:10:11.601 17:22:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@952 -- # kill -0 2738794 00:10:11.601 17:22:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # uname 00:10:11.601 17:22:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:11.601 17:22:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2738794 00:10:11.601 17:22:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:11.601 17:22:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:11.601 17:22:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2738794' 00:10:11.601 killing process with pid 2738794 00:10:11.601 17:22:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@967 -- # kill 2738794 00:10:11.601 [2024-07-15 17:22:22.765101] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:11.601 [2024-07-15 17:22:22.765145] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:11.601 [2024-07-15 17:22:22.765171] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:11.601 [2024-07-15 17:22:22.765177] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x240f820 name raid, state offline 00:10:11.601 17:22:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@972 -- # wait 2738794 00:10:11.601 [2024-07-15 17:22:22.774322] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:11.601 17:22:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:10:11.601 00:10:11.601 real 0m3.078s 00:10:11.601 user 0m4.470s 00:10:11.601 sys 0m0.818s 00:10:11.601 17:22:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:11.601 17:22:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:10:11.601 ************************************ 00:10:11.601 END TEST raid_function_test_concat 00:10:11.601 ************************************ 00:10:11.862 17:22:22 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:11.862 17:22:22 bdev_raid -- bdev/bdev_raid.sh@863 -- # run_test raid0_resize_test raid0_resize_test 00:10:11.862 17:22:22 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:11.862 17:22:22 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:11.862 17:22:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 
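The raid_function_test_raid0 and raid_function_test_concat runs traced above both drive the same flow: build a two-disk raid bdev from Base_1/Base_2, expose it through /dev/nbd0, write a 2 MiB random pattern, then unmap ranges with blkdiscard and re-verify against a local reference file. The lines below are a minimal sketch of that flow, not the test script itself; the rpc.py path and the assumption that a bdev_svc app with Base_1/Base_2 is already listening on /var/tmp/spdk-raid.sock are illustrative, and offsets/lengths mirror the first iteration of the trace.

#!/bin/bash
# Sketch only -- rpc.py location and pre-created Base_1/Base_2 are assumptions.
set -e
rpc() { scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

rpc bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n raid   # concat run passes the concat level instead of raid0
rpc bdev_raid_get_bdevs online | jq -r '.[0]["name"] | select(.)'   # expect "raid"
rpc nbd_start_disk raid /dev/nbd0

# write a 2 MiB reference pattern through the nbd device and verify it
dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096
dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct
blockdev --flushbufs /dev/nbd0
cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0

# zero a range in the reference file, discard the same range on the device,
# then re-verify; discarded blocks are expected to read back as zeroes
dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc
blkdiscard -o 0 -l 65536 /dev/nbd0
blockdev --flushbufs /dev/nbd0
cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0

rpc nbd_stop_disk /dev/nbd0
rpc nbd_get_disks     # expect an empty list once the disk is stopped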
00:10:11.862 ************************************ 00:10:11.862 START TEST raid0_resize_test 00:10:11.862 ************************************ 00:10:11.862 17:22:22 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1123 -- # raid0_resize_test 00:10:11.862 17:22:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local blksize=512 00:10:11.862 17:22:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local bdev_size_mb=32 00:10:11.862 17:22:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local new_bdev_size_mb=64 00:10:11.862 17:22:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local blkcnt 00:10:11.862 17:22:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local raid_size_mb 00:10:11.862 17:22:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local new_raid_size_mb 00:10:11.862 17:22:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # raid_pid=2739483 00:10:11.862 17:22:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # echo 'Process raid pid: 2739483' 00:10:11.862 Process raid pid: 2739483 00:10:11.862 17:22:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # waitforlisten 2739483 /var/tmp/spdk-raid.sock 00:10:11.862 17:22:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:11.862 17:22:22 bdev_raid.raid0_resize_test -- common/autotest_common.sh@829 -- # '[' -z 2739483 ']' 00:10:11.862 17:22:22 bdev_raid.raid0_resize_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:11.862 17:22:22 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:11.862 17:22:22 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:11.863 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:11.863 17:22:22 bdev_raid.raid0_resize_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:11.863 17:22:22 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:10:11.863 [2024-07-15 17:22:23.018404] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:10:11.863 [2024-07-15 17:22:23.018453] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:11.863 [2024-07-15 17:22:23.109144] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:12.123 [2024-07-15 17:22:23.177257] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:12.123 [2024-07-15 17:22:23.216462] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:12.123 [2024-07-15 17:22:23.216485] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:13.066 17:22:24 bdev_raid.raid0_resize_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:13.066 17:22:24 bdev_raid.raid0_resize_test -- common/autotest_common.sh@862 -- # return 0 00:10:13.066 17:22:24 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:10:13.326 Base_1 00:10:13.326 17:22:24 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:10:13.326 Base_2 00:10:13.326 17:22:24 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:10:13.587 [2024-07-15 17:22:24.781192] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:13.587 [2024-07-15 17:22:24.782316] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:13.587 [2024-07-15 17:22:24.782356] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14a09d0 00:10:13.587 [2024-07-15 17:22:24.782362] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:13.587 [2024-07-15 17:22:24.782515] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14a0cb0 00:10:13.587 [2024-07-15 17:22:24.782581] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14a09d0 00:10:13.587 [2024-07-15 17:22:24.782586] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x14a09d0 00:10:13.587 [2024-07-15 17:22:24.782656] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:13.587 17:22:24 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:10:13.847 [2024-07-15 17:22:24.993748] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:10:13.847 [2024-07-15 17:22:24.993773] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:10:13.847 true 00:10:13.847 17:22:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:10:13.847 17:22:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # jq '.[].num_blocks' 00:10:14.107 [2024-07-15 17:22:25.206378] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:14.107 17:22:25 bdev_raid.raid0_resize_test -- 
bdev/bdev_raid.sh@368 -- # blkcnt=131072 00:10:14.107 17:22:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # raid_size_mb=64 00:10:14.107 17:22:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # '[' 64 '!=' 64 ']' 00:10:14.107 17:22:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:10:14.107 [2024-07-15 17:22:25.398732] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:10:14.107 [2024-07-15 17:22:25.398744] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:10:14.107 [2024-07-15 17:22:25.398759] bdev_raid.c:2289:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:10:14.107 true 00:10:14.368 17:22:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:10:14.368 17:22:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # jq '.[].num_blocks' 00:10:14.368 [2024-07-15 17:22:25.587329] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:14.368 17:22:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # blkcnt=262144 00:10:14.368 17:22:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # raid_size_mb=128 00:10:14.368 17:22:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 128 '!=' 128 ']' 00:10:14.368 17:22:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@386 -- # killprocess 2739483 00:10:14.368 17:22:25 bdev_raid.raid0_resize_test -- common/autotest_common.sh@948 -- # '[' -z 2739483 ']' 00:10:14.368 17:22:25 bdev_raid.raid0_resize_test -- common/autotest_common.sh@952 -- # kill -0 2739483 00:10:14.368 17:22:25 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # uname 00:10:14.368 17:22:25 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:14.368 17:22:25 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2739483 00:10:14.368 17:22:25 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:14.368 17:22:25 bdev_raid.raid0_resize_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:14.368 17:22:25 bdev_raid.raid0_resize_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2739483' 00:10:14.368 killing process with pid 2739483 00:10:14.368 17:22:25 bdev_raid.raid0_resize_test -- common/autotest_common.sh@967 -- # kill 2739483 00:10:14.368 [2024-07-15 17:22:25.656210] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:14.368 [2024-07-15 17:22:25.656250] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:14.368 [2024-07-15 17:22:25.656280] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:14.368 [2024-07-15 17:22:25.656285] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14a09d0 name Raid, state offline 00:10:14.368 17:22:25 bdev_raid.raid0_resize_test -- common/autotest_common.sh@972 -- # wait 2739483 00:10:14.368 [2024-07-15 17:22:25.657200] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:14.630 17:22:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@388 -- # return 0 
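The raid0_resize_test trace above grows a raid0 array online by resizing its null base bdevs and re-reading the array's block count; the array only grows once every base bdev has been enlarged. A condensed sketch of that RPC sequence follows, under the same assumptions as the previous sketch (bdev_svc listening on /var/tmp/spdk-raid.sock, rpc.py path illustrative).

# Sketch only -- same assumptions as the previous sketch.
rpc() { scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

rpc bdev_null_create Base_1 32 512                    # 32 MiB, 512-byte blocks
rpc bdev_null_create Base_2 32 512
rpc bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid

rpc bdev_null_resize Base_1 64                        # grow only one leg...
rpc bdev_get_bdevs -b Raid | jq '.[].num_blocks'      # ...still 131072 (64 MiB)

rpc bdev_null_resize Base_2 64                        # grow the second leg
rpc bdev_get_bdevs -b Raid | jq '.[].num_blocks'      # now 262144 (128 MiB)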
00:10:14.630 00:10:14.630 real 0m2.804s 00:10:14.630 user 0m4.514s 00:10:14.630 sys 0m0.468s 00:10:14.630 17:22:25 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:14.630 17:22:25 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:10:14.630 ************************************ 00:10:14.630 END TEST raid0_resize_test 00:10:14.630 ************************************ 00:10:14.630 17:22:25 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:14.630 17:22:25 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:10:14.630 17:22:25 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:10:14.630 17:22:25 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:10:14.630 17:22:25 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:14.630 17:22:25 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:14.630 17:22:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:14.630 ************************************ 00:10:14.630 START TEST raid_state_function_test 00:10:14.630 ************************************ 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 false 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:10:14.630 17:22:25 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2739967 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2739967' 00:10:14.630 Process raid pid: 2739967 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2739967 /var/tmp/spdk-raid.sock 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2739967 ']' 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:14.630 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:14.630 17:22:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:14.630 [2024-07-15 17:22:25.903319] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:10:14.630 [2024-07-15 17:22:25.903370] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:14.890 [2024-07-15 17:22:25.993396] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:14.890 [2024-07-15 17:22:26.062097] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:14.890 [2024-07-15 17:22:26.113918] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:14.890 [2024-07-15 17:22:26.113941] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:15.461 17:22:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:15.461 17:22:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:10:15.461 17:22:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:15.743 [2024-07-15 17:22:26.913399] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:15.744 [2024-07-15 17:22:26.913429] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:15.744 [2024-07-15 17:22:26.913435] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:15.744 [2024-07-15 17:22:26.913441] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:15.744 17:22:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:15.744 17:22:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:15.744 17:22:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:15.744 17:22:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:15.744 17:22:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:15.744 17:22:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:15.744 17:22:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:15.744 17:22:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:15.744 17:22:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:15.744 17:22:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:15.744 17:22:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:15.744 17:22:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:16.012 17:22:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:16.012 "name": "Existed_Raid", 00:10:16.012 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:16.012 "strip_size_kb": 64, 00:10:16.012 "state": "configuring", 00:10:16.012 "raid_level": "raid0", 00:10:16.012 "superblock": false, 00:10:16.012 
"num_base_bdevs": 2, 00:10:16.012 "num_base_bdevs_discovered": 0, 00:10:16.012 "num_base_bdevs_operational": 2, 00:10:16.012 "base_bdevs_list": [ 00:10:16.012 { 00:10:16.012 "name": "BaseBdev1", 00:10:16.012 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:16.012 "is_configured": false, 00:10:16.012 "data_offset": 0, 00:10:16.012 "data_size": 0 00:10:16.012 }, 00:10:16.012 { 00:10:16.012 "name": "BaseBdev2", 00:10:16.012 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:16.012 "is_configured": false, 00:10:16.012 "data_offset": 0, 00:10:16.012 "data_size": 0 00:10:16.012 } 00:10:16.012 ] 00:10:16.012 }' 00:10:16.012 17:22:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:16.012 17:22:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:16.582 17:22:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:16.582 [2024-07-15 17:22:27.807556] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:16.582 [2024-07-15 17:22:27.807572] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b0f6b0 name Existed_Raid, state configuring 00:10:16.582 17:22:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:16.842 [2024-07-15 17:22:27.996047] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:16.842 [2024-07-15 17:22:27.996068] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:16.842 [2024-07-15 17:22:27.996073] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:16.842 [2024-07-15 17:22:27.996079] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:16.842 17:22:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:17.102 [2024-07-15 17:22:28.179202] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:17.102 BaseBdev1 00:10:17.102 17:22:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:17.102 17:22:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:10:17.102 17:22:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:17.102 17:22:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:17.102 17:22:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:17.102 17:22:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:17.102 17:22:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:17.102 17:22:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:17.362 [ 
00:10:17.362 { 00:10:17.362 "name": "BaseBdev1", 00:10:17.362 "aliases": [ 00:10:17.362 "fae53304-d9cf-4c1a-84a5-8b188c937f97" 00:10:17.362 ], 00:10:17.362 "product_name": "Malloc disk", 00:10:17.362 "block_size": 512, 00:10:17.362 "num_blocks": 65536, 00:10:17.362 "uuid": "fae53304-d9cf-4c1a-84a5-8b188c937f97", 00:10:17.362 "assigned_rate_limits": { 00:10:17.362 "rw_ios_per_sec": 0, 00:10:17.362 "rw_mbytes_per_sec": 0, 00:10:17.362 "r_mbytes_per_sec": 0, 00:10:17.362 "w_mbytes_per_sec": 0 00:10:17.362 }, 00:10:17.362 "claimed": true, 00:10:17.362 "claim_type": "exclusive_write", 00:10:17.362 "zoned": false, 00:10:17.362 "supported_io_types": { 00:10:17.362 "read": true, 00:10:17.362 "write": true, 00:10:17.362 "unmap": true, 00:10:17.362 "flush": true, 00:10:17.362 "reset": true, 00:10:17.362 "nvme_admin": false, 00:10:17.362 "nvme_io": false, 00:10:17.362 "nvme_io_md": false, 00:10:17.362 "write_zeroes": true, 00:10:17.362 "zcopy": true, 00:10:17.362 "get_zone_info": false, 00:10:17.362 "zone_management": false, 00:10:17.362 "zone_append": false, 00:10:17.362 "compare": false, 00:10:17.362 "compare_and_write": false, 00:10:17.362 "abort": true, 00:10:17.362 "seek_hole": false, 00:10:17.362 "seek_data": false, 00:10:17.362 "copy": true, 00:10:17.362 "nvme_iov_md": false 00:10:17.362 }, 00:10:17.362 "memory_domains": [ 00:10:17.362 { 00:10:17.362 "dma_device_id": "system", 00:10:17.362 "dma_device_type": 1 00:10:17.362 }, 00:10:17.362 { 00:10:17.362 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:17.362 "dma_device_type": 2 00:10:17.362 } 00:10:17.362 ], 00:10:17.362 "driver_specific": {} 00:10:17.362 } 00:10:17.362 ] 00:10:17.362 17:22:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:17.362 17:22:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:17.362 17:22:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:17.362 17:22:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:17.362 17:22:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:17.362 17:22:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:17.362 17:22:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:17.362 17:22:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:17.362 17:22:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:17.362 17:22:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:17.362 17:22:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:17.362 17:22:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:17.362 17:22:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:17.622 17:22:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:17.622 "name": "Existed_Raid", 00:10:17.622 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:17.622 "strip_size_kb": 64, 00:10:17.622 "state": "configuring", 00:10:17.622 "raid_level": "raid0", 
00:10:17.622 "superblock": false, 00:10:17.622 "num_base_bdevs": 2, 00:10:17.622 "num_base_bdevs_discovered": 1, 00:10:17.622 "num_base_bdevs_operational": 2, 00:10:17.622 "base_bdevs_list": [ 00:10:17.622 { 00:10:17.622 "name": "BaseBdev1", 00:10:17.622 "uuid": "fae53304-d9cf-4c1a-84a5-8b188c937f97", 00:10:17.622 "is_configured": true, 00:10:17.622 "data_offset": 0, 00:10:17.622 "data_size": 65536 00:10:17.622 }, 00:10:17.622 { 00:10:17.622 "name": "BaseBdev2", 00:10:17.622 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:17.622 "is_configured": false, 00:10:17.622 "data_offset": 0, 00:10:17.622 "data_size": 0 00:10:17.622 } 00:10:17.622 ] 00:10:17.622 }' 00:10:17.622 17:22:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:17.622 17:22:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:18.193 17:22:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:18.193 [2024-07-15 17:22:29.466449] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:18.193 [2024-07-15 17:22:29.466476] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b0efa0 name Existed_Raid, state configuring 00:10:18.193 17:22:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:18.452 [2024-07-15 17:22:29.650942] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:18.452 [2024-07-15 17:22:29.652075] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:18.452 [2024-07-15 17:22:29.652098] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:18.452 17:22:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:18.452 17:22:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:18.452 17:22:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:18.453 17:22:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:18.453 17:22:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:18.453 17:22:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:18.453 17:22:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:18.453 17:22:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:18.453 17:22:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:18.453 17:22:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:18.453 17:22:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:18.453 17:22:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:18.453 17:22:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs 
all 00:10:18.453 17:22:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:18.712 17:22:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:18.712 "name": "Existed_Raid", 00:10:18.712 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:18.712 "strip_size_kb": 64, 00:10:18.712 "state": "configuring", 00:10:18.712 "raid_level": "raid0", 00:10:18.712 "superblock": false, 00:10:18.712 "num_base_bdevs": 2, 00:10:18.712 "num_base_bdevs_discovered": 1, 00:10:18.712 "num_base_bdevs_operational": 2, 00:10:18.712 "base_bdevs_list": [ 00:10:18.712 { 00:10:18.712 "name": "BaseBdev1", 00:10:18.712 "uuid": "fae53304-d9cf-4c1a-84a5-8b188c937f97", 00:10:18.712 "is_configured": true, 00:10:18.712 "data_offset": 0, 00:10:18.712 "data_size": 65536 00:10:18.712 }, 00:10:18.712 { 00:10:18.712 "name": "BaseBdev2", 00:10:18.712 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:18.712 "is_configured": false, 00:10:18.712 "data_offset": 0, 00:10:18.712 "data_size": 0 00:10:18.712 } 00:10:18.712 ] 00:10:18.712 }' 00:10:18.712 17:22:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:18.712 17:22:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:19.282 17:22:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:19.542 [2024-07-15 17:22:30.590005] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:19.542 [2024-07-15 17:22:30.590032] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b0fd90 00:10:19.542 [2024-07-15 17:22:30.590037] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:19.542 [2024-07-15 17:22:30.590181] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cb3730 00:10:19.542 [2024-07-15 17:22:30.590270] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b0fd90 00:10:19.542 [2024-07-15 17:22:30.590275] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1b0fd90 00:10:19.542 [2024-07-15 17:22:30.590400] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:19.542 BaseBdev2 00:10:19.542 17:22:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:19.542 17:22:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:10:19.542 17:22:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:19.542 17:22:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:19.542 17:22:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:19.542 17:22:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:19.542 17:22:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:19.542 17:22:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:19.802 [ 
00:10:19.802 { 00:10:19.802 "name": "BaseBdev2", 00:10:19.802 "aliases": [ 00:10:19.802 "06f871d9-5bec-4010-afb1-efa8e9ad7d37" 00:10:19.802 ], 00:10:19.802 "product_name": "Malloc disk", 00:10:19.802 "block_size": 512, 00:10:19.802 "num_blocks": 65536, 00:10:19.802 "uuid": "06f871d9-5bec-4010-afb1-efa8e9ad7d37", 00:10:19.802 "assigned_rate_limits": { 00:10:19.802 "rw_ios_per_sec": 0, 00:10:19.802 "rw_mbytes_per_sec": 0, 00:10:19.802 "r_mbytes_per_sec": 0, 00:10:19.802 "w_mbytes_per_sec": 0 00:10:19.802 }, 00:10:19.802 "claimed": true, 00:10:19.802 "claim_type": "exclusive_write", 00:10:19.802 "zoned": false, 00:10:19.802 "supported_io_types": { 00:10:19.802 "read": true, 00:10:19.802 "write": true, 00:10:19.802 "unmap": true, 00:10:19.802 "flush": true, 00:10:19.802 "reset": true, 00:10:19.802 "nvme_admin": false, 00:10:19.802 "nvme_io": false, 00:10:19.802 "nvme_io_md": false, 00:10:19.802 "write_zeroes": true, 00:10:19.802 "zcopy": true, 00:10:19.802 "get_zone_info": false, 00:10:19.802 "zone_management": false, 00:10:19.802 "zone_append": false, 00:10:19.802 "compare": false, 00:10:19.802 "compare_and_write": false, 00:10:19.802 "abort": true, 00:10:19.802 "seek_hole": false, 00:10:19.802 "seek_data": false, 00:10:19.802 "copy": true, 00:10:19.802 "nvme_iov_md": false 00:10:19.802 }, 00:10:19.802 "memory_domains": [ 00:10:19.802 { 00:10:19.802 "dma_device_id": "system", 00:10:19.802 "dma_device_type": 1 00:10:19.802 }, 00:10:19.802 { 00:10:19.802 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:19.802 "dma_device_type": 2 00:10:19.802 } 00:10:19.802 ], 00:10:19.802 "driver_specific": {} 00:10:19.802 } 00:10:19.802 ] 00:10:19.802 17:22:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:19.802 17:22:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:19.802 17:22:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:19.802 17:22:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:10:19.802 17:22:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:19.802 17:22:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:19.802 17:22:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:19.802 17:22:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:19.802 17:22:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:19.802 17:22:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:19.802 17:22:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:19.802 17:22:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:19.802 17:22:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:19.802 17:22:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:19.802 17:22:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:20.062 17:22:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:10:20.062 "name": "Existed_Raid", 00:10:20.062 "uuid": "ae443048-5c75-4e27-94cb-02704310356d", 00:10:20.062 "strip_size_kb": 64, 00:10:20.062 "state": "online", 00:10:20.062 "raid_level": "raid0", 00:10:20.062 "superblock": false, 00:10:20.062 "num_base_bdevs": 2, 00:10:20.062 "num_base_bdevs_discovered": 2, 00:10:20.062 "num_base_bdevs_operational": 2, 00:10:20.062 "base_bdevs_list": [ 00:10:20.062 { 00:10:20.062 "name": "BaseBdev1", 00:10:20.062 "uuid": "fae53304-d9cf-4c1a-84a5-8b188c937f97", 00:10:20.062 "is_configured": true, 00:10:20.062 "data_offset": 0, 00:10:20.062 "data_size": 65536 00:10:20.062 }, 00:10:20.062 { 00:10:20.062 "name": "BaseBdev2", 00:10:20.062 "uuid": "06f871d9-5bec-4010-afb1-efa8e9ad7d37", 00:10:20.062 "is_configured": true, 00:10:20.062 "data_offset": 0, 00:10:20.062 "data_size": 65536 00:10:20.062 } 00:10:20.062 ] 00:10:20.062 }' 00:10:20.062 17:22:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:20.062 17:22:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:20.632 17:22:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:20.632 17:22:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:20.632 17:22:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:20.632 17:22:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:20.632 17:22:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:20.632 17:22:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:20.632 17:22:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:20.632 17:22:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:20.632 [2024-07-15 17:22:31.913585] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:20.891 17:22:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:20.891 "name": "Existed_Raid", 00:10:20.892 "aliases": [ 00:10:20.892 "ae443048-5c75-4e27-94cb-02704310356d" 00:10:20.892 ], 00:10:20.892 "product_name": "Raid Volume", 00:10:20.892 "block_size": 512, 00:10:20.892 "num_blocks": 131072, 00:10:20.892 "uuid": "ae443048-5c75-4e27-94cb-02704310356d", 00:10:20.892 "assigned_rate_limits": { 00:10:20.892 "rw_ios_per_sec": 0, 00:10:20.892 "rw_mbytes_per_sec": 0, 00:10:20.892 "r_mbytes_per_sec": 0, 00:10:20.892 "w_mbytes_per_sec": 0 00:10:20.892 }, 00:10:20.892 "claimed": false, 00:10:20.892 "zoned": false, 00:10:20.892 "supported_io_types": { 00:10:20.892 "read": true, 00:10:20.892 "write": true, 00:10:20.892 "unmap": true, 00:10:20.892 "flush": true, 00:10:20.892 "reset": true, 00:10:20.892 "nvme_admin": false, 00:10:20.892 "nvme_io": false, 00:10:20.892 "nvme_io_md": false, 00:10:20.892 "write_zeroes": true, 00:10:20.892 "zcopy": false, 00:10:20.892 "get_zone_info": false, 00:10:20.892 "zone_management": false, 00:10:20.892 "zone_append": false, 00:10:20.892 "compare": false, 00:10:20.892 "compare_and_write": false, 00:10:20.892 "abort": false, 00:10:20.892 "seek_hole": false, 00:10:20.892 "seek_data": false, 00:10:20.892 "copy": false, 00:10:20.892 "nvme_iov_md": false 00:10:20.892 }, 00:10:20.892 
"memory_domains": [ 00:10:20.892 { 00:10:20.892 "dma_device_id": "system", 00:10:20.892 "dma_device_type": 1 00:10:20.892 }, 00:10:20.892 { 00:10:20.892 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:20.892 "dma_device_type": 2 00:10:20.892 }, 00:10:20.892 { 00:10:20.892 "dma_device_id": "system", 00:10:20.892 "dma_device_type": 1 00:10:20.892 }, 00:10:20.892 { 00:10:20.892 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:20.892 "dma_device_type": 2 00:10:20.892 } 00:10:20.892 ], 00:10:20.892 "driver_specific": { 00:10:20.892 "raid": { 00:10:20.892 "uuid": "ae443048-5c75-4e27-94cb-02704310356d", 00:10:20.892 "strip_size_kb": 64, 00:10:20.892 "state": "online", 00:10:20.892 "raid_level": "raid0", 00:10:20.892 "superblock": false, 00:10:20.892 "num_base_bdevs": 2, 00:10:20.892 "num_base_bdevs_discovered": 2, 00:10:20.892 "num_base_bdevs_operational": 2, 00:10:20.892 "base_bdevs_list": [ 00:10:20.892 { 00:10:20.892 "name": "BaseBdev1", 00:10:20.892 "uuid": "fae53304-d9cf-4c1a-84a5-8b188c937f97", 00:10:20.892 "is_configured": true, 00:10:20.892 "data_offset": 0, 00:10:20.892 "data_size": 65536 00:10:20.892 }, 00:10:20.892 { 00:10:20.892 "name": "BaseBdev2", 00:10:20.892 "uuid": "06f871d9-5bec-4010-afb1-efa8e9ad7d37", 00:10:20.892 "is_configured": true, 00:10:20.892 "data_offset": 0, 00:10:20.892 "data_size": 65536 00:10:20.892 } 00:10:20.892 ] 00:10:20.892 } 00:10:20.892 } 00:10:20.892 }' 00:10:20.892 17:22:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:20.892 17:22:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:20.892 BaseBdev2' 00:10:20.892 17:22:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:20.892 17:22:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:20.892 17:22:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:20.892 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:20.892 "name": "BaseBdev1", 00:10:20.892 "aliases": [ 00:10:20.892 "fae53304-d9cf-4c1a-84a5-8b188c937f97" 00:10:20.892 ], 00:10:20.892 "product_name": "Malloc disk", 00:10:20.892 "block_size": 512, 00:10:20.892 "num_blocks": 65536, 00:10:20.892 "uuid": "fae53304-d9cf-4c1a-84a5-8b188c937f97", 00:10:20.892 "assigned_rate_limits": { 00:10:20.892 "rw_ios_per_sec": 0, 00:10:20.892 "rw_mbytes_per_sec": 0, 00:10:20.892 "r_mbytes_per_sec": 0, 00:10:20.892 "w_mbytes_per_sec": 0 00:10:20.892 }, 00:10:20.892 "claimed": true, 00:10:20.892 "claim_type": "exclusive_write", 00:10:20.892 "zoned": false, 00:10:20.892 "supported_io_types": { 00:10:20.892 "read": true, 00:10:20.892 "write": true, 00:10:20.892 "unmap": true, 00:10:20.892 "flush": true, 00:10:20.892 "reset": true, 00:10:20.892 "nvme_admin": false, 00:10:20.892 "nvme_io": false, 00:10:20.892 "nvme_io_md": false, 00:10:20.892 "write_zeroes": true, 00:10:20.892 "zcopy": true, 00:10:20.892 "get_zone_info": false, 00:10:20.892 "zone_management": false, 00:10:20.892 "zone_append": false, 00:10:20.892 "compare": false, 00:10:20.892 "compare_and_write": false, 00:10:20.892 "abort": true, 00:10:20.892 "seek_hole": false, 00:10:20.892 "seek_data": false, 00:10:20.892 "copy": true, 00:10:20.892 "nvme_iov_md": false 00:10:20.892 }, 00:10:20.892 
"memory_domains": [ 00:10:20.892 { 00:10:20.892 "dma_device_id": "system", 00:10:20.892 "dma_device_type": 1 00:10:20.892 }, 00:10:20.892 { 00:10:20.892 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:20.892 "dma_device_type": 2 00:10:20.892 } 00:10:20.892 ], 00:10:20.892 "driver_specific": {} 00:10:20.892 }' 00:10:20.892 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:21.151 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:21.151 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:21.151 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:21.151 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:21.151 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:21.151 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:21.151 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:21.151 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:21.151 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:21.411 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:21.411 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:21.411 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:21.411 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:21.411 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:21.411 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:21.411 "name": "BaseBdev2", 00:10:21.411 "aliases": [ 00:10:21.411 "06f871d9-5bec-4010-afb1-efa8e9ad7d37" 00:10:21.411 ], 00:10:21.411 "product_name": "Malloc disk", 00:10:21.411 "block_size": 512, 00:10:21.411 "num_blocks": 65536, 00:10:21.411 "uuid": "06f871d9-5bec-4010-afb1-efa8e9ad7d37", 00:10:21.411 "assigned_rate_limits": { 00:10:21.411 "rw_ios_per_sec": 0, 00:10:21.411 "rw_mbytes_per_sec": 0, 00:10:21.411 "r_mbytes_per_sec": 0, 00:10:21.411 "w_mbytes_per_sec": 0 00:10:21.411 }, 00:10:21.411 "claimed": true, 00:10:21.411 "claim_type": "exclusive_write", 00:10:21.411 "zoned": false, 00:10:21.411 "supported_io_types": { 00:10:21.411 "read": true, 00:10:21.411 "write": true, 00:10:21.411 "unmap": true, 00:10:21.411 "flush": true, 00:10:21.411 "reset": true, 00:10:21.411 "nvme_admin": false, 00:10:21.411 "nvme_io": false, 00:10:21.411 "nvme_io_md": false, 00:10:21.411 "write_zeroes": true, 00:10:21.411 "zcopy": true, 00:10:21.411 "get_zone_info": false, 00:10:21.411 "zone_management": false, 00:10:21.411 "zone_append": false, 00:10:21.411 "compare": false, 00:10:21.411 "compare_and_write": false, 00:10:21.411 "abort": true, 00:10:21.411 "seek_hole": false, 00:10:21.411 "seek_data": false, 00:10:21.411 "copy": true, 00:10:21.411 "nvme_iov_md": false 00:10:21.411 }, 00:10:21.411 "memory_domains": [ 00:10:21.411 { 00:10:21.411 "dma_device_id": "system", 00:10:21.411 "dma_device_type": 1 00:10:21.411 }, 00:10:21.411 { 00:10:21.411 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:10:21.411 "dma_device_type": 2 00:10:21.411 } 00:10:21.411 ], 00:10:21.411 "driver_specific": {} 00:10:21.411 }' 00:10:21.411 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:21.697 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:21.697 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:21.697 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:21.697 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:21.697 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:21.697 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:21.697 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:21.697 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:21.697 17:22:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:21.956 17:22:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:21.956 17:22:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:21.956 17:22:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:21.956 [2024-07-15 17:22:33.224726] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:21.957 [2024-07-15 17:22:33.224744] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:21.957 [2024-07-15 17:22:33.224772] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:21.957 17:22:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:21.957 17:22:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:10:21.957 17:22:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:21.957 17:22:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:21.957 17:22:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:10:21.957 17:22:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:10:21.957 17:22:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:21.957 17:22:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:10:21.957 17:22:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:21.957 17:22:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:21.957 17:22:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:21.957 17:22:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:21.957 17:22:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:21.957 17:22:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:21.957 17:22:33 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:21.957 17:22:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:21.957 17:22:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:22.217 17:22:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:22.217 "name": "Existed_Raid", 00:10:22.217 "uuid": "ae443048-5c75-4e27-94cb-02704310356d", 00:10:22.217 "strip_size_kb": 64, 00:10:22.217 "state": "offline", 00:10:22.217 "raid_level": "raid0", 00:10:22.217 "superblock": false, 00:10:22.217 "num_base_bdevs": 2, 00:10:22.217 "num_base_bdevs_discovered": 1, 00:10:22.217 "num_base_bdevs_operational": 1, 00:10:22.217 "base_bdevs_list": [ 00:10:22.217 { 00:10:22.217 "name": null, 00:10:22.217 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:22.217 "is_configured": false, 00:10:22.217 "data_offset": 0, 00:10:22.217 "data_size": 65536 00:10:22.217 }, 00:10:22.217 { 00:10:22.217 "name": "BaseBdev2", 00:10:22.217 "uuid": "06f871d9-5bec-4010-afb1-efa8e9ad7d37", 00:10:22.217 "is_configured": true, 00:10:22.217 "data_offset": 0, 00:10:22.217 "data_size": 65536 00:10:22.217 } 00:10:22.217 ] 00:10:22.217 }' 00:10:22.217 17:22:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:22.217 17:22:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:22.784 17:22:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:22.784 17:22:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:22.784 17:22:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:22.784 17:22:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:23.043 17:22:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:23.043 17:22:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:23.043 17:22:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:23.302 [2024-07-15 17:22:34.347565] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:23.302 [2024-07-15 17:22:34.347603] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b0fd90 name Existed_Raid, state offline 00:10:23.302 17:22:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:23.302 17:22:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:23.302 17:22:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:23.302 17:22:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:23.302 17:22:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:23.302 17:22:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:23.302 17:22:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:23.302 17:22:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2739967 00:10:23.302 17:22:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2739967 ']' 00:10:23.302 17:22:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2739967 00:10:23.302 17:22:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:10:23.302 17:22:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:23.302 17:22:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2739967 00:10:23.562 17:22:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:23.562 17:22:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:23.562 17:22:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2739967' 00:10:23.562 killing process with pid 2739967 00:10:23.562 17:22:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2739967 00:10:23.562 [2024-07-15 17:22:34.609722] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:23.562 17:22:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2739967 00:10:23.562 [2024-07-15 17:22:34.610333] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:23.562 17:22:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:10:23.562 00:10:23.562 real 0m8.887s 00:10:23.562 user 0m16.172s 00:10:23.562 sys 0m1.329s 00:10:23.562 17:22:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:23.562 17:22:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:23.562 ************************************ 00:10:23.562 END TEST raid_state_function_test 00:10:23.562 ************************************ 00:10:23.562 17:22:34 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:23.562 17:22:34 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:10:23.562 17:22:34 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:23.562 17:22:34 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:23.562 17:22:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:23.562 ************************************ 00:10:23.562 START TEST raid_state_function_test_sb 00:10:23.562 ************************************ 00:10:23.562 17:22:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 true 00:10:23.562 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:10:23.562 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:23.562 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:10:23.562 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= 
num_base_bdevs )) 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2741676 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2741676' 00:10:23.563 Process raid pid: 2741676 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2741676 /var/tmp/spdk-raid.sock 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2741676 ']' 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:23.563 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:23.563 17:22:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:23.822 [2024-07-15 17:22:34.863852] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:10:23.822 [2024-07-15 17:22:34.863898] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:23.822 [2024-07-15 17:22:34.953437] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:23.822 [2024-07-15 17:22:35.018944] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:23.822 [2024-07-15 17:22:35.070796] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:23.822 [2024-07-15 17:22:35.070820] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:24.760 17:22:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:24.760 17:22:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:10:24.761 17:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:24.761 [2024-07-15 17:22:35.878579] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:24.761 [2024-07-15 17:22:35.878607] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:24.761 [2024-07-15 17:22:35.878613] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:24.761 [2024-07-15 17:22:35.878619] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:24.761 17:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:24.761 17:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:24.761 17:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:24.761 17:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:24.761 17:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:24.761 17:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:24.761 17:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:24.761 17:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:24.761 17:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:24.761 17:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:24.761 17:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:24.761 17:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:25.021 17:22:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:25.021 "name": "Existed_Raid", 00:10:25.021 "uuid": "d6c1f005-21b6-4655-b172-f5b9654c9501", 00:10:25.021 "strip_size_kb": 64, 00:10:25.021 "state": "configuring", 00:10:25.021 "raid_level": 
"raid0", 00:10:25.021 "superblock": true, 00:10:25.021 "num_base_bdevs": 2, 00:10:25.021 "num_base_bdevs_discovered": 0, 00:10:25.021 "num_base_bdevs_operational": 2, 00:10:25.021 "base_bdevs_list": [ 00:10:25.021 { 00:10:25.021 "name": "BaseBdev1", 00:10:25.021 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:25.021 "is_configured": false, 00:10:25.021 "data_offset": 0, 00:10:25.021 "data_size": 0 00:10:25.021 }, 00:10:25.021 { 00:10:25.021 "name": "BaseBdev2", 00:10:25.021 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:25.021 "is_configured": false, 00:10:25.021 "data_offset": 0, 00:10:25.021 "data_size": 0 00:10:25.021 } 00:10:25.021 ] 00:10:25.021 }' 00:10:25.021 17:22:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:25.021 17:22:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:25.591 17:22:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:25.591 [2024-07-15 17:22:36.816829] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:25.591 [2024-07-15 17:22:36.816846] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9a66b0 name Existed_Raid, state configuring 00:10:25.591 17:22:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:25.851 [2024-07-15 17:22:36.993303] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:25.851 [2024-07-15 17:22:36.993322] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:25.851 [2024-07-15 17:22:36.993327] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:25.851 [2024-07-15 17:22:36.993333] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:25.851 17:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:26.111 [2024-07-15 17:22:37.188460] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:26.111 BaseBdev1 00:10:26.111 17:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:26.111 17:22:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:10:26.111 17:22:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:26.111 17:22:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:10:26.111 17:22:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:26.111 17:22:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:26.111 17:22:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:26.111 17:22:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:26.371 [ 00:10:26.371 { 00:10:26.371 "name": "BaseBdev1", 00:10:26.371 "aliases": [ 00:10:26.371 "a073d201-c4b9-4b35-8328-52d31ee321e9" 00:10:26.371 ], 00:10:26.371 "product_name": "Malloc disk", 00:10:26.371 "block_size": 512, 00:10:26.371 "num_blocks": 65536, 00:10:26.371 "uuid": "a073d201-c4b9-4b35-8328-52d31ee321e9", 00:10:26.371 "assigned_rate_limits": { 00:10:26.371 "rw_ios_per_sec": 0, 00:10:26.371 "rw_mbytes_per_sec": 0, 00:10:26.371 "r_mbytes_per_sec": 0, 00:10:26.371 "w_mbytes_per_sec": 0 00:10:26.371 }, 00:10:26.371 "claimed": true, 00:10:26.371 "claim_type": "exclusive_write", 00:10:26.371 "zoned": false, 00:10:26.371 "supported_io_types": { 00:10:26.371 "read": true, 00:10:26.371 "write": true, 00:10:26.371 "unmap": true, 00:10:26.371 "flush": true, 00:10:26.371 "reset": true, 00:10:26.371 "nvme_admin": false, 00:10:26.371 "nvme_io": false, 00:10:26.371 "nvme_io_md": false, 00:10:26.371 "write_zeroes": true, 00:10:26.371 "zcopy": true, 00:10:26.371 "get_zone_info": false, 00:10:26.371 "zone_management": false, 00:10:26.371 "zone_append": false, 00:10:26.371 "compare": false, 00:10:26.371 "compare_and_write": false, 00:10:26.371 "abort": true, 00:10:26.371 "seek_hole": false, 00:10:26.371 "seek_data": false, 00:10:26.371 "copy": true, 00:10:26.371 "nvme_iov_md": false 00:10:26.371 }, 00:10:26.371 "memory_domains": [ 00:10:26.371 { 00:10:26.371 "dma_device_id": "system", 00:10:26.371 "dma_device_type": 1 00:10:26.371 }, 00:10:26.371 { 00:10:26.371 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:26.371 "dma_device_type": 2 00:10:26.371 } 00:10:26.371 ], 00:10:26.371 "driver_specific": {} 00:10:26.371 } 00:10:26.371 ] 00:10:26.371 17:22:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:10:26.371 17:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:26.371 17:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:26.371 17:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:26.371 17:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:26.371 17:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:26.371 17:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:26.371 17:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:26.371 17:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:26.371 17:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:26.371 17:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:26.371 17:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:26.371 17:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:26.630 17:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:26.630 "name": 
"Existed_Raid", 00:10:26.630 "uuid": "36b80768-c171-4162-920e-c9db7ae418d4", 00:10:26.630 "strip_size_kb": 64, 00:10:26.630 "state": "configuring", 00:10:26.630 "raid_level": "raid0", 00:10:26.630 "superblock": true, 00:10:26.630 "num_base_bdevs": 2, 00:10:26.630 "num_base_bdevs_discovered": 1, 00:10:26.630 "num_base_bdevs_operational": 2, 00:10:26.630 "base_bdevs_list": [ 00:10:26.630 { 00:10:26.630 "name": "BaseBdev1", 00:10:26.630 "uuid": "a073d201-c4b9-4b35-8328-52d31ee321e9", 00:10:26.630 "is_configured": true, 00:10:26.630 "data_offset": 2048, 00:10:26.630 "data_size": 63488 00:10:26.630 }, 00:10:26.630 { 00:10:26.630 "name": "BaseBdev2", 00:10:26.630 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:26.630 "is_configured": false, 00:10:26.630 "data_offset": 0, 00:10:26.630 "data_size": 0 00:10:26.630 } 00:10:26.630 ] 00:10:26.630 }' 00:10:26.630 17:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:26.630 17:22:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:27.200 17:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:27.200 [2024-07-15 17:22:38.483741] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:27.200 [2024-07-15 17:22:38.483767] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9a5fa0 name Existed_Raid, state configuring 00:10:27.459 17:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:27.459 [2024-07-15 17:22:38.668234] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:27.459 [2024-07-15 17:22:38.669359] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:27.459 [2024-07-15 17:22:38.669382] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:27.459 17:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:27.459 17:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:27.459 17:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:27.459 17:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:27.459 17:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:27.459 17:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:27.459 17:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:27.459 17:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:27.459 17:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:27.459 17:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:27.459 17:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:27.459 17:22:38 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:10:27.459 17:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:27.459 17:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:27.718 17:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:27.718 "name": "Existed_Raid", 00:10:27.718 "uuid": "2e74d68a-6291-4668-8cae-aa85a0318d1a", 00:10:27.718 "strip_size_kb": 64, 00:10:27.718 "state": "configuring", 00:10:27.718 "raid_level": "raid0", 00:10:27.718 "superblock": true, 00:10:27.718 "num_base_bdevs": 2, 00:10:27.718 "num_base_bdevs_discovered": 1, 00:10:27.718 "num_base_bdevs_operational": 2, 00:10:27.718 "base_bdevs_list": [ 00:10:27.718 { 00:10:27.718 "name": "BaseBdev1", 00:10:27.718 "uuid": "a073d201-c4b9-4b35-8328-52d31ee321e9", 00:10:27.718 "is_configured": true, 00:10:27.718 "data_offset": 2048, 00:10:27.718 "data_size": 63488 00:10:27.718 }, 00:10:27.718 { 00:10:27.718 "name": "BaseBdev2", 00:10:27.718 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:27.718 "is_configured": false, 00:10:27.718 "data_offset": 0, 00:10:27.718 "data_size": 0 00:10:27.718 } 00:10:27.718 ] 00:10:27.718 }' 00:10:27.718 17:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:27.718 17:22:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:28.287 17:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:28.548 [2024-07-15 17:22:39.595329] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:28.548 [2024-07-15 17:22:39.595435] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x9a6d90 00:10:28.548 [2024-07-15 17:22:39.595443] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:28.548 [2024-07-15 17:22:39.595578] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb5a750 00:10:28.548 [2024-07-15 17:22:39.595666] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9a6d90 00:10:28.548 [2024-07-15 17:22:39.595672] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x9a6d90 00:10:28.548 [2024-07-15 17:22:39.595746] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:28.548 BaseBdev2 00:10:28.548 17:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:28.548 17:22:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:10:28.548 17:22:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:28.548 17:22:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:10:28.548 17:22:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:28.548 17:22:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:28.548 17:22:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:28.548 17:22:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:28.809 [ 00:10:28.809 { 00:10:28.809 "name": "BaseBdev2", 00:10:28.809 "aliases": [ 00:10:28.809 "7b6d5a46-ad83-405e-9a95-d93124299157" 00:10:28.809 ], 00:10:28.809 "product_name": "Malloc disk", 00:10:28.809 "block_size": 512, 00:10:28.809 "num_blocks": 65536, 00:10:28.809 "uuid": "7b6d5a46-ad83-405e-9a95-d93124299157", 00:10:28.809 "assigned_rate_limits": { 00:10:28.809 "rw_ios_per_sec": 0, 00:10:28.809 "rw_mbytes_per_sec": 0, 00:10:28.809 "r_mbytes_per_sec": 0, 00:10:28.809 "w_mbytes_per_sec": 0 00:10:28.809 }, 00:10:28.809 "claimed": true, 00:10:28.809 "claim_type": "exclusive_write", 00:10:28.809 "zoned": false, 00:10:28.809 "supported_io_types": { 00:10:28.809 "read": true, 00:10:28.809 "write": true, 00:10:28.809 "unmap": true, 00:10:28.809 "flush": true, 00:10:28.809 "reset": true, 00:10:28.809 "nvme_admin": false, 00:10:28.809 "nvme_io": false, 00:10:28.809 "nvme_io_md": false, 00:10:28.809 "write_zeroes": true, 00:10:28.809 "zcopy": true, 00:10:28.809 "get_zone_info": false, 00:10:28.809 "zone_management": false, 00:10:28.809 "zone_append": false, 00:10:28.809 "compare": false, 00:10:28.809 "compare_and_write": false, 00:10:28.809 "abort": true, 00:10:28.809 "seek_hole": false, 00:10:28.809 "seek_data": false, 00:10:28.809 "copy": true, 00:10:28.809 "nvme_iov_md": false 00:10:28.809 }, 00:10:28.809 "memory_domains": [ 00:10:28.809 { 00:10:28.809 "dma_device_id": "system", 00:10:28.809 "dma_device_type": 1 00:10:28.809 }, 00:10:28.809 { 00:10:28.809 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:28.809 "dma_device_type": 2 00:10:28.809 } 00:10:28.809 ], 00:10:28.809 "driver_specific": {} 00:10:28.809 } 00:10:28.809 ] 00:10:28.809 17:22:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:10:28.809 17:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:28.809 17:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:28.809 17:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:10:28.809 17:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:28.809 17:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:28.809 17:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:28.809 17:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:28.809 17:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:28.810 17:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:28.810 17:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:28.810 17:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:28.810 17:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:28.810 17:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:28.810 17:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:29.070 17:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:29.070 "name": "Existed_Raid", 00:10:29.070 "uuid": "2e74d68a-6291-4668-8cae-aa85a0318d1a", 00:10:29.070 "strip_size_kb": 64, 00:10:29.070 "state": "online", 00:10:29.070 "raid_level": "raid0", 00:10:29.070 "superblock": true, 00:10:29.070 "num_base_bdevs": 2, 00:10:29.070 "num_base_bdevs_discovered": 2, 00:10:29.070 "num_base_bdevs_operational": 2, 00:10:29.070 "base_bdevs_list": [ 00:10:29.070 { 00:10:29.070 "name": "BaseBdev1", 00:10:29.070 "uuid": "a073d201-c4b9-4b35-8328-52d31ee321e9", 00:10:29.070 "is_configured": true, 00:10:29.070 "data_offset": 2048, 00:10:29.070 "data_size": 63488 00:10:29.070 }, 00:10:29.070 { 00:10:29.070 "name": "BaseBdev2", 00:10:29.070 "uuid": "7b6d5a46-ad83-405e-9a95-d93124299157", 00:10:29.070 "is_configured": true, 00:10:29.070 "data_offset": 2048, 00:10:29.070 "data_size": 63488 00:10:29.070 } 00:10:29.070 ] 00:10:29.070 }' 00:10:29.070 17:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:29.070 17:22:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:29.640 17:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:29.640 17:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:29.640 17:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:29.640 17:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:29.640 17:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:29.640 17:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:10:29.640 17:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:29.640 17:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:29.640 [2024-07-15 17:22:40.922904] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:29.902 17:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:29.902 "name": "Existed_Raid", 00:10:29.902 "aliases": [ 00:10:29.902 "2e74d68a-6291-4668-8cae-aa85a0318d1a" 00:10:29.902 ], 00:10:29.902 "product_name": "Raid Volume", 00:10:29.902 "block_size": 512, 00:10:29.902 "num_blocks": 126976, 00:10:29.902 "uuid": "2e74d68a-6291-4668-8cae-aa85a0318d1a", 00:10:29.902 "assigned_rate_limits": { 00:10:29.902 "rw_ios_per_sec": 0, 00:10:29.902 "rw_mbytes_per_sec": 0, 00:10:29.902 "r_mbytes_per_sec": 0, 00:10:29.903 "w_mbytes_per_sec": 0 00:10:29.903 }, 00:10:29.903 "claimed": false, 00:10:29.903 "zoned": false, 00:10:29.903 "supported_io_types": { 00:10:29.903 "read": true, 00:10:29.903 "write": true, 00:10:29.903 "unmap": true, 00:10:29.903 "flush": true, 00:10:29.903 "reset": true, 00:10:29.903 "nvme_admin": false, 00:10:29.903 "nvme_io": false, 00:10:29.903 "nvme_io_md": false, 00:10:29.903 "write_zeroes": true, 
00:10:29.903 "zcopy": false, 00:10:29.903 "get_zone_info": false, 00:10:29.903 "zone_management": false, 00:10:29.903 "zone_append": false, 00:10:29.903 "compare": false, 00:10:29.903 "compare_and_write": false, 00:10:29.903 "abort": false, 00:10:29.903 "seek_hole": false, 00:10:29.903 "seek_data": false, 00:10:29.903 "copy": false, 00:10:29.903 "nvme_iov_md": false 00:10:29.903 }, 00:10:29.903 "memory_domains": [ 00:10:29.903 { 00:10:29.903 "dma_device_id": "system", 00:10:29.903 "dma_device_type": 1 00:10:29.903 }, 00:10:29.903 { 00:10:29.903 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:29.903 "dma_device_type": 2 00:10:29.903 }, 00:10:29.903 { 00:10:29.903 "dma_device_id": "system", 00:10:29.903 "dma_device_type": 1 00:10:29.903 }, 00:10:29.903 { 00:10:29.903 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:29.903 "dma_device_type": 2 00:10:29.903 } 00:10:29.903 ], 00:10:29.903 "driver_specific": { 00:10:29.903 "raid": { 00:10:29.903 "uuid": "2e74d68a-6291-4668-8cae-aa85a0318d1a", 00:10:29.903 "strip_size_kb": 64, 00:10:29.903 "state": "online", 00:10:29.903 "raid_level": "raid0", 00:10:29.903 "superblock": true, 00:10:29.903 "num_base_bdevs": 2, 00:10:29.903 "num_base_bdevs_discovered": 2, 00:10:29.903 "num_base_bdevs_operational": 2, 00:10:29.903 "base_bdevs_list": [ 00:10:29.903 { 00:10:29.903 "name": "BaseBdev1", 00:10:29.903 "uuid": "a073d201-c4b9-4b35-8328-52d31ee321e9", 00:10:29.903 "is_configured": true, 00:10:29.903 "data_offset": 2048, 00:10:29.903 "data_size": 63488 00:10:29.903 }, 00:10:29.903 { 00:10:29.903 "name": "BaseBdev2", 00:10:29.903 "uuid": "7b6d5a46-ad83-405e-9a95-d93124299157", 00:10:29.903 "is_configured": true, 00:10:29.903 "data_offset": 2048, 00:10:29.903 "data_size": 63488 00:10:29.903 } 00:10:29.903 ] 00:10:29.903 } 00:10:29.903 } 00:10:29.903 }' 00:10:29.903 17:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:29.903 17:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:29.903 BaseBdev2' 00:10:29.903 17:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:29.903 17:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:29.903 17:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:29.903 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:29.903 "name": "BaseBdev1", 00:10:29.903 "aliases": [ 00:10:29.903 "a073d201-c4b9-4b35-8328-52d31ee321e9" 00:10:29.903 ], 00:10:29.903 "product_name": "Malloc disk", 00:10:29.903 "block_size": 512, 00:10:29.903 "num_blocks": 65536, 00:10:29.903 "uuid": "a073d201-c4b9-4b35-8328-52d31ee321e9", 00:10:29.903 "assigned_rate_limits": { 00:10:29.903 "rw_ios_per_sec": 0, 00:10:29.903 "rw_mbytes_per_sec": 0, 00:10:29.903 "r_mbytes_per_sec": 0, 00:10:29.903 "w_mbytes_per_sec": 0 00:10:29.903 }, 00:10:29.903 "claimed": true, 00:10:29.903 "claim_type": "exclusive_write", 00:10:29.903 "zoned": false, 00:10:29.903 "supported_io_types": { 00:10:29.903 "read": true, 00:10:29.903 "write": true, 00:10:29.903 "unmap": true, 00:10:29.903 "flush": true, 00:10:29.903 "reset": true, 00:10:29.903 "nvme_admin": false, 00:10:29.903 "nvme_io": false, 00:10:29.903 "nvme_io_md": false, 00:10:29.903 
"write_zeroes": true, 00:10:29.903 "zcopy": true, 00:10:29.903 "get_zone_info": false, 00:10:29.903 "zone_management": false, 00:10:29.903 "zone_append": false, 00:10:29.903 "compare": false, 00:10:29.903 "compare_and_write": false, 00:10:29.903 "abort": true, 00:10:29.903 "seek_hole": false, 00:10:29.903 "seek_data": false, 00:10:29.903 "copy": true, 00:10:29.903 "nvme_iov_md": false 00:10:29.903 }, 00:10:29.903 "memory_domains": [ 00:10:29.903 { 00:10:29.903 "dma_device_id": "system", 00:10:29.903 "dma_device_type": 1 00:10:29.903 }, 00:10:29.903 { 00:10:29.903 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:29.903 "dma_device_type": 2 00:10:29.903 } 00:10:29.903 ], 00:10:29.903 "driver_specific": {} 00:10:29.903 }' 00:10:29.903 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:30.201 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:30.201 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:30.201 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:30.201 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:30.201 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:30.201 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:30.201 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:30.201 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:30.201 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:30.465 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:30.465 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:30.465 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:30.465 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:30.465 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:30.465 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:30.465 "name": "BaseBdev2", 00:10:30.465 "aliases": [ 00:10:30.465 "7b6d5a46-ad83-405e-9a95-d93124299157" 00:10:30.465 ], 00:10:30.465 "product_name": "Malloc disk", 00:10:30.465 "block_size": 512, 00:10:30.465 "num_blocks": 65536, 00:10:30.465 "uuid": "7b6d5a46-ad83-405e-9a95-d93124299157", 00:10:30.465 "assigned_rate_limits": { 00:10:30.465 "rw_ios_per_sec": 0, 00:10:30.465 "rw_mbytes_per_sec": 0, 00:10:30.465 "r_mbytes_per_sec": 0, 00:10:30.465 "w_mbytes_per_sec": 0 00:10:30.465 }, 00:10:30.465 "claimed": true, 00:10:30.465 "claim_type": "exclusive_write", 00:10:30.465 "zoned": false, 00:10:30.465 "supported_io_types": { 00:10:30.465 "read": true, 00:10:30.465 "write": true, 00:10:30.465 "unmap": true, 00:10:30.465 "flush": true, 00:10:30.465 "reset": true, 00:10:30.465 "nvme_admin": false, 00:10:30.465 "nvme_io": false, 00:10:30.465 "nvme_io_md": false, 00:10:30.465 "write_zeroes": true, 00:10:30.465 "zcopy": true, 00:10:30.465 "get_zone_info": false, 00:10:30.465 "zone_management": false, 
00:10:30.465 "zone_append": false, 00:10:30.465 "compare": false, 00:10:30.465 "compare_and_write": false, 00:10:30.465 "abort": true, 00:10:30.465 "seek_hole": false, 00:10:30.465 "seek_data": false, 00:10:30.465 "copy": true, 00:10:30.465 "nvme_iov_md": false 00:10:30.465 }, 00:10:30.465 "memory_domains": [ 00:10:30.465 { 00:10:30.465 "dma_device_id": "system", 00:10:30.465 "dma_device_type": 1 00:10:30.465 }, 00:10:30.465 { 00:10:30.465 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:30.465 "dma_device_type": 2 00:10:30.465 } 00:10:30.465 ], 00:10:30.465 "driver_specific": {} 00:10:30.465 }' 00:10:30.465 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:30.465 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:30.725 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:30.725 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:30.725 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:30.725 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:30.725 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:30.725 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:30.725 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:30.725 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:30.725 17:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:30.985 17:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:30.985 17:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:30.985 [2024-07-15 17:22:42.222024] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:30.985 [2024-07-15 17:22:42.222045] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:30.986 [2024-07-15 17:22:42.222078] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:30.986 17:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:30.986 17:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:10:30.986 17:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:30.986 17:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:10:30.986 17:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:10:30.986 17:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:10:30.986 17:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:30.986 17:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:10:30.986 17:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:30.986 17:22:42 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:30.986 17:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:30.986 17:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:30.986 17:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:30.986 17:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:30.986 17:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:30.986 17:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:30.986 17:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:31.245 17:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:31.245 "name": "Existed_Raid", 00:10:31.245 "uuid": "2e74d68a-6291-4668-8cae-aa85a0318d1a", 00:10:31.245 "strip_size_kb": 64, 00:10:31.245 "state": "offline", 00:10:31.245 "raid_level": "raid0", 00:10:31.245 "superblock": true, 00:10:31.245 "num_base_bdevs": 2, 00:10:31.245 "num_base_bdevs_discovered": 1, 00:10:31.245 "num_base_bdevs_operational": 1, 00:10:31.245 "base_bdevs_list": [ 00:10:31.245 { 00:10:31.245 "name": null, 00:10:31.245 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:31.245 "is_configured": false, 00:10:31.245 "data_offset": 2048, 00:10:31.245 "data_size": 63488 00:10:31.245 }, 00:10:31.245 { 00:10:31.245 "name": "BaseBdev2", 00:10:31.245 "uuid": "7b6d5a46-ad83-405e-9a95-d93124299157", 00:10:31.246 "is_configured": true, 00:10:31.246 "data_offset": 2048, 00:10:31.246 "data_size": 63488 00:10:31.246 } 00:10:31.246 ] 00:10:31.246 }' 00:10:31.246 17:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:31.246 17:22:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:31.815 17:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:31.815 17:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:31.815 17:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:31.815 17:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:32.076 17:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:32.076 17:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:32.076 17:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:32.076 [2024-07-15 17:22:43.328826] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:32.076 [2024-07-15 17:22:43.328868] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9a6d90 name Existed_Raid, state offline 00:10:32.076 17:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:32.076 17:22:43 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:32.076 17:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:32.076 17:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:32.335 17:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:32.335 17:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:32.335 17:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:32.335 17:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2741676 00:10:32.335 17:22:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2741676 ']' 00:10:32.335 17:22:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2741676 00:10:32.335 17:22:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:10:32.335 17:22:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:32.335 17:22:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2741676 00:10:32.335 17:22:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:32.335 17:22:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:32.335 17:22:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2741676' 00:10:32.335 killing process with pid 2741676 00:10:32.335 17:22:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2741676 00:10:32.335 [2024-07-15 17:22:43.575617] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:32.335 17:22:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2741676 00:10:32.335 [2024-07-15 17:22:43.576217] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:32.595 17:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:10:32.595 00:10:32.595 real 0m8.894s 00:10:32.595 user 0m16.140s 00:10:32.595 sys 0m1.367s 00:10:32.595 17:22:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:32.595 17:22:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:32.595 ************************************ 00:10:32.595 END TEST raid_state_function_test_sb 00:10:32.595 ************************************ 00:10:32.595 17:22:43 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:32.595 17:22:43 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:10:32.595 17:22:43 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:10:32.595 17:22:43 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:32.595 17:22:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:32.595 ************************************ 00:10:32.595 START TEST raid_superblock_test 00:10:32.595 ************************************ 00:10:32.595 17:22:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 2 00:10:32.595 17:22:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:10:32.595 17:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:10:32.596 17:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:10:32.596 17:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:10:32.596 17:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:10:32.596 17:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:10:32.596 17:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:10:32.596 17:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:10:32.596 17:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:10:32.596 17:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:10:32.596 17:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:10:32.596 17:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:10:32.596 17:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:10:32.596 17:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:10:32.596 17:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:10:32.596 17:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:10:32.596 17:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2743386 00:10:32.596 17:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2743386 /var/tmp/spdk-raid.sock 00:10:32.596 17:22:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2743386 ']' 00:10:32.596 17:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:10:32.596 17:22:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:32.596 17:22:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:32.596 17:22:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:32.596 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:32.596 17:22:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:32.596 17:22:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:32.596 [2024-07-15 17:22:43.868581] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:10:32.596 [2024-07-15 17:22:43.868704] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2743386 ] 00:10:32.855 [2024-07-15 17:22:44.010027] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:32.855 [2024-07-15 17:22:44.087478] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:32.855 [2024-07-15 17:22:44.131071] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:32.855 [2024-07-15 17:22:44.131095] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:33.424 17:22:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:33.424 17:22:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:10:33.424 17:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:10:33.424 17:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:33.424 17:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:10:33.424 17:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:10:33.424 17:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:10:33.424 17:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:33.424 17:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:10:33.424 17:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:33.424 17:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:10:33.684 malloc1 00:10:33.684 17:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:33.944 [2024-07-15 17:22:45.021895] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:33.944 [2024-07-15 17:22:45.021930] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:33.944 [2024-07-15 17:22:45.021942] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xba4a20 00:10:33.944 [2024-07-15 17:22:45.021948] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:33.944 [2024-07-15 17:22:45.023247] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:33.944 [2024-07-15 17:22:45.023267] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:33.944 pt1 00:10:33.944 17:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:33.944 17:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:33.944 17:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:10:33.944 17:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:10:33.944 17:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:10:33.944 17:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:33.944 17:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:10:33.944 17:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:33.944 17:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:10:33.944 malloc2 00:10:33.945 17:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:34.204 [2024-07-15 17:22:45.376775] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:34.204 [2024-07-15 17:22:45.376803] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:34.204 [2024-07-15 17:22:45.376814] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xba5040 00:10:34.204 [2024-07-15 17:22:45.376820] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:34.204 [2024-07-15 17:22:45.377988] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:34.205 [2024-07-15 17:22:45.378006] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:34.205 pt2 00:10:34.205 17:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:34.205 17:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:34.205 17:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:10:34.464 [2024-07-15 17:22:45.553231] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:34.465 [2024-07-15 17:22:45.554206] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:34.465 [2024-07-15 17:22:45.554310] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd513d0 00:10:34.465 [2024-07-15 17:22:45.554317] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:34.465 [2024-07-15 17:22:45.554457] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd507f0 00:10:34.465 [2024-07-15 17:22:45.554561] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd513d0 00:10:34.465 [2024-07-15 17:22:45.554567] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd513d0 00:10:34.465 [2024-07-15 17:22:45.554634] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:34.465 17:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:34.465 17:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:34.465 17:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:34.465 17:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:34.465 17:22:45 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:34.465 17:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:34.465 17:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:34.465 17:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:34.465 17:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:34.465 17:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:34.465 17:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:34.465 17:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:34.465 17:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:34.465 "name": "raid_bdev1", 00:10:34.465 "uuid": "4ad57237-7e4b-459b-85c7-9c47506b0c72", 00:10:34.465 "strip_size_kb": 64, 00:10:34.465 "state": "online", 00:10:34.465 "raid_level": "raid0", 00:10:34.465 "superblock": true, 00:10:34.465 "num_base_bdevs": 2, 00:10:34.465 "num_base_bdevs_discovered": 2, 00:10:34.465 "num_base_bdevs_operational": 2, 00:10:34.465 "base_bdevs_list": [ 00:10:34.465 { 00:10:34.465 "name": "pt1", 00:10:34.465 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:34.465 "is_configured": true, 00:10:34.465 "data_offset": 2048, 00:10:34.465 "data_size": 63488 00:10:34.465 }, 00:10:34.465 { 00:10:34.465 "name": "pt2", 00:10:34.465 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:34.465 "is_configured": true, 00:10:34.465 "data_offset": 2048, 00:10:34.465 "data_size": 63488 00:10:34.465 } 00:10:34.465 ] 00:10:34.465 }' 00:10:34.465 17:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:34.465 17:22:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:35.034 17:22:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:10:35.034 17:22:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:10:35.034 17:22:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:35.034 17:22:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:35.034 17:22:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:35.034 17:22:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:35.034 17:22:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:35.034 17:22:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:35.294 [2024-07-15 17:22:46.471790] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:35.294 17:22:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:35.294 "name": "raid_bdev1", 00:10:35.294 "aliases": [ 00:10:35.294 "4ad57237-7e4b-459b-85c7-9c47506b0c72" 00:10:35.294 ], 00:10:35.294 "product_name": "Raid Volume", 00:10:35.294 "block_size": 512, 00:10:35.294 "num_blocks": 126976, 00:10:35.294 "uuid": 
"4ad57237-7e4b-459b-85c7-9c47506b0c72", 00:10:35.294 "assigned_rate_limits": { 00:10:35.294 "rw_ios_per_sec": 0, 00:10:35.294 "rw_mbytes_per_sec": 0, 00:10:35.294 "r_mbytes_per_sec": 0, 00:10:35.294 "w_mbytes_per_sec": 0 00:10:35.294 }, 00:10:35.294 "claimed": false, 00:10:35.294 "zoned": false, 00:10:35.294 "supported_io_types": { 00:10:35.294 "read": true, 00:10:35.294 "write": true, 00:10:35.294 "unmap": true, 00:10:35.294 "flush": true, 00:10:35.294 "reset": true, 00:10:35.294 "nvme_admin": false, 00:10:35.294 "nvme_io": false, 00:10:35.294 "nvme_io_md": false, 00:10:35.294 "write_zeroes": true, 00:10:35.294 "zcopy": false, 00:10:35.294 "get_zone_info": false, 00:10:35.294 "zone_management": false, 00:10:35.294 "zone_append": false, 00:10:35.294 "compare": false, 00:10:35.294 "compare_and_write": false, 00:10:35.294 "abort": false, 00:10:35.294 "seek_hole": false, 00:10:35.294 "seek_data": false, 00:10:35.294 "copy": false, 00:10:35.294 "nvme_iov_md": false 00:10:35.294 }, 00:10:35.294 "memory_domains": [ 00:10:35.294 { 00:10:35.294 "dma_device_id": "system", 00:10:35.294 "dma_device_type": 1 00:10:35.294 }, 00:10:35.294 { 00:10:35.294 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:35.294 "dma_device_type": 2 00:10:35.294 }, 00:10:35.294 { 00:10:35.294 "dma_device_id": "system", 00:10:35.294 "dma_device_type": 1 00:10:35.294 }, 00:10:35.294 { 00:10:35.294 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:35.294 "dma_device_type": 2 00:10:35.294 } 00:10:35.294 ], 00:10:35.294 "driver_specific": { 00:10:35.294 "raid": { 00:10:35.294 "uuid": "4ad57237-7e4b-459b-85c7-9c47506b0c72", 00:10:35.294 "strip_size_kb": 64, 00:10:35.294 "state": "online", 00:10:35.294 "raid_level": "raid0", 00:10:35.294 "superblock": true, 00:10:35.294 "num_base_bdevs": 2, 00:10:35.294 "num_base_bdevs_discovered": 2, 00:10:35.294 "num_base_bdevs_operational": 2, 00:10:35.294 "base_bdevs_list": [ 00:10:35.294 { 00:10:35.294 "name": "pt1", 00:10:35.294 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:35.294 "is_configured": true, 00:10:35.294 "data_offset": 2048, 00:10:35.294 "data_size": 63488 00:10:35.294 }, 00:10:35.294 { 00:10:35.294 "name": "pt2", 00:10:35.294 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:35.294 "is_configured": true, 00:10:35.294 "data_offset": 2048, 00:10:35.294 "data_size": 63488 00:10:35.294 } 00:10:35.294 ] 00:10:35.294 } 00:10:35.294 } 00:10:35.295 }' 00:10:35.295 17:22:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:35.295 17:22:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:35.295 pt2' 00:10:35.295 17:22:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:35.295 17:22:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:35.295 17:22:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:35.554 17:22:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:35.554 "name": "pt1", 00:10:35.554 "aliases": [ 00:10:35.554 "00000000-0000-0000-0000-000000000001" 00:10:35.554 ], 00:10:35.554 "product_name": "passthru", 00:10:35.554 "block_size": 512, 00:10:35.554 "num_blocks": 65536, 00:10:35.554 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:35.554 "assigned_rate_limits": { 00:10:35.554 
"rw_ios_per_sec": 0, 00:10:35.554 "rw_mbytes_per_sec": 0, 00:10:35.554 "r_mbytes_per_sec": 0, 00:10:35.554 "w_mbytes_per_sec": 0 00:10:35.554 }, 00:10:35.554 "claimed": true, 00:10:35.554 "claim_type": "exclusive_write", 00:10:35.554 "zoned": false, 00:10:35.554 "supported_io_types": { 00:10:35.554 "read": true, 00:10:35.554 "write": true, 00:10:35.554 "unmap": true, 00:10:35.554 "flush": true, 00:10:35.554 "reset": true, 00:10:35.554 "nvme_admin": false, 00:10:35.554 "nvme_io": false, 00:10:35.555 "nvme_io_md": false, 00:10:35.555 "write_zeroes": true, 00:10:35.555 "zcopy": true, 00:10:35.555 "get_zone_info": false, 00:10:35.555 "zone_management": false, 00:10:35.555 "zone_append": false, 00:10:35.555 "compare": false, 00:10:35.555 "compare_and_write": false, 00:10:35.555 "abort": true, 00:10:35.555 "seek_hole": false, 00:10:35.555 "seek_data": false, 00:10:35.555 "copy": true, 00:10:35.555 "nvme_iov_md": false 00:10:35.555 }, 00:10:35.555 "memory_domains": [ 00:10:35.555 { 00:10:35.555 "dma_device_id": "system", 00:10:35.555 "dma_device_type": 1 00:10:35.555 }, 00:10:35.555 { 00:10:35.555 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:35.555 "dma_device_type": 2 00:10:35.555 } 00:10:35.555 ], 00:10:35.555 "driver_specific": { 00:10:35.555 "passthru": { 00:10:35.555 "name": "pt1", 00:10:35.555 "base_bdev_name": "malloc1" 00:10:35.555 } 00:10:35.555 } 00:10:35.555 }' 00:10:35.555 17:22:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:35.555 17:22:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:35.555 17:22:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:35.555 17:22:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:35.555 17:22:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:35.814 17:22:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:35.814 17:22:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:35.814 17:22:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:35.814 17:22:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:35.814 17:22:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:35.814 17:22:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:35.814 17:22:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:35.814 17:22:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:35.815 17:22:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:35.815 17:22:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:36.074 17:22:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:36.074 "name": "pt2", 00:10:36.074 "aliases": [ 00:10:36.074 "00000000-0000-0000-0000-000000000002" 00:10:36.074 ], 00:10:36.074 "product_name": "passthru", 00:10:36.074 "block_size": 512, 00:10:36.074 "num_blocks": 65536, 00:10:36.074 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:36.074 "assigned_rate_limits": { 00:10:36.074 "rw_ios_per_sec": 0, 00:10:36.074 "rw_mbytes_per_sec": 0, 00:10:36.074 "r_mbytes_per_sec": 0, 00:10:36.074 "w_mbytes_per_sec": 0 
00:10:36.074 }, 00:10:36.074 "claimed": true, 00:10:36.074 "claim_type": "exclusive_write", 00:10:36.074 "zoned": false, 00:10:36.074 "supported_io_types": { 00:10:36.074 "read": true, 00:10:36.074 "write": true, 00:10:36.074 "unmap": true, 00:10:36.074 "flush": true, 00:10:36.074 "reset": true, 00:10:36.074 "nvme_admin": false, 00:10:36.074 "nvme_io": false, 00:10:36.074 "nvme_io_md": false, 00:10:36.074 "write_zeroes": true, 00:10:36.074 "zcopy": true, 00:10:36.074 "get_zone_info": false, 00:10:36.074 "zone_management": false, 00:10:36.074 "zone_append": false, 00:10:36.074 "compare": false, 00:10:36.074 "compare_and_write": false, 00:10:36.074 "abort": true, 00:10:36.074 "seek_hole": false, 00:10:36.074 "seek_data": false, 00:10:36.074 "copy": true, 00:10:36.074 "nvme_iov_md": false 00:10:36.074 }, 00:10:36.074 "memory_domains": [ 00:10:36.074 { 00:10:36.074 "dma_device_id": "system", 00:10:36.074 "dma_device_type": 1 00:10:36.074 }, 00:10:36.074 { 00:10:36.074 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:36.074 "dma_device_type": 2 00:10:36.074 } 00:10:36.074 ], 00:10:36.074 "driver_specific": { 00:10:36.075 "passthru": { 00:10:36.075 "name": "pt2", 00:10:36.075 "base_bdev_name": "malloc2" 00:10:36.075 } 00:10:36.075 } 00:10:36.075 }' 00:10:36.075 17:22:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:36.075 17:22:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:36.075 17:22:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:36.075 17:22:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:36.335 17:22:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:36.335 17:22:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:36.335 17:22:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:36.335 17:22:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:36.335 17:22:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:36.335 17:22:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:36.335 17:22:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:36.335 17:22:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:36.335 17:22:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:36.335 17:22:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:10:36.594 [2024-07-15 17:22:47.775074] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:36.594 17:22:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=4ad57237-7e4b-459b-85c7-9c47506b0c72 00:10:36.594 17:22:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 4ad57237-7e4b-459b-85c7-9c47506b0c72 ']' 00:10:36.594 17:22:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:36.854 [2024-07-15 17:22:47.967368] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:36.854 [2024-07-15 17:22:47.967384] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:10:36.854 [2024-07-15 17:22:47.967419] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:36.854 [2024-07-15 17:22:47.967450] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:36.854 [2024-07-15 17:22:47.967457] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd513d0 name raid_bdev1, state offline 00:10:36.854 17:22:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:36.854 17:22:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:10:37.113 17:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:10:37.113 17:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:10:37.113 17:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:37.113 17:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:10:37.113 17:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:37.113 17:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:10:37.373 17:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:10:37.373 17:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:10:37.632 17:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:10:37.632 17:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:10:37.632 17:22:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:10:37.632 17:22:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:10:37.632 17:22:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:37.632 17:22:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:37.632 17:22:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:37.632 17:22:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:37.632 17:22:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:37.632 17:22:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:37.632 17:22:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:37.632 17:22:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:10:37.632 17:22:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:10:37.633 [2024-07-15 17:22:48.929786] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:10:37.892 [2024-07-15 17:22:48.930853] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:10:37.892 [2024-07-15 17:22:48.930893] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:10:37.892 [2024-07-15 17:22:48.930920] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:10:37.892 [2024-07-15 17:22:48.930930] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:37.892 [2024-07-15 17:22:48.930936] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xba4070 name raid_bdev1, state configuring 00:10:37.892 request: 00:10:37.892 { 00:10:37.892 "name": "raid_bdev1", 00:10:37.892 "raid_level": "raid0", 00:10:37.892 "base_bdevs": [ 00:10:37.892 "malloc1", 00:10:37.892 "malloc2" 00:10:37.892 ], 00:10:37.892 "strip_size_kb": 64, 00:10:37.892 "superblock": false, 00:10:37.892 "method": "bdev_raid_create", 00:10:37.892 "req_id": 1 00:10:37.892 } 00:10:37.892 Got JSON-RPC error response 00:10:37.892 response: 00:10:37.892 { 00:10:37.892 "code": -17, 00:10:37.892 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:10:37.892 } 00:10:37.892 17:22:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:10:37.892 17:22:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:37.892 17:22:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:10:37.892 17:22:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:37.892 17:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:37.892 17:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:10:37.892 17:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:10:37.892 17:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:10:37.892 17:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:38.152 [2024-07-15 17:22:49.298662] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:38.152 [2024-07-15 17:22:49.298692] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:38.152 [2024-07-15 17:22:49.298704] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xba5e00 00:10:38.152 [2024-07-15 17:22:49.298716] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:38.152 [2024-07-15 17:22:49.299968] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:38.152 [2024-07-15 17:22:49.299987] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:38.152 [2024-07-15 17:22:49.300029] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:10:38.152 [2024-07-15 17:22:49.300047] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:38.152 pt1 00:10:38.152 17:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:10:38.152 17:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:38.152 17:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:38.152 17:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:38.152 17:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:38.152 17:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:38.152 17:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:38.152 17:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:38.152 17:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:38.152 17:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:38.152 17:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:38.152 17:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:38.412 17:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:38.413 "name": "raid_bdev1", 00:10:38.413 "uuid": "4ad57237-7e4b-459b-85c7-9c47506b0c72", 00:10:38.413 "strip_size_kb": 64, 00:10:38.413 "state": "configuring", 00:10:38.413 "raid_level": "raid0", 00:10:38.413 "superblock": true, 00:10:38.413 "num_base_bdevs": 2, 00:10:38.413 "num_base_bdevs_discovered": 1, 00:10:38.413 "num_base_bdevs_operational": 2, 00:10:38.413 "base_bdevs_list": [ 00:10:38.413 { 00:10:38.413 "name": "pt1", 00:10:38.413 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:38.413 "is_configured": true, 00:10:38.413 "data_offset": 2048, 00:10:38.413 "data_size": 63488 00:10:38.413 }, 00:10:38.413 { 00:10:38.413 "name": null, 00:10:38.413 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:38.413 "is_configured": false, 00:10:38.413 "data_offset": 2048, 00:10:38.413 "data_size": 63488 00:10:38.413 } 00:10:38.413 ] 00:10:38.413 }' 00:10:38.413 17:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:38.413 17:22:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:38.982 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:10:38.982 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:10:38.982 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:38.982 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:38.982 [2024-07-15 17:22:50.208981] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:38.982 [2024-07-15 17:22:50.209021] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:38.982 [2024-07-15 17:22:50.209037] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xba53e0 00:10:38.982 [2024-07-15 17:22:50.209043] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:38.982 [2024-07-15 17:22:50.209312] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:38.982 [2024-07-15 17:22:50.209322] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:38.983 [2024-07-15 17:22:50.209366] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:10:38.983 [2024-07-15 17:22:50.209379] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:38.983 [2024-07-15 17:22:50.209453] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xba3680 00:10:38.983 [2024-07-15 17:22:50.209459] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:38.983 [2024-07-15 17:22:50.209590] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd507f0 00:10:38.983 [2024-07-15 17:22:50.209685] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xba3680 00:10:38.983 [2024-07-15 17:22:50.209690] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xba3680 00:10:38.983 [2024-07-15 17:22:50.209768] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:38.983 pt2 00:10:38.983 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:10:38.983 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:38.983 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:38.983 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:38.983 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:38.983 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:38.983 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:38.983 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:38.983 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:38.983 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:38.983 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:38.983 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:38.983 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:38.983 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:39.242 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:39.243 
"name": "raid_bdev1", 00:10:39.243 "uuid": "4ad57237-7e4b-459b-85c7-9c47506b0c72", 00:10:39.243 "strip_size_kb": 64, 00:10:39.243 "state": "online", 00:10:39.243 "raid_level": "raid0", 00:10:39.243 "superblock": true, 00:10:39.243 "num_base_bdevs": 2, 00:10:39.243 "num_base_bdevs_discovered": 2, 00:10:39.243 "num_base_bdevs_operational": 2, 00:10:39.243 "base_bdevs_list": [ 00:10:39.243 { 00:10:39.243 "name": "pt1", 00:10:39.243 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:39.243 "is_configured": true, 00:10:39.243 "data_offset": 2048, 00:10:39.243 "data_size": 63488 00:10:39.243 }, 00:10:39.243 { 00:10:39.243 "name": "pt2", 00:10:39.243 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:39.243 "is_configured": true, 00:10:39.243 "data_offset": 2048, 00:10:39.243 "data_size": 63488 00:10:39.243 } 00:10:39.243 ] 00:10:39.243 }' 00:10:39.243 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:39.243 17:22:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:39.813 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:10:39.813 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:10:39.813 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:39.813 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:39.813 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:39.813 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:39.813 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:39.813 17:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:40.075 [2024-07-15 17:22:51.111448] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:40.075 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:40.075 "name": "raid_bdev1", 00:10:40.075 "aliases": [ 00:10:40.075 "4ad57237-7e4b-459b-85c7-9c47506b0c72" 00:10:40.075 ], 00:10:40.075 "product_name": "Raid Volume", 00:10:40.075 "block_size": 512, 00:10:40.075 "num_blocks": 126976, 00:10:40.075 "uuid": "4ad57237-7e4b-459b-85c7-9c47506b0c72", 00:10:40.075 "assigned_rate_limits": { 00:10:40.075 "rw_ios_per_sec": 0, 00:10:40.075 "rw_mbytes_per_sec": 0, 00:10:40.075 "r_mbytes_per_sec": 0, 00:10:40.075 "w_mbytes_per_sec": 0 00:10:40.075 }, 00:10:40.075 "claimed": false, 00:10:40.075 "zoned": false, 00:10:40.075 "supported_io_types": { 00:10:40.075 "read": true, 00:10:40.075 "write": true, 00:10:40.075 "unmap": true, 00:10:40.075 "flush": true, 00:10:40.075 "reset": true, 00:10:40.075 "nvme_admin": false, 00:10:40.075 "nvme_io": false, 00:10:40.075 "nvme_io_md": false, 00:10:40.075 "write_zeroes": true, 00:10:40.075 "zcopy": false, 00:10:40.075 "get_zone_info": false, 00:10:40.075 "zone_management": false, 00:10:40.075 "zone_append": false, 00:10:40.075 "compare": false, 00:10:40.075 "compare_and_write": false, 00:10:40.075 "abort": false, 00:10:40.075 "seek_hole": false, 00:10:40.075 "seek_data": false, 00:10:40.075 "copy": false, 00:10:40.075 "nvme_iov_md": false 00:10:40.075 }, 00:10:40.075 "memory_domains": [ 00:10:40.075 { 00:10:40.075 "dma_device_id": "system", 
00:10:40.075 "dma_device_type": 1 00:10:40.075 }, 00:10:40.075 { 00:10:40.075 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:40.075 "dma_device_type": 2 00:10:40.075 }, 00:10:40.075 { 00:10:40.075 "dma_device_id": "system", 00:10:40.075 "dma_device_type": 1 00:10:40.075 }, 00:10:40.075 { 00:10:40.075 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:40.075 "dma_device_type": 2 00:10:40.075 } 00:10:40.075 ], 00:10:40.075 "driver_specific": { 00:10:40.075 "raid": { 00:10:40.075 "uuid": "4ad57237-7e4b-459b-85c7-9c47506b0c72", 00:10:40.075 "strip_size_kb": 64, 00:10:40.075 "state": "online", 00:10:40.075 "raid_level": "raid0", 00:10:40.075 "superblock": true, 00:10:40.075 "num_base_bdevs": 2, 00:10:40.075 "num_base_bdevs_discovered": 2, 00:10:40.075 "num_base_bdevs_operational": 2, 00:10:40.075 "base_bdevs_list": [ 00:10:40.075 { 00:10:40.075 "name": "pt1", 00:10:40.075 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:40.075 "is_configured": true, 00:10:40.075 "data_offset": 2048, 00:10:40.075 "data_size": 63488 00:10:40.075 }, 00:10:40.075 { 00:10:40.075 "name": "pt2", 00:10:40.075 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:40.075 "is_configured": true, 00:10:40.075 "data_offset": 2048, 00:10:40.075 "data_size": 63488 00:10:40.075 } 00:10:40.075 ] 00:10:40.075 } 00:10:40.075 } 00:10:40.075 }' 00:10:40.075 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:40.075 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:40.075 pt2' 00:10:40.075 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:40.075 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:40.075 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:40.075 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:40.075 "name": "pt1", 00:10:40.075 "aliases": [ 00:10:40.075 "00000000-0000-0000-0000-000000000001" 00:10:40.075 ], 00:10:40.075 "product_name": "passthru", 00:10:40.075 "block_size": 512, 00:10:40.075 "num_blocks": 65536, 00:10:40.075 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:40.075 "assigned_rate_limits": { 00:10:40.075 "rw_ios_per_sec": 0, 00:10:40.075 "rw_mbytes_per_sec": 0, 00:10:40.075 "r_mbytes_per_sec": 0, 00:10:40.075 "w_mbytes_per_sec": 0 00:10:40.075 }, 00:10:40.075 "claimed": true, 00:10:40.075 "claim_type": "exclusive_write", 00:10:40.075 "zoned": false, 00:10:40.075 "supported_io_types": { 00:10:40.075 "read": true, 00:10:40.075 "write": true, 00:10:40.075 "unmap": true, 00:10:40.075 "flush": true, 00:10:40.075 "reset": true, 00:10:40.075 "nvme_admin": false, 00:10:40.075 "nvme_io": false, 00:10:40.075 "nvme_io_md": false, 00:10:40.075 "write_zeroes": true, 00:10:40.075 "zcopy": true, 00:10:40.075 "get_zone_info": false, 00:10:40.075 "zone_management": false, 00:10:40.075 "zone_append": false, 00:10:40.075 "compare": false, 00:10:40.075 "compare_and_write": false, 00:10:40.075 "abort": true, 00:10:40.075 "seek_hole": false, 00:10:40.075 "seek_data": false, 00:10:40.075 "copy": true, 00:10:40.075 "nvme_iov_md": false 00:10:40.075 }, 00:10:40.075 "memory_domains": [ 00:10:40.075 { 00:10:40.075 "dma_device_id": "system", 00:10:40.075 "dma_device_type": 1 00:10:40.075 }, 00:10:40.075 { 
00:10:40.075 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:40.075 "dma_device_type": 2 00:10:40.075 } 00:10:40.075 ], 00:10:40.075 "driver_specific": { 00:10:40.075 "passthru": { 00:10:40.075 "name": "pt1", 00:10:40.075 "base_bdev_name": "malloc1" 00:10:40.075 } 00:10:40.075 } 00:10:40.075 }' 00:10:40.336 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:40.336 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:40.336 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:40.336 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:40.336 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:40.336 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:40.336 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:40.336 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:40.595 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:40.595 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:40.595 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:40.595 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:40.595 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:40.595 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:40.595 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:40.595 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:40.595 "name": "pt2", 00:10:40.595 "aliases": [ 00:10:40.595 "00000000-0000-0000-0000-000000000002" 00:10:40.595 ], 00:10:40.595 "product_name": "passthru", 00:10:40.595 "block_size": 512, 00:10:40.595 "num_blocks": 65536, 00:10:40.595 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:40.595 "assigned_rate_limits": { 00:10:40.595 "rw_ios_per_sec": 0, 00:10:40.595 "rw_mbytes_per_sec": 0, 00:10:40.595 "r_mbytes_per_sec": 0, 00:10:40.595 "w_mbytes_per_sec": 0 00:10:40.595 }, 00:10:40.595 "claimed": true, 00:10:40.595 "claim_type": "exclusive_write", 00:10:40.595 "zoned": false, 00:10:40.595 "supported_io_types": { 00:10:40.595 "read": true, 00:10:40.595 "write": true, 00:10:40.595 "unmap": true, 00:10:40.595 "flush": true, 00:10:40.595 "reset": true, 00:10:40.595 "nvme_admin": false, 00:10:40.595 "nvme_io": false, 00:10:40.595 "nvme_io_md": false, 00:10:40.595 "write_zeroes": true, 00:10:40.595 "zcopy": true, 00:10:40.595 "get_zone_info": false, 00:10:40.595 "zone_management": false, 00:10:40.595 "zone_append": false, 00:10:40.595 "compare": false, 00:10:40.596 "compare_and_write": false, 00:10:40.596 "abort": true, 00:10:40.596 "seek_hole": false, 00:10:40.596 "seek_data": false, 00:10:40.596 "copy": true, 00:10:40.596 "nvme_iov_md": false 00:10:40.596 }, 00:10:40.596 "memory_domains": [ 00:10:40.596 { 00:10:40.596 "dma_device_id": "system", 00:10:40.596 "dma_device_type": 1 00:10:40.596 }, 00:10:40.596 { 00:10:40.596 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:40.596 "dma_device_type": 2 00:10:40.596 } 00:10:40.596 ], 00:10:40.596 
"driver_specific": { 00:10:40.596 "passthru": { 00:10:40.596 "name": "pt2", 00:10:40.596 "base_bdev_name": "malloc2" 00:10:40.596 } 00:10:40.596 } 00:10:40.596 }' 00:10:40.596 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:40.854 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:40.854 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:40.854 17:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:40.854 17:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:40.854 17:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:40.854 17:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:40.854 17:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:41.114 17:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:41.114 17:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:41.114 17:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:41.114 17:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:41.114 17:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:41.114 17:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:10:41.114 [2024-07-15 17:22:52.390686] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:41.114 17:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 4ad57237-7e4b-459b-85c7-9c47506b0c72 '!=' 4ad57237-7e4b-459b-85c7-9c47506b0c72 ']' 00:10:41.114 17:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:10:41.114 17:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:41.114 17:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:41.373 17:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2743386 00:10:41.373 17:22:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2743386 ']' 00:10:41.373 17:22:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2743386 00:10:41.373 17:22:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:10:41.373 17:22:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:41.373 17:22:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2743386 00:10:41.373 17:22:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:41.373 17:22:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:41.373 17:22:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2743386' 00:10:41.373 killing process with pid 2743386 00:10:41.373 17:22:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2743386 00:10:41.373 [2024-07-15 17:22:52.468400] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:41.373 [2024-07-15 17:22:52.468437] 
bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:41.373 [2024-07-15 17:22:52.468470] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:41.373 [2024-07-15 17:22:52.468476] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xba3680 name raid_bdev1, state offline 00:10:41.373 17:22:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2743386 00:10:41.373 [2024-07-15 17:22:52.477508] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:41.373 17:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:10:41.373 00:10:41.373 real 0m8.826s 00:10:41.373 user 0m16.081s 00:10:41.373 sys 0m1.369s 00:10:41.373 17:22:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:41.373 17:22:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:41.373 ************************************ 00:10:41.373 END TEST raid_superblock_test 00:10:41.373 ************************************ 00:10:41.373 17:22:52 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:41.373 17:22:52 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:10:41.373 17:22:52 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:41.373 17:22:52 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:41.373 17:22:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:41.633 ************************************ 00:10:41.633 START TEST raid_read_error_test 00:10:41.633 ************************************ 00:10:41.633 17:22:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 read 00:10:41.633 17:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:10:41.633 17:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:10:41.633 17:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:10:41.633 17:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:10:41.633 17:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:41.633 17:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:10:41.633 17:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:41.633 17:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:41.633 17:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:10:41.633 17:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:41.634 17:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:41.634 17:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:41.634 17:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:10:41.634 17:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:10:41.634 17:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:10:41.634 17:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:10:41.634 17:22:52 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:10:41.634 17:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:10:41.634 17:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:10:41.634 17:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:10:41.634 17:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:10:41.634 17:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:10:41.634 17:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ZXpky8PBw4 00:10:41.634 17:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2745127 00:10:41.634 17:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2745127 /var/tmp/spdk-raid.sock 00:10:41.634 17:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:10:41.634 17:22:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2745127 ']' 00:10:41.634 17:22:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:41.634 17:22:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:41.634 17:22:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:41.634 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:41.634 17:22:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:41.634 17:22:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:41.634 [2024-07-15 17:22:52.753509] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
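The bdevperf process starting up here is the I/O generator for the read-error case. Reconstructed from the RPC calls recorded in the trace that follows, the flow amounts to roughly the sketch below; the sizes, bdev names and socket path are the ones visible in this log, while the loop, backgrounding and variable names are a paraphrase of bdev_raid.sh rather than its literal code.

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
for bdev in BaseBdev1 BaseBdev2; do
    $rpc bdev_malloc_create 32 512 -b "${bdev}_malloc"          # backing malloc bdev, 512-byte blocks
    $rpc bdev_error_create "${bdev}_malloc"                     # error bdev comes up as EE_<name>_malloc
    $rpc bdev_passthru_create -b "EE_${bdev}_malloc" -p "$bdev" # passthru layer the raid consumes
done
$rpc bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s   # raid0, 64k strip, superblock
/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py \
    -s /var/tmp/spdk-raid.sock perform_tests &                  # kick the waiting (-z) bdevperf randrw job
sleep 1
$rpc bdev_error_inject_error EE_BaseBdev1_malloc read failure   # fail reads on one leg mid-run
$rpc bdev_raid_delete raid_bdev1                                # after the timed run completes

The per-second failure count that bdevperf logs for raid_bdev1 is what the test later greps out of /raidtest/tmp.ZXpky8PBw4 and requires to be non-zero.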
00:10:41.634 [2024-07-15 17:22:52.753571] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2745127 ] 00:10:41.634 [2024-07-15 17:22:52.844675] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:41.634 [2024-07-15 17:22:52.913269] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:41.893 [2024-07-15 17:22:52.965182] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:41.893 [2024-07-15 17:22:52.965210] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:42.462 17:22:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:42.462 17:22:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:10:42.462 17:22:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:42.462 17:22:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:10:42.720 BaseBdev1_malloc 00:10:42.720 17:22:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:10:42.720 true 00:10:42.720 17:22:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:10:42.979 [2024-07-15 17:22:54.132103] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:10:42.979 [2024-07-15 17:22:54.132134] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:42.979 [2024-07-15 17:22:54.132145] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x246bb50 00:10:42.979 [2024-07-15 17:22:54.132151] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:42.979 [2024-07-15 17:22:54.133428] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:42.979 [2024-07-15 17:22:54.133447] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:10:42.979 BaseBdev1 00:10:42.979 17:22:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:42.979 17:22:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:10:43.238 BaseBdev2_malloc 00:10:43.238 17:22:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:10:43.238 true 00:10:43.238 17:22:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:10:43.497 [2024-07-15 17:22:54.691135] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:10:43.498 [2024-07-15 17:22:54.691163] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:43.498 [2024-07-15 17:22:54.691173] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x244fea0 00:10:43.498 [2024-07-15 17:22:54.691179] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:43.498 [2024-07-15 17:22:54.692319] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:43.498 [2024-07-15 17:22:54.692337] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:10:43.498 BaseBdev2 00:10:43.498 17:22:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:10:43.758 [2024-07-15 17:22:54.875623] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:43.758 [2024-07-15 17:22:54.876608] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:43.758 [2024-07-15 17:22:54.876749] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22b9360 00:10:43.758 [2024-07-15 17:22:54.876758] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:43.758 [2024-07-15 17:22:54.876895] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22b68a0 00:10:43.758 [2024-07-15 17:22:54.877005] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22b9360 00:10:43.758 [2024-07-15 17:22:54.877011] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x22b9360 00:10:43.758 [2024-07-15 17:22:54.877084] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:43.758 17:22:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:43.758 17:22:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:43.758 17:22:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:43.758 17:22:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:43.758 17:22:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:43.758 17:22:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:43.758 17:22:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:43.758 17:22:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:43.758 17:22:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:43.758 17:22:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:43.758 17:22:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:43.758 17:22:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:44.021 17:22:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:44.021 "name": "raid_bdev1", 00:10:44.021 "uuid": "81af77c8-8027-4637-bdb6-001a53272df5", 00:10:44.021 "strip_size_kb": 64, 00:10:44.021 "state": "online", 00:10:44.021 "raid_level": "raid0", 
00:10:44.021 "superblock": true, 00:10:44.021 "num_base_bdevs": 2, 00:10:44.021 "num_base_bdevs_discovered": 2, 00:10:44.021 "num_base_bdevs_operational": 2, 00:10:44.021 "base_bdevs_list": [ 00:10:44.021 { 00:10:44.021 "name": "BaseBdev1", 00:10:44.021 "uuid": "3db37e06-7e65-5b1d-9379-e52ac27221aa", 00:10:44.021 "is_configured": true, 00:10:44.021 "data_offset": 2048, 00:10:44.021 "data_size": 63488 00:10:44.021 }, 00:10:44.021 { 00:10:44.021 "name": "BaseBdev2", 00:10:44.021 "uuid": "168eb2e4-943a-5af3-a65d-ba3662b3c53b", 00:10:44.021 "is_configured": true, 00:10:44.021 "data_offset": 2048, 00:10:44.021 "data_size": 63488 00:10:44.021 } 00:10:44.021 ] 00:10:44.021 }' 00:10:44.021 17:22:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:44.021 17:22:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:44.328 17:22:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:10:44.328 17:22:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:10:44.587 [2024-07-15 17:22:55.645805] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2451270 00:10:45.528 17:22:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:10:45.528 17:22:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:10:45.528 17:22:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:10:45.528 17:22:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:10:45.528 17:22:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:45.528 17:22:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:45.528 17:22:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:45.528 17:22:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:45.528 17:22:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:45.528 17:22:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:45.528 17:22:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:45.528 17:22:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:45.528 17:22:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:45.528 17:22:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:45.528 17:22:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:45.528 17:22:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:45.789 17:22:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:45.789 "name": "raid_bdev1", 00:10:45.789 "uuid": "81af77c8-8027-4637-bdb6-001a53272df5", 00:10:45.789 "strip_size_kb": 64, 00:10:45.789 "state": "online", 00:10:45.789 
"raid_level": "raid0", 00:10:45.789 "superblock": true, 00:10:45.789 "num_base_bdevs": 2, 00:10:45.789 "num_base_bdevs_discovered": 2, 00:10:45.789 "num_base_bdevs_operational": 2, 00:10:45.789 "base_bdevs_list": [ 00:10:45.789 { 00:10:45.789 "name": "BaseBdev1", 00:10:45.789 "uuid": "3db37e06-7e65-5b1d-9379-e52ac27221aa", 00:10:45.789 "is_configured": true, 00:10:45.789 "data_offset": 2048, 00:10:45.789 "data_size": 63488 00:10:45.789 }, 00:10:45.789 { 00:10:45.789 "name": "BaseBdev2", 00:10:45.789 "uuid": "168eb2e4-943a-5af3-a65d-ba3662b3c53b", 00:10:45.789 "is_configured": true, 00:10:45.789 "data_offset": 2048, 00:10:45.789 "data_size": 63488 00:10:45.789 } 00:10:45.789 ] 00:10:45.789 }' 00:10:45.789 17:22:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:45.789 17:22:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:46.359 17:22:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:46.619 [2024-07-15 17:22:57.682242] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:46.619 [2024-07-15 17:22:57.682277] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:46.619 [2024-07-15 17:22:57.684862] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:46.619 [2024-07-15 17:22:57.684887] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:46.619 [2024-07-15 17:22:57.684907] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:46.619 [2024-07-15 17:22:57.684912] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22b9360 name raid_bdev1, state offline 00:10:46.619 0 00:10:46.619 17:22:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2745127 00:10:46.619 17:22:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2745127 ']' 00:10:46.619 17:22:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2745127 00:10:46.619 17:22:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:10:46.619 17:22:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:46.619 17:22:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2745127 00:10:46.619 17:22:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:46.619 17:22:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:46.619 17:22:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2745127' 00:10:46.619 killing process with pid 2745127 00:10:46.619 17:22:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2745127 00:10:46.619 [2024-07-15 17:22:57.767212] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:46.619 17:22:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2745127 00:10:46.619 [2024-07-15 17:22:57.772768] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:46.619 17:22:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ZXpky8PBw4 00:10:46.619 17:22:57 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:10:46.619 17:22:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:10:46.619 17:22:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:10:46.619 17:22:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:10:46.619 17:22:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:46.619 17:22:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:46.619 17:22:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:10:46.619 00:10:46.619 real 0m5.233s 00:10:46.619 user 0m8.150s 00:10:46.619 sys 0m0.767s 00:10:46.619 17:22:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:46.619 17:22:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:46.619 ************************************ 00:10:46.619 END TEST raid_read_error_test 00:10:46.619 ************************************ 00:10:46.878 17:22:57 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:46.878 17:22:57 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 00:10:46.878 17:22:57 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:46.878 17:22:57 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:46.878 17:22:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:46.878 ************************************ 00:10:46.878 START TEST raid_write_error_test 00:10:46.878 ************************************ 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 write 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.DmBkDSZota 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2746129 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2746129 /var/tmp/spdk-raid.sock 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:10:46.878 17:22:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2746129 ']' 00:10:46.879 17:22:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:46.879 17:22:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:46.879 17:22:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:46.879 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:46.879 17:22:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:46.879 17:22:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:46.879 [2024-07-15 17:22:58.089280] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
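The write-error variant starting up here builds the same two-leg raid0 and then checks the assembled array through bdev_raid_get_bdevs, which is where the JSON dumps further down come from. A minimal version of that check, assuming the socket and array name used in this trace (the real verify_raid_bdev_state and verify_raid_bdev_properties helpers in bdev_raid.sh parse more fields and structure their jq calls differently), looks like:

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
[ "$(jq -r '.state' <<< "$info")" = online ]
[ "$(jq -r '.raid_level' <<< "$info")" = raid0 ]
[ "$(jq -r '.strip_size_kb' <<< "$info")" -eq 64 ]
[ "$(jq -r '.num_base_bdevs_discovered' <<< "$info")" -eq 2 ]   # both base bdev legs claimed and configured

The companion property check runs bdev_get_bdevs -b on each configured base bdev and compares .block_size, .md_size, .md_interleave and .dif_type against the raid volume, which is what the repeated jq .block_size / .md_size / .dif_type probes elsewhere in this log are doing.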
00:10:46.879 [2024-07-15 17:22:58.089409] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2746129 ] 00:10:47.138 [2024-07-15 17:22:58.233730] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:47.138 [2024-07-15 17:22:58.310044] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:47.138 [2024-07-15 17:22:58.358151] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:47.138 [2024-07-15 17:22:58.358178] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:47.708 17:22:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:47.708 17:22:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:10:47.708 17:22:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:47.708 17:22:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:10:47.968 BaseBdev1_malloc 00:10:47.968 17:22:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:10:47.968 true 00:10:48.227 17:22:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:10:48.227 [2024-07-15 17:22:59.429517] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:10:48.227 [2024-07-15 17:22:59.429547] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:48.227 [2024-07-15 17:22:59.429559] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c43b50 00:10:48.227 [2024-07-15 17:22:59.429565] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:48.227 [2024-07-15 17:22:59.430883] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:48.227 [2024-07-15 17:22:59.430903] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:10:48.227 BaseBdev1 00:10:48.227 17:22:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:48.227 17:22:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:10:48.487 BaseBdev2_malloc 00:10:48.487 17:22:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:10:48.747 true 00:10:48.747 17:22:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:10:48.747 [2024-07-15 17:22:59.984745] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:10:48.747 [2024-07-15 17:22:59.984774] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:48.747 [2024-07-15 17:22:59.984785] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c27ea0 00:10:48.747 [2024-07-15 17:22:59.984791] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:48.747 [2024-07-15 17:22:59.985984] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:48.747 [2024-07-15 17:22:59.986004] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:10:48.747 BaseBdev2 00:10:48.747 17:22:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:10:49.007 [2024-07-15 17:23:00.177259] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:49.007 [2024-07-15 17:23:00.178337] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:49.007 [2024-07-15 17:23:00.178486] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a91360 00:10:49.007 [2024-07-15 17:23:00.178494] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:49.007 [2024-07-15 17:23:00.178652] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a8e8a0 00:10:49.007 [2024-07-15 17:23:00.178781] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a91360 00:10:49.007 [2024-07-15 17:23:00.178788] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a91360 00:10:49.007 [2024-07-15 17:23:00.178870] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:49.007 17:23:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:49.007 17:23:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:49.007 17:23:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:49.007 17:23:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:49.007 17:23:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:49.007 17:23:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:49.007 17:23:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:49.007 17:23:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:49.007 17:23:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:49.007 17:23:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:49.007 17:23:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:49.007 17:23:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:49.267 17:23:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:49.267 "name": "raid_bdev1", 00:10:49.267 "uuid": "76c83af1-2af1-4af2-9767-9e010d582018", 00:10:49.267 "strip_size_kb": 64, 00:10:49.267 "state": "online", 00:10:49.267 
"raid_level": "raid0", 00:10:49.267 "superblock": true, 00:10:49.267 "num_base_bdevs": 2, 00:10:49.267 "num_base_bdevs_discovered": 2, 00:10:49.267 "num_base_bdevs_operational": 2, 00:10:49.267 "base_bdevs_list": [ 00:10:49.267 { 00:10:49.267 "name": "BaseBdev1", 00:10:49.267 "uuid": "39674132-a379-5b19-b5e1-d1d6371eb56e", 00:10:49.267 "is_configured": true, 00:10:49.267 "data_offset": 2048, 00:10:49.267 "data_size": 63488 00:10:49.267 }, 00:10:49.267 { 00:10:49.267 "name": "BaseBdev2", 00:10:49.267 "uuid": "a1359e8e-2cf6-58c8-85f3-ac28ad91d899", 00:10:49.267 "is_configured": true, 00:10:49.267 "data_offset": 2048, 00:10:49.267 "data_size": 63488 00:10:49.267 } 00:10:49.267 ] 00:10:49.267 }' 00:10:49.267 17:23:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:49.267 17:23:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:49.836 17:23:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:10:49.836 17:23:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:10:49.836 [2024-07-15 17:23:01.111827] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c29270 00:10:50.784 17:23:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:10:51.045 17:23:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:10:51.045 17:23:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:10:51.045 17:23:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:10:51.045 17:23:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:51.045 17:23:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:51.045 17:23:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:51.045 17:23:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:51.045 17:23:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:51.045 17:23:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:51.045 17:23:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:51.045 17:23:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:51.045 17:23:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:51.045 17:23:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:51.045 17:23:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:51.045 17:23:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:51.305 17:23:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:51.305 "name": "raid_bdev1", 00:10:51.305 "uuid": "76c83af1-2af1-4af2-9767-9e010d582018", 00:10:51.305 "strip_size_kb": 64, 
00:10:51.305 "state": "online", 00:10:51.305 "raid_level": "raid0", 00:10:51.305 "superblock": true, 00:10:51.305 "num_base_bdevs": 2, 00:10:51.305 "num_base_bdevs_discovered": 2, 00:10:51.305 "num_base_bdevs_operational": 2, 00:10:51.305 "base_bdevs_list": [ 00:10:51.305 { 00:10:51.305 "name": "BaseBdev1", 00:10:51.305 "uuid": "39674132-a379-5b19-b5e1-d1d6371eb56e", 00:10:51.305 "is_configured": true, 00:10:51.305 "data_offset": 2048, 00:10:51.305 "data_size": 63488 00:10:51.305 }, 00:10:51.305 { 00:10:51.305 "name": "BaseBdev2", 00:10:51.305 "uuid": "a1359e8e-2cf6-58c8-85f3-ac28ad91d899", 00:10:51.305 "is_configured": true, 00:10:51.305 "data_offset": 2048, 00:10:51.305 "data_size": 63488 00:10:51.305 } 00:10:51.305 ] 00:10:51.305 }' 00:10:51.305 17:23:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:51.305 17:23:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:51.873 17:23:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:51.873 [2024-07-15 17:23:03.127207] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:51.873 [2024-07-15 17:23:03.127235] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:51.873 [2024-07-15 17:23:03.129821] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:51.873 [2024-07-15 17:23:03.129845] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:51.873 [2024-07-15 17:23:03.129865] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:51.873 [2024-07-15 17:23:03.129871] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a91360 name raid_bdev1, state offline 00:10:51.873 0 00:10:51.873 17:23:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2746129 00:10:51.873 17:23:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2746129 ']' 00:10:51.873 17:23:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2746129 00:10:51.873 17:23:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:10:51.873 17:23:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:51.873 17:23:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2746129 00:10:52.136 17:23:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:52.136 17:23:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:52.136 17:23:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2746129' 00:10:52.136 killing process with pid 2746129 00:10:52.136 17:23:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2746129 00:10:52.136 [2024-07-15 17:23:03.212445] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:52.136 17:23:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2746129 00:10:52.136 [2024-07-15 17:23:03.217885] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:52.136 17:23:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.DmBkDSZota 
00:10:52.136 17:23:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:10:52.136 17:23:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:10:52.137 17:23:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.50 00:10:52.137 17:23:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:10:52.137 17:23:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:52.137 17:23:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:52.137 17:23:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.50 != \0\.\0\0 ]] 00:10:52.137 00:10:52.137 real 0m5.372s 00:10:52.137 user 0m8.425s 00:10:52.137 sys 0m0.793s 00:10:52.137 17:23:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:52.137 17:23:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:52.137 ************************************ 00:10:52.137 END TEST raid_write_error_test 00:10:52.137 ************************************ 00:10:52.137 17:23:03 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:52.137 17:23:03 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:10:52.137 17:23:03 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:10:52.137 17:23:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:52.137 17:23:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:52.137 17:23:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:52.137 ************************************ 00:10:52.137 START TEST raid_state_function_test 00:10:52.137 ************************************ 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 false 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2747143 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2747143' 00:10:52.137 Process raid pid: 2747143 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2747143 /var/tmp/spdk-raid.sock 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2747143 ']' 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:52.137 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:52.137 17:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:52.396 [2024-07-15 17:23:03.490689] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
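The state-function test starting here drives a bare bdev_svc app entirely over its private RPC socket. A condensed sketch of the RPC sequence the following trace walks through, using only commands that appear verbatim in it (the strip size of 64 and the 32 MB x 512 B malloc geometry are the values the test passes; ordering and error handling are simplified):

# helper: talk to the bdev_svc instance listening on the raid test socket
rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

# 1. create the raid before its base bdevs exist: it is registered but stays "configuring"
rpc bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid

# 2. inspect the raid state the same way verify_raid_bdev_state does
rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'

# 3. add the malloc base bdevs one at a time; after the second one the state flips to "online"
rpc bdev_malloc_create 32 512 -b BaseBdev1
rpc bdev_malloc_create 32 512 -b BaseBdev2

# 4. removing a base bdev from a concat raid (no redundancy) takes the raid "offline"
rpc bdev_malloc_delete BaseBdev1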
00:10:52.396 [2024-07-15 17:23:03.490757] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:52.396 [2024-07-15 17:23:03.578444] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:52.396 [2024-07-15 17:23:03.644519] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:52.396 [2024-07-15 17:23:03.683673] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:52.396 [2024-07-15 17:23:03.683691] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:53.777 17:23:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:53.777 17:23:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:10:53.777 17:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:54.049 [2024-07-15 17:23:05.176878] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:54.049 [2024-07-15 17:23:05.176908] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:54.049 [2024-07-15 17:23:05.176914] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:54.049 [2024-07-15 17:23:05.176920] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:54.049 17:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:54.049 17:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:54.049 17:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:54.049 17:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:54.049 17:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:54.049 17:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:54.049 17:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:54.049 17:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:54.049 17:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:54.049 17:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:54.049 17:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:54.049 17:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:54.312 17:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:54.312 "name": "Existed_Raid", 00:10:54.312 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:54.312 "strip_size_kb": 64, 00:10:54.312 "state": "configuring", 00:10:54.312 "raid_level": "concat", 00:10:54.312 "superblock": false, 
00:10:54.312 "num_base_bdevs": 2, 00:10:54.312 "num_base_bdevs_discovered": 0, 00:10:54.312 "num_base_bdevs_operational": 2, 00:10:54.312 "base_bdevs_list": [ 00:10:54.312 { 00:10:54.312 "name": "BaseBdev1", 00:10:54.312 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:54.312 "is_configured": false, 00:10:54.312 "data_offset": 0, 00:10:54.312 "data_size": 0 00:10:54.312 }, 00:10:54.312 { 00:10:54.312 "name": "BaseBdev2", 00:10:54.312 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:54.312 "is_configured": false, 00:10:54.312 "data_offset": 0, 00:10:54.312 "data_size": 0 00:10:54.312 } 00:10:54.312 ] 00:10:54.312 }' 00:10:54.312 17:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:54.312 17:23:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:54.879 17:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:54.879 [2024-07-15 17:23:06.103130] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:54.879 [2024-07-15 17:23:06.103150] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9176b0 name Existed_Raid, state configuring 00:10:54.879 17:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:55.139 [2024-07-15 17:23:06.299637] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:55.139 [2024-07-15 17:23:06.299654] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:55.139 [2024-07-15 17:23:06.299659] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:55.139 [2024-07-15 17:23:06.299665] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:55.139 17:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:55.398 [2024-07-15 17:23:06.490672] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:55.398 BaseBdev1 00:10:55.398 17:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:55.398 17:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:10:55.398 17:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:55.398 17:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:55.398 17:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:55.398 17:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:55.398 17:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:55.398 17:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 
00:10:55.657 [ 00:10:55.657 { 00:10:55.657 "name": "BaseBdev1", 00:10:55.657 "aliases": [ 00:10:55.657 "9fa55f4d-3589-4f38-bd6f-e0fdc62ce0a0" 00:10:55.657 ], 00:10:55.657 "product_name": "Malloc disk", 00:10:55.657 "block_size": 512, 00:10:55.657 "num_blocks": 65536, 00:10:55.657 "uuid": "9fa55f4d-3589-4f38-bd6f-e0fdc62ce0a0", 00:10:55.657 "assigned_rate_limits": { 00:10:55.657 "rw_ios_per_sec": 0, 00:10:55.657 "rw_mbytes_per_sec": 0, 00:10:55.657 "r_mbytes_per_sec": 0, 00:10:55.657 "w_mbytes_per_sec": 0 00:10:55.657 }, 00:10:55.657 "claimed": true, 00:10:55.657 "claim_type": "exclusive_write", 00:10:55.657 "zoned": false, 00:10:55.657 "supported_io_types": { 00:10:55.657 "read": true, 00:10:55.657 "write": true, 00:10:55.657 "unmap": true, 00:10:55.657 "flush": true, 00:10:55.657 "reset": true, 00:10:55.657 "nvme_admin": false, 00:10:55.657 "nvme_io": false, 00:10:55.657 "nvme_io_md": false, 00:10:55.657 "write_zeroes": true, 00:10:55.657 "zcopy": true, 00:10:55.657 "get_zone_info": false, 00:10:55.657 "zone_management": false, 00:10:55.657 "zone_append": false, 00:10:55.657 "compare": false, 00:10:55.657 "compare_and_write": false, 00:10:55.657 "abort": true, 00:10:55.657 "seek_hole": false, 00:10:55.657 "seek_data": false, 00:10:55.657 "copy": true, 00:10:55.657 "nvme_iov_md": false 00:10:55.657 }, 00:10:55.657 "memory_domains": [ 00:10:55.657 { 00:10:55.657 "dma_device_id": "system", 00:10:55.657 "dma_device_type": 1 00:10:55.657 }, 00:10:55.657 { 00:10:55.657 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:55.657 "dma_device_type": 2 00:10:55.657 } 00:10:55.657 ], 00:10:55.657 "driver_specific": {} 00:10:55.657 } 00:10:55.657 ] 00:10:55.657 17:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:55.657 17:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:55.657 17:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:55.657 17:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:55.657 17:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:55.657 17:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:55.657 17:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:55.657 17:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:55.657 17:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:55.657 17:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:55.657 17:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:55.657 17:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:55.657 17:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:55.916 17:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:55.916 "name": "Existed_Raid", 00:10:55.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:55.916 "strip_size_kb": 64, 00:10:55.916 "state": "configuring", 00:10:55.916 
"raid_level": "concat", 00:10:55.916 "superblock": false, 00:10:55.916 "num_base_bdevs": 2, 00:10:55.916 "num_base_bdevs_discovered": 1, 00:10:55.916 "num_base_bdevs_operational": 2, 00:10:55.916 "base_bdevs_list": [ 00:10:55.916 { 00:10:55.916 "name": "BaseBdev1", 00:10:55.916 "uuid": "9fa55f4d-3589-4f38-bd6f-e0fdc62ce0a0", 00:10:55.916 "is_configured": true, 00:10:55.916 "data_offset": 0, 00:10:55.916 "data_size": 65536 00:10:55.916 }, 00:10:55.916 { 00:10:55.916 "name": "BaseBdev2", 00:10:55.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:55.916 "is_configured": false, 00:10:55.916 "data_offset": 0, 00:10:55.916 "data_size": 0 00:10:55.916 } 00:10:55.916 ] 00:10:55.916 }' 00:10:55.916 17:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:55.916 17:23:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:56.485 17:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:56.745 [2024-07-15 17:23:07.805994] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:56.745 [2024-07-15 17:23:07.806020] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x916fa0 name Existed_Raid, state configuring 00:10:56.745 17:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:56.745 [2024-07-15 17:23:07.966424] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:56.745 [2024-07-15 17:23:07.967572] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:56.745 [2024-07-15 17:23:07.967594] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:56.745 17:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:56.745 17:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:56.745 17:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:56.745 17:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:56.745 17:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:56.745 17:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:56.745 17:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:56.745 17:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:56.745 17:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:56.745 17:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:56.745 17:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:56.745 17:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:56.745 17:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:56.745 17:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:57.005 17:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:57.005 "name": "Existed_Raid", 00:10:57.005 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:57.005 "strip_size_kb": 64, 00:10:57.005 "state": "configuring", 00:10:57.005 "raid_level": "concat", 00:10:57.005 "superblock": false, 00:10:57.005 "num_base_bdevs": 2, 00:10:57.005 "num_base_bdevs_discovered": 1, 00:10:57.005 "num_base_bdevs_operational": 2, 00:10:57.005 "base_bdevs_list": [ 00:10:57.005 { 00:10:57.005 "name": "BaseBdev1", 00:10:57.005 "uuid": "9fa55f4d-3589-4f38-bd6f-e0fdc62ce0a0", 00:10:57.005 "is_configured": true, 00:10:57.005 "data_offset": 0, 00:10:57.005 "data_size": 65536 00:10:57.005 }, 00:10:57.005 { 00:10:57.005 "name": "BaseBdev2", 00:10:57.005 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:57.005 "is_configured": false, 00:10:57.005 "data_offset": 0, 00:10:57.005 "data_size": 0 00:10:57.005 } 00:10:57.005 ] 00:10:57.005 }' 00:10:57.005 17:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:57.005 17:23:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:57.576 17:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:57.836 [2024-07-15 17:23:08.917632] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:57.836 [2024-07-15 17:23:08.917656] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x917d90 00:10:57.836 [2024-07-15 17:23:08.917662] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:57.836 [2024-07-15 17:23:08.917815] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xabb730 00:10:57.836 [2024-07-15 17:23:08.917906] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x917d90 00:10:57.836 [2024-07-15 17:23:08.917912] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x917d90 00:10:57.836 [2024-07-15 17:23:08.918031] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:57.836 BaseBdev2 00:10:57.836 17:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:57.836 17:23:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:10:57.836 17:23:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:57.836 17:23:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:57.836 17:23:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:57.836 17:23:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:57.836 17:23:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:57.836 17:23:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:58.096 [ 00:10:58.096 { 00:10:58.096 "name": "BaseBdev2", 00:10:58.096 "aliases": [ 00:10:58.096 "de7da525-53a2-4bc1-b8b3-cf784ad03a44" 00:10:58.096 ], 00:10:58.096 "product_name": "Malloc disk", 00:10:58.096 "block_size": 512, 00:10:58.096 "num_blocks": 65536, 00:10:58.096 "uuid": "de7da525-53a2-4bc1-b8b3-cf784ad03a44", 00:10:58.096 "assigned_rate_limits": { 00:10:58.096 "rw_ios_per_sec": 0, 00:10:58.096 "rw_mbytes_per_sec": 0, 00:10:58.096 "r_mbytes_per_sec": 0, 00:10:58.096 "w_mbytes_per_sec": 0 00:10:58.096 }, 00:10:58.096 "claimed": true, 00:10:58.096 "claim_type": "exclusive_write", 00:10:58.096 "zoned": false, 00:10:58.096 "supported_io_types": { 00:10:58.096 "read": true, 00:10:58.096 "write": true, 00:10:58.096 "unmap": true, 00:10:58.096 "flush": true, 00:10:58.096 "reset": true, 00:10:58.096 "nvme_admin": false, 00:10:58.096 "nvme_io": false, 00:10:58.096 "nvme_io_md": false, 00:10:58.096 "write_zeroes": true, 00:10:58.096 "zcopy": true, 00:10:58.096 "get_zone_info": false, 00:10:58.096 "zone_management": false, 00:10:58.096 "zone_append": false, 00:10:58.096 "compare": false, 00:10:58.096 "compare_and_write": false, 00:10:58.096 "abort": true, 00:10:58.096 "seek_hole": false, 00:10:58.096 "seek_data": false, 00:10:58.096 "copy": true, 00:10:58.096 "nvme_iov_md": false 00:10:58.096 }, 00:10:58.096 "memory_domains": [ 00:10:58.096 { 00:10:58.096 "dma_device_id": "system", 00:10:58.096 "dma_device_type": 1 00:10:58.096 }, 00:10:58.096 { 00:10:58.096 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:58.096 "dma_device_type": 2 00:10:58.096 } 00:10:58.096 ], 00:10:58.096 "driver_specific": {} 00:10:58.096 } 00:10:58.096 ] 00:10:58.096 17:23:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:58.096 17:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:58.096 17:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:58.096 17:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:10:58.096 17:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:58.096 17:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:58.096 17:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:58.096 17:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:58.096 17:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:58.096 17:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:58.096 17:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:58.096 17:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:58.096 17:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:58.096 17:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:58.096 17:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:58.355 17:23:09 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:58.355 "name": "Existed_Raid", 00:10:58.355 "uuid": "c8b6b68b-20e3-403c-86af-1194dc6c0e9c", 00:10:58.355 "strip_size_kb": 64, 00:10:58.355 "state": "online", 00:10:58.355 "raid_level": "concat", 00:10:58.355 "superblock": false, 00:10:58.355 "num_base_bdevs": 2, 00:10:58.355 "num_base_bdevs_discovered": 2, 00:10:58.355 "num_base_bdevs_operational": 2, 00:10:58.355 "base_bdevs_list": [ 00:10:58.355 { 00:10:58.355 "name": "BaseBdev1", 00:10:58.355 "uuid": "9fa55f4d-3589-4f38-bd6f-e0fdc62ce0a0", 00:10:58.355 "is_configured": true, 00:10:58.355 "data_offset": 0, 00:10:58.355 "data_size": 65536 00:10:58.355 }, 00:10:58.355 { 00:10:58.355 "name": "BaseBdev2", 00:10:58.355 "uuid": "de7da525-53a2-4bc1-b8b3-cf784ad03a44", 00:10:58.355 "is_configured": true, 00:10:58.355 "data_offset": 0, 00:10:58.355 "data_size": 65536 00:10:58.355 } 00:10:58.355 ] 00:10:58.355 }' 00:10:58.355 17:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:58.355 17:23:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:58.968 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:58.968 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:58.968 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:58.968 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:58.968 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:58.968 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:58.968 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:58.968 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:58.968 [2024-07-15 17:23:10.257266] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:59.229 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:59.229 "name": "Existed_Raid", 00:10:59.229 "aliases": [ 00:10:59.229 "c8b6b68b-20e3-403c-86af-1194dc6c0e9c" 00:10:59.229 ], 00:10:59.229 "product_name": "Raid Volume", 00:10:59.229 "block_size": 512, 00:10:59.229 "num_blocks": 131072, 00:10:59.229 "uuid": "c8b6b68b-20e3-403c-86af-1194dc6c0e9c", 00:10:59.229 "assigned_rate_limits": { 00:10:59.229 "rw_ios_per_sec": 0, 00:10:59.229 "rw_mbytes_per_sec": 0, 00:10:59.229 "r_mbytes_per_sec": 0, 00:10:59.229 "w_mbytes_per_sec": 0 00:10:59.229 }, 00:10:59.229 "claimed": false, 00:10:59.229 "zoned": false, 00:10:59.229 "supported_io_types": { 00:10:59.229 "read": true, 00:10:59.229 "write": true, 00:10:59.229 "unmap": true, 00:10:59.229 "flush": true, 00:10:59.229 "reset": true, 00:10:59.229 "nvme_admin": false, 00:10:59.229 "nvme_io": false, 00:10:59.229 "nvme_io_md": false, 00:10:59.229 "write_zeroes": true, 00:10:59.229 "zcopy": false, 00:10:59.229 "get_zone_info": false, 00:10:59.229 "zone_management": false, 00:10:59.229 "zone_append": false, 00:10:59.229 "compare": false, 00:10:59.229 "compare_and_write": false, 00:10:59.229 "abort": false, 00:10:59.229 "seek_hole": false, 00:10:59.229 "seek_data": false, 00:10:59.229 "copy": false, 00:10:59.229 
"nvme_iov_md": false 00:10:59.229 }, 00:10:59.229 "memory_domains": [ 00:10:59.229 { 00:10:59.229 "dma_device_id": "system", 00:10:59.229 "dma_device_type": 1 00:10:59.229 }, 00:10:59.229 { 00:10:59.229 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:59.229 "dma_device_type": 2 00:10:59.229 }, 00:10:59.229 { 00:10:59.229 "dma_device_id": "system", 00:10:59.229 "dma_device_type": 1 00:10:59.229 }, 00:10:59.229 { 00:10:59.229 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:59.229 "dma_device_type": 2 00:10:59.229 } 00:10:59.229 ], 00:10:59.229 "driver_specific": { 00:10:59.229 "raid": { 00:10:59.229 "uuid": "c8b6b68b-20e3-403c-86af-1194dc6c0e9c", 00:10:59.229 "strip_size_kb": 64, 00:10:59.229 "state": "online", 00:10:59.229 "raid_level": "concat", 00:10:59.229 "superblock": false, 00:10:59.229 "num_base_bdevs": 2, 00:10:59.229 "num_base_bdevs_discovered": 2, 00:10:59.229 "num_base_bdevs_operational": 2, 00:10:59.229 "base_bdevs_list": [ 00:10:59.229 { 00:10:59.229 "name": "BaseBdev1", 00:10:59.229 "uuid": "9fa55f4d-3589-4f38-bd6f-e0fdc62ce0a0", 00:10:59.229 "is_configured": true, 00:10:59.229 "data_offset": 0, 00:10:59.229 "data_size": 65536 00:10:59.229 }, 00:10:59.229 { 00:10:59.229 "name": "BaseBdev2", 00:10:59.229 "uuid": "de7da525-53a2-4bc1-b8b3-cf784ad03a44", 00:10:59.229 "is_configured": true, 00:10:59.229 "data_offset": 0, 00:10:59.229 "data_size": 65536 00:10:59.229 } 00:10:59.229 ] 00:10:59.229 } 00:10:59.229 } 00:10:59.229 }' 00:10:59.230 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:59.230 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:59.230 BaseBdev2' 00:10:59.230 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:59.230 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:59.230 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:59.230 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:59.230 "name": "BaseBdev1", 00:10:59.230 "aliases": [ 00:10:59.230 "9fa55f4d-3589-4f38-bd6f-e0fdc62ce0a0" 00:10:59.230 ], 00:10:59.230 "product_name": "Malloc disk", 00:10:59.230 "block_size": 512, 00:10:59.230 "num_blocks": 65536, 00:10:59.230 "uuid": "9fa55f4d-3589-4f38-bd6f-e0fdc62ce0a0", 00:10:59.230 "assigned_rate_limits": { 00:10:59.230 "rw_ios_per_sec": 0, 00:10:59.230 "rw_mbytes_per_sec": 0, 00:10:59.230 "r_mbytes_per_sec": 0, 00:10:59.230 "w_mbytes_per_sec": 0 00:10:59.230 }, 00:10:59.230 "claimed": true, 00:10:59.230 "claim_type": "exclusive_write", 00:10:59.230 "zoned": false, 00:10:59.230 "supported_io_types": { 00:10:59.230 "read": true, 00:10:59.230 "write": true, 00:10:59.230 "unmap": true, 00:10:59.230 "flush": true, 00:10:59.230 "reset": true, 00:10:59.230 "nvme_admin": false, 00:10:59.230 "nvme_io": false, 00:10:59.230 "nvme_io_md": false, 00:10:59.230 "write_zeroes": true, 00:10:59.230 "zcopy": true, 00:10:59.230 "get_zone_info": false, 00:10:59.230 "zone_management": false, 00:10:59.230 "zone_append": false, 00:10:59.230 "compare": false, 00:10:59.230 "compare_and_write": false, 00:10:59.230 "abort": true, 00:10:59.230 "seek_hole": false, 00:10:59.230 "seek_data": false, 00:10:59.230 "copy": true, 00:10:59.230 
"nvme_iov_md": false 00:10:59.230 }, 00:10:59.230 "memory_domains": [ 00:10:59.230 { 00:10:59.230 "dma_device_id": "system", 00:10:59.230 "dma_device_type": 1 00:10:59.230 }, 00:10:59.230 { 00:10:59.230 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:59.230 "dma_device_type": 2 00:10:59.230 } 00:10:59.230 ], 00:10:59.230 "driver_specific": {} 00:10:59.230 }' 00:10:59.230 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:59.490 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:59.490 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:59.490 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:59.490 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:59.490 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:59.490 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:59.490 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:59.490 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:59.750 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:59.750 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:59.750 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:59.750 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:59.750 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:59.750 17:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:00.010 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:00.010 "name": "BaseBdev2", 00:11:00.010 "aliases": [ 00:11:00.010 "de7da525-53a2-4bc1-b8b3-cf784ad03a44" 00:11:00.010 ], 00:11:00.010 "product_name": "Malloc disk", 00:11:00.010 "block_size": 512, 00:11:00.010 "num_blocks": 65536, 00:11:00.010 "uuid": "de7da525-53a2-4bc1-b8b3-cf784ad03a44", 00:11:00.010 "assigned_rate_limits": { 00:11:00.010 "rw_ios_per_sec": 0, 00:11:00.010 "rw_mbytes_per_sec": 0, 00:11:00.010 "r_mbytes_per_sec": 0, 00:11:00.010 "w_mbytes_per_sec": 0 00:11:00.010 }, 00:11:00.010 "claimed": true, 00:11:00.010 "claim_type": "exclusive_write", 00:11:00.010 "zoned": false, 00:11:00.010 "supported_io_types": { 00:11:00.010 "read": true, 00:11:00.010 "write": true, 00:11:00.010 "unmap": true, 00:11:00.010 "flush": true, 00:11:00.010 "reset": true, 00:11:00.010 "nvme_admin": false, 00:11:00.010 "nvme_io": false, 00:11:00.010 "nvme_io_md": false, 00:11:00.010 "write_zeroes": true, 00:11:00.010 "zcopy": true, 00:11:00.010 "get_zone_info": false, 00:11:00.010 "zone_management": false, 00:11:00.010 "zone_append": false, 00:11:00.010 "compare": false, 00:11:00.010 "compare_and_write": false, 00:11:00.010 "abort": true, 00:11:00.010 "seek_hole": false, 00:11:00.010 "seek_data": false, 00:11:00.010 "copy": true, 00:11:00.010 "nvme_iov_md": false 00:11:00.010 }, 00:11:00.010 "memory_domains": [ 00:11:00.010 { 00:11:00.010 "dma_device_id": "system", 00:11:00.010 "dma_device_type": 1 00:11:00.010 }, 
00:11:00.010 { 00:11:00.010 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:00.010 "dma_device_type": 2 00:11:00.010 } 00:11:00.010 ], 00:11:00.010 "driver_specific": {} 00:11:00.010 }' 00:11:00.010 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:00.010 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:00.010 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:00.010 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:00.010 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:00.010 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:00.010 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:00.010 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:00.270 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:00.270 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:00.270 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:00.270 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:00.270 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:00.530 [2024-07-15 17:23:11.580420] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:00.530 [2024-07-15 17:23:11.580439] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:00.530 [2024-07-15 17:23:11.580469] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:00.530 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:00.530 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:11:00.530 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:00.530 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:00.530 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:00.530 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:11:00.530 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:00.530 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:00.530 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:00.530 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:00.530 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:00.530 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:00.530 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:00.530 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:11:00.530 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:00.530 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:00.530 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:00.530 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:00.530 "name": "Existed_Raid", 00:11:00.530 "uuid": "c8b6b68b-20e3-403c-86af-1194dc6c0e9c", 00:11:00.530 "strip_size_kb": 64, 00:11:00.530 "state": "offline", 00:11:00.530 "raid_level": "concat", 00:11:00.530 "superblock": false, 00:11:00.530 "num_base_bdevs": 2, 00:11:00.530 "num_base_bdevs_discovered": 1, 00:11:00.530 "num_base_bdevs_operational": 1, 00:11:00.530 "base_bdevs_list": [ 00:11:00.530 { 00:11:00.530 "name": null, 00:11:00.530 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:00.530 "is_configured": false, 00:11:00.530 "data_offset": 0, 00:11:00.530 "data_size": 65536 00:11:00.530 }, 00:11:00.530 { 00:11:00.530 "name": "BaseBdev2", 00:11:00.530 "uuid": "de7da525-53a2-4bc1-b8b3-cf784ad03a44", 00:11:00.530 "is_configured": true, 00:11:00.530 "data_offset": 0, 00:11:00.530 "data_size": 65536 00:11:00.530 } 00:11:00.530 ] 00:11:00.530 }' 00:11:00.530 17:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:00.530 17:23:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:01.100 17:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:01.100 17:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:01.100 17:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:01.100 17:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:01.360 17:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:01.360 17:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:01.360 17:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:01.620 [2024-07-15 17:23:12.719299] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:01.620 [2024-07-15 17:23:12.719337] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x917d90 name Existed_Raid, state offline 00:11:01.620 17:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:01.620 17:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:01.620 17:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:01.620 17:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:01.880 17:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:01.880 17:23:12 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:01.880 17:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:01.880 17:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2747143 00:11:01.880 17:23:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2747143 ']' 00:11:01.880 17:23:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2747143 00:11:01.880 17:23:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:11:01.880 17:23:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:01.880 17:23:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2747143 00:11:01.880 17:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:01.880 17:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:01.880 17:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2747143' 00:11:01.880 killing process with pid 2747143 00:11:01.880 17:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2747143 00:11:01.880 [2024-07-15 17:23:13.003810] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:01.880 17:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2747143 00:11:01.880 [2024-07-15 17:23:13.004407] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:01.880 17:23:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:11:01.880 00:11:01.880 real 0m9.702s 00:11:01.880 user 0m17.704s 00:11:01.880 sys 0m1.417s 00:11:01.880 17:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:01.880 17:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:01.880 ************************************ 00:11:01.880 END TEST raid_state_function_test 00:11:01.880 ************************************ 00:11:01.880 17:23:13 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:01.880 17:23:13 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:11:01.880 17:23:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:01.880 17:23:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:01.880 17:23:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:02.139 ************************************ 00:11:02.139 START TEST raid_state_function_test_sb 00:11:02.139 ************************************ 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 true 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:02.139 17:23:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2749148 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2749148' 00:11:02.139 Process raid pid: 2749148 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2749148 /var/tmp/spdk-raid.sock 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2749148 ']' 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:02.139 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
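The _sb variant launched here repeats the same flow with superblock=true; the only functional change visible in the trace is the extra -s passed to bdev_raid_create, which (judging by the later dumps) reserves space on each base bdev for on-disk raid metadata: data_offset moves from 0 to 2048 blocks and data_size shrinks from 65536 to 63488 blocks. A one-line sketch of that create call as it appears below:

# the second -s asks bdev_raid_create to write a superblock to every base bdev
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
    bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid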
00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:02.139 17:23:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:02.139 [2024-07-15 17:23:13.256552] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:11:02.139 [2024-07-15 17:23:13.256602] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:02.139 [2024-07-15 17:23:13.346455] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:02.139 [2024-07-15 17:23:13.414098] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:02.398 [2024-07-15 17:23:13.463202] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:02.398 [2024-07-15 17:23:13.463229] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:02.968 17:23:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:02.968 17:23:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:11:02.968 17:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:03.228 [2024-07-15 17:23:14.271075] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:03.228 [2024-07-15 17:23:14.271103] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:03.228 [2024-07-15 17:23:14.271109] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:03.228 [2024-07-15 17:23:14.271115] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:03.228 17:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:03.228 17:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:03.228 17:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:03.228 17:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:03.228 17:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:03.228 17:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:03.228 17:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:03.228 17:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:03.228 17:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:03.228 17:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:03.228 17:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:03.228 17:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:11:03.228 17:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:03.228 "name": "Existed_Raid", 00:11:03.228 "uuid": "0f2ea027-5c37-4bb0-a457-b3da55ed3f8f", 00:11:03.228 "strip_size_kb": 64, 00:11:03.228 "state": "configuring", 00:11:03.228 "raid_level": "concat", 00:11:03.228 "superblock": true, 00:11:03.228 "num_base_bdevs": 2, 00:11:03.228 "num_base_bdevs_discovered": 0, 00:11:03.228 "num_base_bdevs_operational": 2, 00:11:03.228 "base_bdevs_list": [ 00:11:03.228 { 00:11:03.228 "name": "BaseBdev1", 00:11:03.228 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:03.228 "is_configured": false, 00:11:03.228 "data_offset": 0, 00:11:03.228 "data_size": 0 00:11:03.228 }, 00:11:03.228 { 00:11:03.228 "name": "BaseBdev2", 00:11:03.228 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:03.228 "is_configured": false, 00:11:03.228 "data_offset": 0, 00:11:03.228 "data_size": 0 00:11:03.228 } 00:11:03.228 ] 00:11:03.228 }' 00:11:03.228 17:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:03.228 17:23:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:03.797 17:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:04.056 [2024-07-15 17:23:15.237404] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:04.056 [2024-07-15 17:23:15.237422] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbfe6b0 name Existed_Raid, state configuring 00:11:04.056 17:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:04.315 [2024-07-15 17:23:15.429917] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:04.315 [2024-07-15 17:23:15.429938] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:04.315 [2024-07-15 17:23:15.429943] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:04.315 [2024-07-15 17:23:15.429948] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:04.315 17:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:04.574 [2024-07-15 17:23:15.625070] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:04.574 BaseBdev1 00:11:04.574 17:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:04.574 17:23:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:04.574 17:23:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:04.574 17:23:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:04.574 17:23:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:04.574 17:23:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:04.574 
17:23:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:04.574 17:23:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:04.833 [ 00:11:04.833 { 00:11:04.833 "name": "BaseBdev1", 00:11:04.833 "aliases": [ 00:11:04.833 "dfd44b5f-4d4f-4af0-925e-5fa6466280e2" 00:11:04.833 ], 00:11:04.833 "product_name": "Malloc disk", 00:11:04.833 "block_size": 512, 00:11:04.833 "num_blocks": 65536, 00:11:04.833 "uuid": "dfd44b5f-4d4f-4af0-925e-5fa6466280e2", 00:11:04.833 "assigned_rate_limits": { 00:11:04.833 "rw_ios_per_sec": 0, 00:11:04.833 "rw_mbytes_per_sec": 0, 00:11:04.833 "r_mbytes_per_sec": 0, 00:11:04.833 "w_mbytes_per_sec": 0 00:11:04.833 }, 00:11:04.833 "claimed": true, 00:11:04.833 "claim_type": "exclusive_write", 00:11:04.833 "zoned": false, 00:11:04.833 "supported_io_types": { 00:11:04.833 "read": true, 00:11:04.833 "write": true, 00:11:04.833 "unmap": true, 00:11:04.833 "flush": true, 00:11:04.833 "reset": true, 00:11:04.833 "nvme_admin": false, 00:11:04.833 "nvme_io": false, 00:11:04.833 "nvme_io_md": false, 00:11:04.833 "write_zeroes": true, 00:11:04.833 "zcopy": true, 00:11:04.833 "get_zone_info": false, 00:11:04.833 "zone_management": false, 00:11:04.833 "zone_append": false, 00:11:04.833 "compare": false, 00:11:04.833 "compare_and_write": false, 00:11:04.833 "abort": true, 00:11:04.833 "seek_hole": false, 00:11:04.833 "seek_data": false, 00:11:04.833 "copy": true, 00:11:04.833 "nvme_iov_md": false 00:11:04.833 }, 00:11:04.833 "memory_domains": [ 00:11:04.833 { 00:11:04.833 "dma_device_id": "system", 00:11:04.833 "dma_device_type": 1 00:11:04.833 }, 00:11:04.833 { 00:11:04.833 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:04.833 "dma_device_type": 2 00:11:04.833 } 00:11:04.833 ], 00:11:04.833 "driver_specific": {} 00:11:04.833 } 00:11:04.833 ] 00:11:04.833 17:23:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:04.833 17:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:04.833 17:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:04.833 17:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:04.833 17:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:04.833 17:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:04.833 17:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:04.833 17:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:04.833 17:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:04.833 17:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:04.833 17:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:04.833 17:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:04.833 17:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:05.093 17:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:05.093 "name": "Existed_Raid", 00:11:05.093 "uuid": "8a5bb999-25cc-498d-9956-db9a58c748f1", 00:11:05.093 "strip_size_kb": 64, 00:11:05.093 "state": "configuring", 00:11:05.093 "raid_level": "concat", 00:11:05.093 "superblock": true, 00:11:05.093 "num_base_bdevs": 2, 00:11:05.093 "num_base_bdevs_discovered": 1, 00:11:05.093 "num_base_bdevs_operational": 2, 00:11:05.093 "base_bdevs_list": [ 00:11:05.093 { 00:11:05.093 "name": "BaseBdev1", 00:11:05.093 "uuid": "dfd44b5f-4d4f-4af0-925e-5fa6466280e2", 00:11:05.093 "is_configured": true, 00:11:05.093 "data_offset": 2048, 00:11:05.093 "data_size": 63488 00:11:05.093 }, 00:11:05.093 { 00:11:05.093 "name": "BaseBdev2", 00:11:05.093 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:05.093 "is_configured": false, 00:11:05.093 "data_offset": 0, 00:11:05.093 "data_size": 0 00:11:05.093 } 00:11:05.093 ] 00:11:05.093 }' 00:11:05.093 17:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:05.093 17:23:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:05.662 17:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:05.662 [2024-07-15 17:23:16.948415] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:05.662 [2024-07-15 17:23:16.948439] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbfdfa0 name Existed_Raid, state configuring 00:11:05.921 17:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:05.921 [2024-07-15 17:23:17.136924] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:05.921 [2024-07-15 17:23:17.138053] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:05.921 [2024-07-15 17:23:17.138077] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:05.921 17:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:05.921 17:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:05.921 17:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:05.921 17:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:05.921 17:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:05.921 17:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:05.921 17:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:05.921 17:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:05.921 17:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # 
local raid_bdev_info 00:11:05.921 17:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:05.921 17:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:05.921 17:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:05.921 17:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:05.921 17:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:06.181 17:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:06.181 "name": "Existed_Raid", 00:11:06.181 "uuid": "a2d966cb-f596-4f15-9719-60fd51637df6", 00:11:06.181 "strip_size_kb": 64, 00:11:06.181 "state": "configuring", 00:11:06.181 "raid_level": "concat", 00:11:06.181 "superblock": true, 00:11:06.181 "num_base_bdevs": 2, 00:11:06.181 "num_base_bdevs_discovered": 1, 00:11:06.181 "num_base_bdevs_operational": 2, 00:11:06.181 "base_bdevs_list": [ 00:11:06.181 { 00:11:06.181 "name": "BaseBdev1", 00:11:06.181 "uuid": "dfd44b5f-4d4f-4af0-925e-5fa6466280e2", 00:11:06.181 "is_configured": true, 00:11:06.181 "data_offset": 2048, 00:11:06.181 "data_size": 63488 00:11:06.181 }, 00:11:06.181 { 00:11:06.181 "name": "BaseBdev2", 00:11:06.181 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:06.181 "is_configured": false, 00:11:06.181 "data_offset": 0, 00:11:06.181 "data_size": 0 00:11:06.181 } 00:11:06.181 ] 00:11:06.181 }' 00:11:06.181 17:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:06.181 17:23:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:06.750 17:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:07.010 [2024-07-15 17:23:18.052185] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:07.010 [2024-07-15 17:23:18.052296] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xbfed90 00:11:07.010 [2024-07-15 17:23:18.052305] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:07.010 [2024-07-15 17:23:18.052443] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdb2780 00:11:07.010 [2024-07-15 17:23:18.052533] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbfed90 00:11:07.010 [2024-07-15 17:23:18.052538] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xbfed90 00:11:07.010 [2024-07-15 17:23:18.052604] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:07.010 BaseBdev2 00:11:07.010 17:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:07.010 17:23:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:07.010 17:23:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:07.010 17:23:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:07.010 17:23:18 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:07.010 17:23:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:07.010 17:23:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:07.010 17:23:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:07.269 [ 00:11:07.269 { 00:11:07.269 "name": "BaseBdev2", 00:11:07.269 "aliases": [ 00:11:07.269 "f3ed3a47-d5d8-4342-a0ab-f59a263ea54a" 00:11:07.269 ], 00:11:07.269 "product_name": "Malloc disk", 00:11:07.269 "block_size": 512, 00:11:07.269 "num_blocks": 65536, 00:11:07.269 "uuid": "f3ed3a47-d5d8-4342-a0ab-f59a263ea54a", 00:11:07.269 "assigned_rate_limits": { 00:11:07.269 "rw_ios_per_sec": 0, 00:11:07.269 "rw_mbytes_per_sec": 0, 00:11:07.269 "r_mbytes_per_sec": 0, 00:11:07.269 "w_mbytes_per_sec": 0 00:11:07.269 }, 00:11:07.269 "claimed": true, 00:11:07.269 "claim_type": "exclusive_write", 00:11:07.269 "zoned": false, 00:11:07.269 "supported_io_types": { 00:11:07.269 "read": true, 00:11:07.269 "write": true, 00:11:07.269 "unmap": true, 00:11:07.269 "flush": true, 00:11:07.269 "reset": true, 00:11:07.269 "nvme_admin": false, 00:11:07.269 "nvme_io": false, 00:11:07.269 "nvme_io_md": false, 00:11:07.269 "write_zeroes": true, 00:11:07.269 "zcopy": true, 00:11:07.269 "get_zone_info": false, 00:11:07.269 "zone_management": false, 00:11:07.269 "zone_append": false, 00:11:07.269 "compare": false, 00:11:07.269 "compare_and_write": false, 00:11:07.269 "abort": true, 00:11:07.269 "seek_hole": false, 00:11:07.269 "seek_data": false, 00:11:07.269 "copy": true, 00:11:07.269 "nvme_iov_md": false 00:11:07.269 }, 00:11:07.269 "memory_domains": [ 00:11:07.269 { 00:11:07.269 "dma_device_id": "system", 00:11:07.269 "dma_device_type": 1 00:11:07.269 }, 00:11:07.269 { 00:11:07.269 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:07.269 "dma_device_type": 2 00:11:07.269 } 00:11:07.269 ], 00:11:07.269 "driver_specific": {} 00:11:07.269 } 00:11:07.269 ] 00:11:07.269 17:23:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:07.269 17:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:07.270 17:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:07.270 17:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:11:07.270 17:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:07.270 17:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:07.270 17:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:07.270 17:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:07.270 17:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:07.270 17:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:07.270 17:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:07.270 
17:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:07.270 17:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:07.270 17:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:07.270 17:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:07.528 17:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:07.528 "name": "Existed_Raid", 00:11:07.528 "uuid": "a2d966cb-f596-4f15-9719-60fd51637df6", 00:11:07.528 "strip_size_kb": 64, 00:11:07.528 "state": "online", 00:11:07.528 "raid_level": "concat", 00:11:07.528 "superblock": true, 00:11:07.528 "num_base_bdevs": 2, 00:11:07.528 "num_base_bdevs_discovered": 2, 00:11:07.528 "num_base_bdevs_operational": 2, 00:11:07.528 "base_bdevs_list": [ 00:11:07.528 { 00:11:07.528 "name": "BaseBdev1", 00:11:07.528 "uuid": "dfd44b5f-4d4f-4af0-925e-5fa6466280e2", 00:11:07.528 "is_configured": true, 00:11:07.528 "data_offset": 2048, 00:11:07.528 "data_size": 63488 00:11:07.529 }, 00:11:07.529 { 00:11:07.529 "name": "BaseBdev2", 00:11:07.529 "uuid": "f3ed3a47-d5d8-4342-a0ab-f59a263ea54a", 00:11:07.529 "is_configured": true, 00:11:07.529 "data_offset": 2048, 00:11:07.529 "data_size": 63488 00:11:07.529 } 00:11:07.529 ] 00:11:07.529 }' 00:11:07.529 17:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:07.529 17:23:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:08.097 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:08.097 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:08.097 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:08.097 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:08.097 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:08.097 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:08.097 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:08.097 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:08.097 [2024-07-15 17:23:19.355696] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:08.097 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:08.097 "name": "Existed_Raid", 00:11:08.097 "aliases": [ 00:11:08.097 "a2d966cb-f596-4f15-9719-60fd51637df6" 00:11:08.097 ], 00:11:08.097 "product_name": "Raid Volume", 00:11:08.097 "block_size": 512, 00:11:08.097 "num_blocks": 126976, 00:11:08.097 "uuid": "a2d966cb-f596-4f15-9719-60fd51637df6", 00:11:08.097 "assigned_rate_limits": { 00:11:08.097 "rw_ios_per_sec": 0, 00:11:08.097 "rw_mbytes_per_sec": 0, 00:11:08.097 "r_mbytes_per_sec": 0, 00:11:08.097 "w_mbytes_per_sec": 0 00:11:08.097 }, 00:11:08.097 "claimed": false, 00:11:08.097 "zoned": false, 00:11:08.097 
"supported_io_types": { 00:11:08.097 "read": true, 00:11:08.097 "write": true, 00:11:08.097 "unmap": true, 00:11:08.097 "flush": true, 00:11:08.097 "reset": true, 00:11:08.097 "nvme_admin": false, 00:11:08.097 "nvme_io": false, 00:11:08.097 "nvme_io_md": false, 00:11:08.097 "write_zeroes": true, 00:11:08.097 "zcopy": false, 00:11:08.097 "get_zone_info": false, 00:11:08.097 "zone_management": false, 00:11:08.097 "zone_append": false, 00:11:08.097 "compare": false, 00:11:08.097 "compare_and_write": false, 00:11:08.097 "abort": false, 00:11:08.097 "seek_hole": false, 00:11:08.097 "seek_data": false, 00:11:08.097 "copy": false, 00:11:08.097 "nvme_iov_md": false 00:11:08.097 }, 00:11:08.097 "memory_domains": [ 00:11:08.097 { 00:11:08.097 "dma_device_id": "system", 00:11:08.097 "dma_device_type": 1 00:11:08.097 }, 00:11:08.097 { 00:11:08.097 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:08.097 "dma_device_type": 2 00:11:08.097 }, 00:11:08.097 { 00:11:08.097 "dma_device_id": "system", 00:11:08.097 "dma_device_type": 1 00:11:08.097 }, 00:11:08.097 { 00:11:08.097 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:08.097 "dma_device_type": 2 00:11:08.097 } 00:11:08.097 ], 00:11:08.097 "driver_specific": { 00:11:08.097 "raid": { 00:11:08.097 "uuid": "a2d966cb-f596-4f15-9719-60fd51637df6", 00:11:08.097 "strip_size_kb": 64, 00:11:08.097 "state": "online", 00:11:08.097 "raid_level": "concat", 00:11:08.097 "superblock": true, 00:11:08.097 "num_base_bdevs": 2, 00:11:08.097 "num_base_bdevs_discovered": 2, 00:11:08.097 "num_base_bdevs_operational": 2, 00:11:08.097 "base_bdevs_list": [ 00:11:08.097 { 00:11:08.097 "name": "BaseBdev1", 00:11:08.097 "uuid": "dfd44b5f-4d4f-4af0-925e-5fa6466280e2", 00:11:08.097 "is_configured": true, 00:11:08.097 "data_offset": 2048, 00:11:08.097 "data_size": 63488 00:11:08.097 }, 00:11:08.097 { 00:11:08.097 "name": "BaseBdev2", 00:11:08.097 "uuid": "f3ed3a47-d5d8-4342-a0ab-f59a263ea54a", 00:11:08.097 "is_configured": true, 00:11:08.097 "data_offset": 2048, 00:11:08.097 "data_size": 63488 00:11:08.097 } 00:11:08.097 ] 00:11:08.097 } 00:11:08.097 } 00:11:08.097 }' 00:11:08.097 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:08.356 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:08.356 BaseBdev2' 00:11:08.356 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:08.356 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:08.356 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:08.356 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:08.356 "name": "BaseBdev1", 00:11:08.356 "aliases": [ 00:11:08.356 "dfd44b5f-4d4f-4af0-925e-5fa6466280e2" 00:11:08.356 ], 00:11:08.356 "product_name": "Malloc disk", 00:11:08.356 "block_size": 512, 00:11:08.356 "num_blocks": 65536, 00:11:08.356 "uuid": "dfd44b5f-4d4f-4af0-925e-5fa6466280e2", 00:11:08.356 "assigned_rate_limits": { 00:11:08.356 "rw_ios_per_sec": 0, 00:11:08.356 "rw_mbytes_per_sec": 0, 00:11:08.356 "r_mbytes_per_sec": 0, 00:11:08.356 "w_mbytes_per_sec": 0 00:11:08.356 }, 00:11:08.356 "claimed": true, 00:11:08.356 "claim_type": "exclusive_write", 00:11:08.356 "zoned": 
false, 00:11:08.356 "supported_io_types": { 00:11:08.356 "read": true, 00:11:08.356 "write": true, 00:11:08.356 "unmap": true, 00:11:08.356 "flush": true, 00:11:08.356 "reset": true, 00:11:08.356 "nvme_admin": false, 00:11:08.356 "nvme_io": false, 00:11:08.356 "nvme_io_md": false, 00:11:08.356 "write_zeroes": true, 00:11:08.356 "zcopy": true, 00:11:08.356 "get_zone_info": false, 00:11:08.356 "zone_management": false, 00:11:08.356 "zone_append": false, 00:11:08.356 "compare": false, 00:11:08.356 "compare_and_write": false, 00:11:08.356 "abort": true, 00:11:08.356 "seek_hole": false, 00:11:08.356 "seek_data": false, 00:11:08.356 "copy": true, 00:11:08.356 "nvme_iov_md": false 00:11:08.356 }, 00:11:08.356 "memory_domains": [ 00:11:08.356 { 00:11:08.356 "dma_device_id": "system", 00:11:08.356 "dma_device_type": 1 00:11:08.356 }, 00:11:08.356 { 00:11:08.356 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:08.356 "dma_device_type": 2 00:11:08.356 } 00:11:08.356 ], 00:11:08.356 "driver_specific": {} 00:11:08.356 }' 00:11:08.356 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:08.356 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:08.617 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:08.617 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:08.617 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:08.617 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:08.617 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:08.617 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:08.617 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:08.617 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:08.876 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:08.876 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:08.876 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:08.876 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:08.876 17:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:08.876 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:08.876 "name": "BaseBdev2", 00:11:08.876 "aliases": [ 00:11:08.876 "f3ed3a47-d5d8-4342-a0ab-f59a263ea54a" 00:11:08.876 ], 00:11:08.876 "product_name": "Malloc disk", 00:11:08.876 "block_size": 512, 00:11:08.876 "num_blocks": 65536, 00:11:08.876 "uuid": "f3ed3a47-d5d8-4342-a0ab-f59a263ea54a", 00:11:08.876 "assigned_rate_limits": { 00:11:08.876 "rw_ios_per_sec": 0, 00:11:08.876 "rw_mbytes_per_sec": 0, 00:11:08.876 "r_mbytes_per_sec": 0, 00:11:08.876 "w_mbytes_per_sec": 0 00:11:08.876 }, 00:11:08.876 "claimed": true, 00:11:08.876 "claim_type": "exclusive_write", 00:11:08.876 "zoned": false, 00:11:08.876 "supported_io_types": { 00:11:08.876 "read": true, 00:11:08.876 "write": true, 00:11:08.876 "unmap": true, 
00:11:08.876 "flush": true, 00:11:08.876 "reset": true, 00:11:08.876 "nvme_admin": false, 00:11:08.876 "nvme_io": false, 00:11:08.876 "nvme_io_md": false, 00:11:08.876 "write_zeroes": true, 00:11:08.876 "zcopy": true, 00:11:08.876 "get_zone_info": false, 00:11:08.876 "zone_management": false, 00:11:08.876 "zone_append": false, 00:11:08.876 "compare": false, 00:11:08.876 "compare_and_write": false, 00:11:08.876 "abort": true, 00:11:08.876 "seek_hole": false, 00:11:08.876 "seek_data": false, 00:11:08.876 "copy": true, 00:11:08.876 "nvme_iov_md": false 00:11:08.876 }, 00:11:08.876 "memory_domains": [ 00:11:08.876 { 00:11:08.876 "dma_device_id": "system", 00:11:08.876 "dma_device_type": 1 00:11:08.876 }, 00:11:08.876 { 00:11:08.876 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:08.876 "dma_device_type": 2 00:11:08.876 } 00:11:08.876 ], 00:11:08.876 "driver_specific": {} 00:11:08.876 }' 00:11:08.876 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:09.136 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:09.136 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:09.136 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:09.136 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:09.136 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:09.136 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:09.136 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:09.136 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:09.136 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:09.395 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:09.395 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:09.395 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:09.395 [2024-07-15 17:23:20.690963] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:09.395 [2024-07-15 17:23:20.690980] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:09.395 [2024-07-15 17:23:20.691009] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:09.655 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:09.655 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:11:09.655 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:09.655 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:11:09.655 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:09.655 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:11:09.655 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:09.655 
17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:09.655 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:09.655 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:09.655 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:09.655 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:09.655 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:09.655 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:09.655 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:09.655 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:09.655 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:09.655 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:09.655 "name": "Existed_Raid", 00:11:09.655 "uuid": "a2d966cb-f596-4f15-9719-60fd51637df6", 00:11:09.655 "strip_size_kb": 64, 00:11:09.655 "state": "offline", 00:11:09.655 "raid_level": "concat", 00:11:09.655 "superblock": true, 00:11:09.655 "num_base_bdevs": 2, 00:11:09.655 "num_base_bdevs_discovered": 1, 00:11:09.655 "num_base_bdevs_operational": 1, 00:11:09.655 "base_bdevs_list": [ 00:11:09.655 { 00:11:09.655 "name": null, 00:11:09.655 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:09.655 "is_configured": false, 00:11:09.655 "data_offset": 2048, 00:11:09.655 "data_size": 63488 00:11:09.655 }, 00:11:09.655 { 00:11:09.655 "name": "BaseBdev2", 00:11:09.655 "uuid": "f3ed3a47-d5d8-4342-a0ab-f59a263ea54a", 00:11:09.655 "is_configured": true, 00:11:09.655 "data_offset": 2048, 00:11:09.655 "data_size": 63488 00:11:09.655 } 00:11:09.655 ] 00:11:09.655 }' 00:11:09.655 17:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:09.655 17:23:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:10.222 17:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:10.222 17:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:10.222 17:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:10.222 17:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:10.481 17:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:10.481 17:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:10.481 17:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:10.739 [2024-07-15 17:23:21.789759] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:10.739 [2024-07-15 
17:23:21.789793] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbfed90 name Existed_Raid, state offline 00:11:10.739 17:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:10.739 17:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:10.739 17:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:10.739 17:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:10.739 17:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:10.739 17:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:10.739 17:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:10.739 17:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2749148 00:11:10.739 17:23:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2749148 ']' 00:11:10.739 17:23:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2749148 00:11:10.739 17:23:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:11:10.739 17:23:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:10.739 17:23:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2749148 00:11:10.999 17:23:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:10.999 17:23:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:10.999 17:23:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2749148' 00:11:10.999 killing process with pid 2749148 00:11:10.999 17:23:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2749148 00:11:10.999 [2024-07-15 17:23:22.056612] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:10.999 17:23:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2749148 00:11:10.999 [2024-07-15 17:23:22.057210] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:10.999 17:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:11:10.999 00:11:10.999 real 0m8.979s 00:11:10.999 user 0m16.326s 00:11:10.999 sys 0m1.379s 00:11:10.999 17:23:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:10.999 17:23:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:10.999 ************************************ 00:11:10.999 END TEST raid_state_function_test_sb 00:11:10.999 ************************************ 00:11:10.999 17:23:22 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:10.999 17:23:22 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:11:10.999 17:23:22 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:11:10.999 17:23:22 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:10.999 17:23:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 
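For reference, the state-function sequence traced above can be replayed by hand against a running SPDK target. The sketch below is only an illustrative recap of the RPC calls visible in the trace (socket path, strip size, and bdev names are taken from the log, and it assumes an SPDK application is already listening on that socket); it is not additional test output.

RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
SOCK=/var/tmp/spdk-raid.sock
# two 32 MB malloc base bdevs with 512-byte blocks (65536 blocks each, as in the dumps above)
$RPC -s $SOCK bdev_malloc_create 32 512 -b BaseBdev1
$RPC -s $SOCK bdev_malloc_create 32 512 -b BaseBdev2
# assemble a concat raid with a superblock (-s) and a 64 KB strip size
$RPC -s $SOCK bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
# query the raid state: "configuring" until both bases are claimed, then "online"
$RPC -s $SOCK bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'
# deleting a base bdev drives the non-redundant concat raid to "offline"
$RPC -s $SOCK bdev_malloc_delete BaseBdev1
# clean up
$RPC -s $SOCK bdev_raid_delete Existed_Raid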
00:11:10.999 ************************************ 00:11:10.999 START TEST raid_superblock_test 00:11:10.999 ************************************ 00:11:10.999 17:23:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 2 00:11:10.999 17:23:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:11:10.999 17:23:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:11:10.999 17:23:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:11:10.999 17:23:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:11:10.999 17:23:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:11:10.999 17:23:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:11:10.999 17:23:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:11:10.999 17:23:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:11:10.999 17:23:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:11:10.999 17:23:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:11:10.999 17:23:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:11:10.999 17:23:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:11:10.999 17:23:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:11:10.999 17:23:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:11:10.999 17:23:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:11:10.999 17:23:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:11:10.999 17:23:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2750900 00:11:10.999 17:23:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2750900 /var/tmp/spdk-raid.sock 00:11:10.999 17:23:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2750900 ']' 00:11:10.999 17:23:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:10.999 17:23:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:10.999 17:23:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:10.999 17:23:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:10.999 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:10.999 17:23:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:10.999 17:23:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:11.257 [2024-07-15 17:23:22.306300] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:11:11.257 [2024-07-15 17:23:22.306349] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2750900 ] 00:11:11.257 [2024-07-15 17:23:22.394307] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:11.257 [2024-07-15 17:23:22.458569] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:11.257 [2024-07-15 17:23:22.497673] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:11.257 [2024-07-15 17:23:22.497695] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:12.192 17:23:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:12.192 17:23:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:11:12.192 17:23:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:11:12.192 17:23:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:12.192 17:23:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:11:12.192 17:23:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:11:12.192 17:23:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:12.192 17:23:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:12.192 17:23:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:12.192 17:23:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:12.192 17:23:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:12.192 malloc1 00:11:12.192 17:23:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:12.451 [2024-07-15 17:23:23.513724] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:12.451 [2024-07-15 17:23:23.513758] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:12.451 [2024-07-15 17:23:23.513769] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x165fa20 00:11:12.451 [2024-07-15 17:23:23.513776] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:12.451 [2024-07-15 17:23:23.515074] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:12.451 [2024-07-15 17:23:23.515094] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:12.451 pt1 00:11:12.451 17:23:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:12.451 17:23:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:12.451 17:23:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:11:12.451 17:23:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:11:12.451 17:23:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:12.451 17:23:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:12.451 17:23:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:12.451 17:23:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:12.451 17:23:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:12.451 malloc2 00:11:12.451 17:23:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:12.709 [2024-07-15 17:23:23.896819] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:12.709 [2024-07-15 17:23:23.896847] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:12.709 [2024-07-15 17:23:23.896859] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1660040 00:11:12.709 [2024-07-15 17:23:23.896865] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:12.709 [2024-07-15 17:23:23.898048] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:12.709 [2024-07-15 17:23:23.898067] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:12.709 pt2 00:11:12.709 17:23:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:12.709 17:23:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:12.709 17:23:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:11:13.028 [2024-07-15 17:23:24.089324] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:13.028 [2024-07-15 17:23:24.090343] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:13.028 [2024-07-15 17:23:24.090450] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x180c3d0 00:11:13.028 [2024-07-15 17:23:24.090458] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:13.028 [2024-07-15 17:23:24.090609] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x180b7f0 00:11:13.028 [2024-07-15 17:23:24.090721] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x180c3d0 00:11:13.028 [2024-07-15 17:23:24.090727] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x180c3d0 00:11:13.028 [2024-07-15 17:23:24.090795] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:13.028 17:23:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:13.028 17:23:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:13.028 17:23:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:13.028 17:23:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:13.028 17:23:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:13.028 17:23:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:13.028 17:23:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:13.028 17:23:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:13.028 17:23:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:13.028 17:23:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:13.028 17:23:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:13.028 17:23:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:13.028 17:23:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:13.028 "name": "raid_bdev1", 00:11:13.028 "uuid": "ee41fc46-88b3-4e67-8ce0-097179d11c34", 00:11:13.028 "strip_size_kb": 64, 00:11:13.028 "state": "online", 00:11:13.028 "raid_level": "concat", 00:11:13.028 "superblock": true, 00:11:13.028 "num_base_bdevs": 2, 00:11:13.028 "num_base_bdevs_discovered": 2, 00:11:13.028 "num_base_bdevs_operational": 2, 00:11:13.028 "base_bdevs_list": [ 00:11:13.028 { 00:11:13.028 "name": "pt1", 00:11:13.028 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:13.028 "is_configured": true, 00:11:13.028 "data_offset": 2048, 00:11:13.028 "data_size": 63488 00:11:13.028 }, 00:11:13.028 { 00:11:13.028 "name": "pt2", 00:11:13.028 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:13.028 "is_configured": true, 00:11:13.028 "data_offset": 2048, 00:11:13.028 "data_size": 63488 00:11:13.028 } 00:11:13.028 ] 00:11:13.028 }' 00:11:13.028 17:23:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:13.028 17:23:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:13.593 17:23:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:11:13.593 17:23:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:13.593 17:23:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:13.593 17:23:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:13.593 17:23:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:13.593 17:23:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:13.593 17:23:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:13.593 17:23:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:13.852 [2024-07-15 17:23:25.035904] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:13.852 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:13.852 "name": "raid_bdev1", 00:11:13.852 "aliases": [ 00:11:13.852 "ee41fc46-88b3-4e67-8ce0-097179d11c34" 00:11:13.852 ], 00:11:13.852 "product_name": "Raid Volume", 00:11:13.852 "block_size": 512, 00:11:13.852 "num_blocks": 126976, 00:11:13.852 "uuid": 
"ee41fc46-88b3-4e67-8ce0-097179d11c34", 00:11:13.852 "assigned_rate_limits": { 00:11:13.852 "rw_ios_per_sec": 0, 00:11:13.852 "rw_mbytes_per_sec": 0, 00:11:13.852 "r_mbytes_per_sec": 0, 00:11:13.852 "w_mbytes_per_sec": 0 00:11:13.852 }, 00:11:13.852 "claimed": false, 00:11:13.852 "zoned": false, 00:11:13.852 "supported_io_types": { 00:11:13.852 "read": true, 00:11:13.852 "write": true, 00:11:13.852 "unmap": true, 00:11:13.852 "flush": true, 00:11:13.852 "reset": true, 00:11:13.852 "nvme_admin": false, 00:11:13.852 "nvme_io": false, 00:11:13.852 "nvme_io_md": false, 00:11:13.852 "write_zeroes": true, 00:11:13.852 "zcopy": false, 00:11:13.852 "get_zone_info": false, 00:11:13.852 "zone_management": false, 00:11:13.852 "zone_append": false, 00:11:13.852 "compare": false, 00:11:13.852 "compare_and_write": false, 00:11:13.852 "abort": false, 00:11:13.852 "seek_hole": false, 00:11:13.852 "seek_data": false, 00:11:13.852 "copy": false, 00:11:13.852 "nvme_iov_md": false 00:11:13.852 }, 00:11:13.852 "memory_domains": [ 00:11:13.852 { 00:11:13.852 "dma_device_id": "system", 00:11:13.852 "dma_device_type": 1 00:11:13.852 }, 00:11:13.852 { 00:11:13.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:13.852 "dma_device_type": 2 00:11:13.852 }, 00:11:13.852 { 00:11:13.852 "dma_device_id": "system", 00:11:13.852 "dma_device_type": 1 00:11:13.852 }, 00:11:13.852 { 00:11:13.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:13.852 "dma_device_type": 2 00:11:13.852 } 00:11:13.852 ], 00:11:13.852 "driver_specific": { 00:11:13.852 "raid": { 00:11:13.852 "uuid": "ee41fc46-88b3-4e67-8ce0-097179d11c34", 00:11:13.852 "strip_size_kb": 64, 00:11:13.852 "state": "online", 00:11:13.852 "raid_level": "concat", 00:11:13.852 "superblock": true, 00:11:13.852 "num_base_bdevs": 2, 00:11:13.852 "num_base_bdevs_discovered": 2, 00:11:13.852 "num_base_bdevs_operational": 2, 00:11:13.852 "base_bdevs_list": [ 00:11:13.852 { 00:11:13.852 "name": "pt1", 00:11:13.852 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:13.852 "is_configured": true, 00:11:13.852 "data_offset": 2048, 00:11:13.852 "data_size": 63488 00:11:13.852 }, 00:11:13.852 { 00:11:13.852 "name": "pt2", 00:11:13.852 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:13.852 "is_configured": true, 00:11:13.852 "data_offset": 2048, 00:11:13.852 "data_size": 63488 00:11:13.852 } 00:11:13.852 ] 00:11:13.852 } 00:11:13.852 } 00:11:13.852 }' 00:11:13.852 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:13.852 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:13.852 pt2' 00:11:13.852 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:13.852 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:13.852 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:14.111 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:14.111 "name": "pt1", 00:11:14.111 "aliases": [ 00:11:14.111 "00000000-0000-0000-0000-000000000001" 00:11:14.111 ], 00:11:14.111 "product_name": "passthru", 00:11:14.111 "block_size": 512, 00:11:14.111 "num_blocks": 65536, 00:11:14.111 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:14.111 "assigned_rate_limits": { 00:11:14.111 
"rw_ios_per_sec": 0, 00:11:14.111 "rw_mbytes_per_sec": 0, 00:11:14.111 "r_mbytes_per_sec": 0, 00:11:14.111 "w_mbytes_per_sec": 0 00:11:14.111 }, 00:11:14.111 "claimed": true, 00:11:14.111 "claim_type": "exclusive_write", 00:11:14.111 "zoned": false, 00:11:14.111 "supported_io_types": { 00:11:14.111 "read": true, 00:11:14.111 "write": true, 00:11:14.111 "unmap": true, 00:11:14.111 "flush": true, 00:11:14.111 "reset": true, 00:11:14.111 "nvme_admin": false, 00:11:14.111 "nvme_io": false, 00:11:14.111 "nvme_io_md": false, 00:11:14.111 "write_zeroes": true, 00:11:14.111 "zcopy": true, 00:11:14.111 "get_zone_info": false, 00:11:14.111 "zone_management": false, 00:11:14.111 "zone_append": false, 00:11:14.111 "compare": false, 00:11:14.111 "compare_and_write": false, 00:11:14.111 "abort": true, 00:11:14.111 "seek_hole": false, 00:11:14.111 "seek_data": false, 00:11:14.111 "copy": true, 00:11:14.111 "nvme_iov_md": false 00:11:14.111 }, 00:11:14.111 "memory_domains": [ 00:11:14.111 { 00:11:14.111 "dma_device_id": "system", 00:11:14.111 "dma_device_type": 1 00:11:14.111 }, 00:11:14.111 { 00:11:14.111 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:14.111 "dma_device_type": 2 00:11:14.112 } 00:11:14.112 ], 00:11:14.112 "driver_specific": { 00:11:14.112 "passthru": { 00:11:14.112 "name": "pt1", 00:11:14.112 "base_bdev_name": "malloc1" 00:11:14.112 } 00:11:14.112 } 00:11:14.112 }' 00:11:14.112 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:14.112 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:14.112 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:14.112 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:14.371 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:14.371 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:14.371 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:14.371 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:14.371 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:14.371 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:14.371 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:14.371 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:14.371 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:14.371 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:14.371 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:14.631 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:14.631 "name": "pt2", 00:11:14.631 "aliases": [ 00:11:14.631 "00000000-0000-0000-0000-000000000002" 00:11:14.631 ], 00:11:14.631 "product_name": "passthru", 00:11:14.631 "block_size": 512, 00:11:14.631 "num_blocks": 65536, 00:11:14.631 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:14.631 "assigned_rate_limits": { 00:11:14.631 "rw_ios_per_sec": 0, 00:11:14.631 "rw_mbytes_per_sec": 0, 00:11:14.631 "r_mbytes_per_sec": 0, 00:11:14.631 "w_mbytes_per_sec": 0 
00:11:14.631 }, 00:11:14.631 "claimed": true, 00:11:14.631 "claim_type": "exclusive_write", 00:11:14.631 "zoned": false, 00:11:14.631 "supported_io_types": { 00:11:14.631 "read": true, 00:11:14.631 "write": true, 00:11:14.631 "unmap": true, 00:11:14.631 "flush": true, 00:11:14.631 "reset": true, 00:11:14.631 "nvme_admin": false, 00:11:14.631 "nvme_io": false, 00:11:14.631 "nvme_io_md": false, 00:11:14.631 "write_zeroes": true, 00:11:14.631 "zcopy": true, 00:11:14.631 "get_zone_info": false, 00:11:14.631 "zone_management": false, 00:11:14.631 "zone_append": false, 00:11:14.631 "compare": false, 00:11:14.631 "compare_and_write": false, 00:11:14.631 "abort": true, 00:11:14.631 "seek_hole": false, 00:11:14.631 "seek_data": false, 00:11:14.631 "copy": true, 00:11:14.631 "nvme_iov_md": false 00:11:14.631 }, 00:11:14.631 "memory_domains": [ 00:11:14.631 { 00:11:14.631 "dma_device_id": "system", 00:11:14.631 "dma_device_type": 1 00:11:14.631 }, 00:11:14.631 { 00:11:14.631 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:14.631 "dma_device_type": 2 00:11:14.631 } 00:11:14.631 ], 00:11:14.631 "driver_specific": { 00:11:14.631 "passthru": { 00:11:14.631 "name": "pt2", 00:11:14.631 "base_bdev_name": "malloc2" 00:11:14.631 } 00:11:14.631 } 00:11:14.631 }' 00:11:14.631 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:14.631 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:14.631 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:14.631 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:14.890 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:14.890 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:14.890 17:23:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:14.890 17:23:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:14.890 17:23:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:14.890 17:23:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:14.890 17:23:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:14.890 17:23:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:14.890 17:23:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:14.890 17:23:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:11:15.150 [2024-07-15 17:23:26.323149] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:15.150 17:23:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=ee41fc46-88b3-4e67-8ce0-097179d11c34 00:11:15.150 17:23:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z ee41fc46-88b3-4e67-8ce0-097179d11c34 ']' 00:11:15.150 17:23:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:15.411 [2024-07-15 17:23:26.515437] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:15.411 [2024-07-15 17:23:26.515448] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:11:15.411 [2024-07-15 17:23:26.515485] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:15.411 [2024-07-15 17:23:26.515519] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:15.411 [2024-07-15 17:23:26.515525] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x180c3d0 name raid_bdev1, state offline 00:11:15.411 17:23:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:15.411 17:23:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:11:15.672 17:23:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:11:15.672 17:23:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:11:15.672 17:23:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:15.672 17:23:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:15.672 17:23:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:15.672 17:23:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:15.932 17:23:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:11:15.932 17:23:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:11:16.192 17:23:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:11:16.192 17:23:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:11:16.192 17:23:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:11:16.192 17:23:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:11:16.192 17:23:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:16.192 17:23:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:16.192 17:23:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:16.192 17:23:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:16.192 17:23:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:16.192 17:23:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:16.192 17:23:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:16.192 17:23:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:11:16.192 17:23:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:11:16.192 [2024-07-15 17:23:27.477834] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:11:16.192 [2024-07-15 17:23:27.478890] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:11:16.192 [2024-07-15 17:23:27.478930] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:11:16.192 [2024-07-15 17:23:27.478956] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:11:16.192 [2024-07-15 17:23:27.478966] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:16.192 [2024-07-15 17:23:27.478971] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x165f070 name raid_bdev1, state configuring 00:11:16.192 request: 00:11:16.192 { 00:11:16.192 "name": "raid_bdev1", 00:11:16.192 "raid_level": "concat", 00:11:16.192 "base_bdevs": [ 00:11:16.192 "malloc1", 00:11:16.192 "malloc2" 00:11:16.192 ], 00:11:16.192 "strip_size_kb": 64, 00:11:16.192 "superblock": false, 00:11:16.192 "method": "bdev_raid_create", 00:11:16.192 "req_id": 1 00:11:16.192 } 00:11:16.192 Got JSON-RPC error response 00:11:16.192 response: 00:11:16.192 { 00:11:16.192 "code": -17, 00:11:16.192 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:11:16.192 } 00:11:16.451 17:23:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:11:16.451 17:23:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:11:16.451 17:23:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:11:16.451 17:23:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:11:16.451 17:23:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:16.451 17:23:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:11:16.451 17:23:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:11:16.451 17:23:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:11:16.451 17:23:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:16.743 [2024-07-15 17:23:27.850734] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:16.743 [2024-07-15 17:23:27.850756] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:16.743 [2024-07-15 17:23:27.850765] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1660e00 00:11:16.743 [2024-07-15 17:23:27.850771] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:16.743 [2024-07-15 17:23:27.852005] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:16.743 [2024-07-15 17:23:27.852025] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:16.743 [2024-07-15 17:23:27.852065] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:11:16.743 [2024-07-15 17:23:27.852082] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:16.743 pt1 00:11:16.743 17:23:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:11:16.743 17:23:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:16.743 17:23:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:16.743 17:23:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:16.743 17:23:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:16.743 17:23:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:16.743 17:23:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:16.743 17:23:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:16.743 17:23:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:16.743 17:23:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:16.743 17:23:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:16.743 17:23:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:17.003 17:23:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:17.003 "name": "raid_bdev1", 00:11:17.003 "uuid": "ee41fc46-88b3-4e67-8ce0-097179d11c34", 00:11:17.003 "strip_size_kb": 64, 00:11:17.003 "state": "configuring", 00:11:17.004 "raid_level": "concat", 00:11:17.004 "superblock": true, 00:11:17.004 "num_base_bdevs": 2, 00:11:17.004 "num_base_bdevs_discovered": 1, 00:11:17.004 "num_base_bdevs_operational": 2, 00:11:17.004 "base_bdevs_list": [ 00:11:17.004 { 00:11:17.004 "name": "pt1", 00:11:17.004 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:17.004 "is_configured": true, 00:11:17.004 "data_offset": 2048, 00:11:17.004 "data_size": 63488 00:11:17.004 }, 00:11:17.004 { 00:11:17.004 "name": null, 00:11:17.004 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:17.004 "is_configured": false, 00:11:17.004 "data_offset": 2048, 00:11:17.004 "data_size": 63488 00:11:17.004 } 00:11:17.004 ] 00:11:17.004 }' 00:11:17.004 17:23:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:17.004 17:23:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:17.575 17:23:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:11:17.575 17:23:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:11:17.575 17:23:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:17.575 17:23:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:17.575 [2024-07-15 17:23:28.757134] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:17.575 [2024-07-15 17:23:28.757165] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:17.575 [2024-07-15 17:23:28.757180] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16603e0 00:11:17.575 [2024-07-15 17:23:28.757187] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:17.575 [2024-07-15 17:23:28.757442] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:17.575 [2024-07-15 17:23:28.757454] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:17.575 [2024-07-15 17:23:28.757493] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:17.575 [2024-07-15 17:23:28.757506] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:17.575 [2024-07-15 17:23:28.757576] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x165e680 00:11:17.575 [2024-07-15 17:23:28.757583] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:17.575 [2024-07-15 17:23:28.757722] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x180b7f0 00:11:17.575 [2024-07-15 17:23:28.757818] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x165e680 00:11:17.575 [2024-07-15 17:23:28.757823] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x165e680 00:11:17.575 [2024-07-15 17:23:28.757894] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:17.575 pt2 00:11:17.575 17:23:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:11:17.575 17:23:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:17.575 17:23:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:17.575 17:23:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:17.575 17:23:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:17.575 17:23:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:17.575 17:23:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:17.575 17:23:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:17.575 17:23:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:17.575 17:23:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:17.575 17:23:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:17.575 17:23:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:17.575 17:23:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:17.576 17:23:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:17.836 17:23:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:11:17.836 "name": "raid_bdev1", 00:11:17.836 "uuid": "ee41fc46-88b3-4e67-8ce0-097179d11c34", 00:11:17.836 "strip_size_kb": 64, 00:11:17.836 "state": "online", 00:11:17.836 "raid_level": "concat", 00:11:17.836 "superblock": true, 00:11:17.836 "num_base_bdevs": 2, 00:11:17.836 "num_base_bdevs_discovered": 2, 00:11:17.836 "num_base_bdevs_operational": 2, 00:11:17.836 "base_bdevs_list": [ 00:11:17.836 { 00:11:17.836 "name": "pt1", 00:11:17.836 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:17.836 "is_configured": true, 00:11:17.836 "data_offset": 2048, 00:11:17.836 "data_size": 63488 00:11:17.836 }, 00:11:17.836 { 00:11:17.836 "name": "pt2", 00:11:17.836 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:17.836 "is_configured": true, 00:11:17.836 "data_offset": 2048, 00:11:17.836 "data_size": 63488 00:11:17.836 } 00:11:17.836 ] 00:11:17.836 }' 00:11:17.836 17:23:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:17.836 17:23:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:18.407 17:23:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:11:18.407 17:23:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:18.407 17:23:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:18.407 17:23:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:18.407 17:23:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:18.407 17:23:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:18.407 17:23:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:18.407 17:23:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:18.407 [2024-07-15 17:23:29.695707] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:18.667 17:23:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:18.667 "name": "raid_bdev1", 00:11:18.667 "aliases": [ 00:11:18.667 "ee41fc46-88b3-4e67-8ce0-097179d11c34" 00:11:18.667 ], 00:11:18.667 "product_name": "Raid Volume", 00:11:18.667 "block_size": 512, 00:11:18.667 "num_blocks": 126976, 00:11:18.667 "uuid": "ee41fc46-88b3-4e67-8ce0-097179d11c34", 00:11:18.667 "assigned_rate_limits": { 00:11:18.667 "rw_ios_per_sec": 0, 00:11:18.667 "rw_mbytes_per_sec": 0, 00:11:18.667 "r_mbytes_per_sec": 0, 00:11:18.667 "w_mbytes_per_sec": 0 00:11:18.667 }, 00:11:18.667 "claimed": false, 00:11:18.667 "zoned": false, 00:11:18.667 "supported_io_types": { 00:11:18.667 "read": true, 00:11:18.667 "write": true, 00:11:18.667 "unmap": true, 00:11:18.667 "flush": true, 00:11:18.667 "reset": true, 00:11:18.667 "nvme_admin": false, 00:11:18.667 "nvme_io": false, 00:11:18.667 "nvme_io_md": false, 00:11:18.667 "write_zeroes": true, 00:11:18.667 "zcopy": false, 00:11:18.667 "get_zone_info": false, 00:11:18.667 "zone_management": false, 00:11:18.667 "zone_append": false, 00:11:18.667 "compare": false, 00:11:18.667 "compare_and_write": false, 00:11:18.667 "abort": false, 00:11:18.667 "seek_hole": false, 00:11:18.667 "seek_data": false, 00:11:18.667 "copy": false, 00:11:18.667 "nvme_iov_md": false 00:11:18.667 }, 00:11:18.667 "memory_domains": [ 00:11:18.667 { 00:11:18.667 "dma_device_id": 
"system", 00:11:18.667 "dma_device_type": 1 00:11:18.667 }, 00:11:18.667 { 00:11:18.667 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:18.667 "dma_device_type": 2 00:11:18.667 }, 00:11:18.667 { 00:11:18.667 "dma_device_id": "system", 00:11:18.667 "dma_device_type": 1 00:11:18.667 }, 00:11:18.667 { 00:11:18.667 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:18.667 "dma_device_type": 2 00:11:18.667 } 00:11:18.667 ], 00:11:18.667 "driver_specific": { 00:11:18.667 "raid": { 00:11:18.667 "uuid": "ee41fc46-88b3-4e67-8ce0-097179d11c34", 00:11:18.667 "strip_size_kb": 64, 00:11:18.667 "state": "online", 00:11:18.667 "raid_level": "concat", 00:11:18.667 "superblock": true, 00:11:18.667 "num_base_bdevs": 2, 00:11:18.667 "num_base_bdevs_discovered": 2, 00:11:18.667 "num_base_bdevs_operational": 2, 00:11:18.667 "base_bdevs_list": [ 00:11:18.667 { 00:11:18.667 "name": "pt1", 00:11:18.667 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:18.667 "is_configured": true, 00:11:18.667 "data_offset": 2048, 00:11:18.667 "data_size": 63488 00:11:18.667 }, 00:11:18.667 { 00:11:18.667 "name": "pt2", 00:11:18.667 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:18.667 "is_configured": true, 00:11:18.667 "data_offset": 2048, 00:11:18.667 "data_size": 63488 00:11:18.667 } 00:11:18.667 ] 00:11:18.667 } 00:11:18.667 } 00:11:18.667 }' 00:11:18.667 17:23:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:18.667 17:23:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:18.667 pt2' 00:11:18.667 17:23:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:18.667 17:23:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:18.667 17:23:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:18.667 17:23:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:18.667 "name": "pt1", 00:11:18.667 "aliases": [ 00:11:18.667 "00000000-0000-0000-0000-000000000001" 00:11:18.667 ], 00:11:18.667 "product_name": "passthru", 00:11:18.667 "block_size": 512, 00:11:18.667 "num_blocks": 65536, 00:11:18.667 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:18.667 "assigned_rate_limits": { 00:11:18.667 "rw_ios_per_sec": 0, 00:11:18.667 "rw_mbytes_per_sec": 0, 00:11:18.667 "r_mbytes_per_sec": 0, 00:11:18.667 "w_mbytes_per_sec": 0 00:11:18.667 }, 00:11:18.667 "claimed": true, 00:11:18.667 "claim_type": "exclusive_write", 00:11:18.667 "zoned": false, 00:11:18.667 "supported_io_types": { 00:11:18.667 "read": true, 00:11:18.667 "write": true, 00:11:18.667 "unmap": true, 00:11:18.667 "flush": true, 00:11:18.667 "reset": true, 00:11:18.667 "nvme_admin": false, 00:11:18.667 "nvme_io": false, 00:11:18.667 "nvme_io_md": false, 00:11:18.667 "write_zeroes": true, 00:11:18.667 "zcopy": true, 00:11:18.667 "get_zone_info": false, 00:11:18.667 "zone_management": false, 00:11:18.667 "zone_append": false, 00:11:18.667 "compare": false, 00:11:18.667 "compare_and_write": false, 00:11:18.667 "abort": true, 00:11:18.667 "seek_hole": false, 00:11:18.667 "seek_data": false, 00:11:18.667 "copy": true, 00:11:18.667 "nvme_iov_md": false 00:11:18.667 }, 00:11:18.667 "memory_domains": [ 00:11:18.667 { 00:11:18.667 "dma_device_id": "system", 00:11:18.667 "dma_device_type": 1 00:11:18.667 }, 
00:11:18.667 { 00:11:18.667 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:18.667 "dma_device_type": 2 00:11:18.667 } 00:11:18.667 ], 00:11:18.667 "driver_specific": { 00:11:18.667 "passthru": { 00:11:18.667 "name": "pt1", 00:11:18.667 "base_bdev_name": "malloc1" 00:11:18.667 } 00:11:18.667 } 00:11:18.667 }' 00:11:18.667 17:23:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:18.927 17:23:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:18.927 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:18.927 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:18.927 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:18.927 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:18.927 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:18.927 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:18.927 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:18.927 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:19.187 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:19.187 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:19.187 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:19.187 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:19.187 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:19.447 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:19.447 "name": "pt2", 00:11:19.447 "aliases": [ 00:11:19.447 "00000000-0000-0000-0000-000000000002" 00:11:19.447 ], 00:11:19.447 "product_name": "passthru", 00:11:19.447 "block_size": 512, 00:11:19.447 "num_blocks": 65536, 00:11:19.447 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:19.448 "assigned_rate_limits": { 00:11:19.448 "rw_ios_per_sec": 0, 00:11:19.448 "rw_mbytes_per_sec": 0, 00:11:19.448 "r_mbytes_per_sec": 0, 00:11:19.448 "w_mbytes_per_sec": 0 00:11:19.448 }, 00:11:19.448 "claimed": true, 00:11:19.448 "claim_type": "exclusive_write", 00:11:19.448 "zoned": false, 00:11:19.448 "supported_io_types": { 00:11:19.448 "read": true, 00:11:19.448 "write": true, 00:11:19.448 "unmap": true, 00:11:19.448 "flush": true, 00:11:19.448 "reset": true, 00:11:19.448 "nvme_admin": false, 00:11:19.448 "nvme_io": false, 00:11:19.448 "nvme_io_md": false, 00:11:19.448 "write_zeroes": true, 00:11:19.448 "zcopy": true, 00:11:19.448 "get_zone_info": false, 00:11:19.448 "zone_management": false, 00:11:19.448 "zone_append": false, 00:11:19.448 "compare": false, 00:11:19.448 "compare_and_write": false, 00:11:19.448 "abort": true, 00:11:19.448 "seek_hole": false, 00:11:19.448 "seek_data": false, 00:11:19.448 "copy": true, 00:11:19.448 "nvme_iov_md": false 00:11:19.448 }, 00:11:19.448 "memory_domains": [ 00:11:19.448 { 00:11:19.448 "dma_device_id": "system", 00:11:19.448 "dma_device_type": 1 00:11:19.448 }, 00:11:19.448 { 00:11:19.448 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:19.448 "dma_device_type": 2 00:11:19.448 } 00:11:19.448 ], 
00:11:19.448 "driver_specific": { 00:11:19.448 "passthru": { 00:11:19.448 "name": "pt2", 00:11:19.448 "base_bdev_name": "malloc2" 00:11:19.448 } 00:11:19.448 } 00:11:19.448 }' 00:11:19.448 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:19.448 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:19.448 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:19.448 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:19.448 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:19.448 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:19.448 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:19.448 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:19.448 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:19.448 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:19.708 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:19.708 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:19.708 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:19.708 17:23:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:11:19.708 [2024-07-15 17:23:30.998982] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:19.969 17:23:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' ee41fc46-88b3-4e67-8ce0-097179d11c34 '!=' ee41fc46-88b3-4e67-8ce0-097179d11c34 ']' 00:11:19.969 17:23:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:11:19.969 17:23:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:19.969 17:23:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:19.969 17:23:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2750900 00:11:19.969 17:23:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2750900 ']' 00:11:19.969 17:23:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2750900 00:11:19.969 17:23:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:11:19.969 17:23:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:19.969 17:23:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2750900 00:11:19.969 17:23:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:19.969 17:23:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:19.969 17:23:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2750900' 00:11:19.969 killing process with pid 2750900 00:11:19.969 17:23:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2750900 00:11:19.969 [2024-07-15 17:23:31.071951] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:19.969 [2024-07-15 
17:23:31.071990] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:19.969 [2024-07-15 17:23:31.072018] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:19.969 [2024-07-15 17:23:31.072024] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x165e680 name raid_bdev1, state offline 00:11:19.969 17:23:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2750900 00:11:19.969 [2024-07-15 17:23:31.080997] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:19.969 17:23:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:11:19.969 00:11:19.969 real 0m8.949s 00:11:19.969 user 0m16.297s 00:11:19.969 sys 0m1.360s 00:11:19.969 17:23:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:19.969 17:23:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:19.969 ************************************ 00:11:19.969 END TEST raid_superblock_test 00:11:19.969 ************************************ 00:11:19.969 17:23:31 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:19.969 17:23:31 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:11:19.969 17:23:31 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:19.969 17:23:31 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:19.969 17:23:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:19.969 ************************************ 00:11:19.969 START TEST raid_read_error_test 00:11:19.969 ************************************ 00:11:19.969 17:23:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 read 00:11:19.969 17:23:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:20.230 17:23:31 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.5zKlKljDpz 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2752539 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2752539 /var/tmp/spdk-raid.sock 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2752539 ']' 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:20.230 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:20.230 17:23:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:20.230 [2024-07-15 17:23:31.333805] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:11:20.230 [2024-07-15 17:23:31.333858] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2752539 ] 00:11:20.230 [2024-07-15 17:23:31.422726] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:20.230 [2024-07-15 17:23:31.489720] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:20.490 [2024-07-15 17:23:31.540600] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:20.490 [2024-07-15 17:23:31.540626] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:21.060 17:23:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:21.060 17:23:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:21.060 17:23:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:21.060 17:23:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:21.060 BaseBdev1_malloc 00:11:21.320 17:23:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:21.320 true 00:11:21.320 17:23:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:21.580 [2024-07-15 17:23:32.728109] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:21.581 [2024-07-15 17:23:32.728140] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:21.581 [2024-07-15 17:23:32.728151] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f1cb50 00:11:21.581 [2024-07-15 17:23:32.728158] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:21.581 [2024-07-15 17:23:32.729485] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:21.581 [2024-07-15 17:23:32.729506] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:21.581 BaseBdev1 00:11:21.581 17:23:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:21.581 17:23:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:21.840 BaseBdev2_malloc 00:11:21.840 17:23:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:21.840 true 00:11:21.840 17:23:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:22.099 [2024-07-15 17:23:33.295548] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:22.099 [2024-07-15 17:23:33.295577] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:22.099 [2024-07-15 17:23:33.295587] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f00ea0 00:11:22.099 [2024-07-15 17:23:33.295594] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:22.099 [2024-07-15 17:23:33.296796] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:22.099 [2024-07-15 17:23:33.296814] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:22.099 BaseBdev2 00:11:22.099 17:23:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:22.358 [2024-07-15 17:23:33.484054] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:22.358 [2024-07-15 17:23:33.485072] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:22.358 [2024-07-15 17:23:33.485208] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d6a360 00:11:22.358 [2024-07-15 17:23:33.485216] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:22.358 [2024-07-15 17:23:33.485362] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f05090 00:11:22.358 [2024-07-15 17:23:33.485473] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d6a360 00:11:22.358 [2024-07-15 17:23:33.485478] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d6a360 00:11:22.358 [2024-07-15 17:23:33.485554] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:22.358 17:23:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:22.358 17:23:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:22.358 17:23:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:22.358 17:23:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:22.358 17:23:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:22.358 17:23:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:22.358 17:23:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:22.358 17:23:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:22.358 17:23:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:22.358 17:23:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:22.358 17:23:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:22.358 17:23:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:22.617 17:23:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:22.617 "name": "raid_bdev1", 00:11:22.617 "uuid": "0f6d17f6-ff2b-4b22-9eca-13fb3f73d0c8", 00:11:22.617 "strip_size_kb": 64, 00:11:22.617 "state": "online", 00:11:22.617 "raid_level": 
"concat", 00:11:22.617 "superblock": true, 00:11:22.617 "num_base_bdevs": 2, 00:11:22.617 "num_base_bdevs_discovered": 2, 00:11:22.617 "num_base_bdevs_operational": 2, 00:11:22.617 "base_bdevs_list": [ 00:11:22.617 { 00:11:22.617 "name": "BaseBdev1", 00:11:22.617 "uuid": "5b5f7c7f-b50a-56d3-94f6-e175aebe63eb", 00:11:22.617 "is_configured": true, 00:11:22.617 "data_offset": 2048, 00:11:22.617 "data_size": 63488 00:11:22.617 }, 00:11:22.617 { 00:11:22.617 "name": "BaseBdev2", 00:11:22.617 "uuid": "eb4518ec-d5e9-50c0-bf8a-b96c3a7107b2", 00:11:22.617 "is_configured": true, 00:11:22.617 "data_offset": 2048, 00:11:22.617 "data_size": 63488 00:11:22.617 } 00:11:22.617 ] 00:11:22.617 }' 00:11:22.617 17:23:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:22.617 17:23:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:23.185 17:23:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:23.185 17:23:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:23.185 [2024-07-15 17:23:34.334405] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f04fd0 00:11:24.125 17:23:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:11:24.386 17:23:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:24.386 17:23:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:11:24.386 17:23:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:24.386 17:23:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:24.386 17:23:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:24.386 17:23:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:24.386 17:23:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:24.386 17:23:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:24.386 17:23:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:24.386 17:23:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:24.386 17:23:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:24.386 17:23:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:24.386 17:23:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:24.386 17:23:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:24.386 17:23:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:24.386 17:23:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:24.386 "name": "raid_bdev1", 00:11:24.386 "uuid": "0f6d17f6-ff2b-4b22-9eca-13fb3f73d0c8", 00:11:24.386 "strip_size_kb": 64, 00:11:24.386 "state": "online", 
00:11:24.386 "raid_level": "concat", 00:11:24.386 "superblock": true, 00:11:24.386 "num_base_bdevs": 2, 00:11:24.386 "num_base_bdevs_discovered": 2, 00:11:24.386 "num_base_bdevs_operational": 2, 00:11:24.386 "base_bdevs_list": [ 00:11:24.386 { 00:11:24.386 "name": "BaseBdev1", 00:11:24.386 "uuid": "5b5f7c7f-b50a-56d3-94f6-e175aebe63eb", 00:11:24.386 "is_configured": true, 00:11:24.386 "data_offset": 2048, 00:11:24.386 "data_size": 63488 00:11:24.386 }, 00:11:24.386 { 00:11:24.386 "name": "BaseBdev2", 00:11:24.386 "uuid": "eb4518ec-d5e9-50c0-bf8a-b96c3a7107b2", 00:11:24.386 "is_configured": true, 00:11:24.386 "data_offset": 2048, 00:11:24.386 "data_size": 63488 00:11:24.386 } 00:11:24.386 ] 00:11:24.386 }' 00:11:24.386 17:23:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:24.386 17:23:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:24.957 17:23:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:25.216 [2024-07-15 17:23:36.382581] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:25.216 [2024-07-15 17:23:36.382607] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:25.216 [2024-07-15 17:23:36.385186] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:25.216 [2024-07-15 17:23:36.385210] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:25.216 [2024-07-15 17:23:36.385229] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:25.216 [2024-07-15 17:23:36.385235] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d6a360 name raid_bdev1, state offline 00:11:25.216 0 00:11:25.216 17:23:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2752539 00:11:25.216 17:23:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2752539 ']' 00:11:25.216 17:23:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2752539 00:11:25.216 17:23:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:11:25.216 17:23:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:25.216 17:23:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2752539 00:11:25.216 17:23:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:25.216 17:23:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:25.216 17:23:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2752539' 00:11:25.216 killing process with pid 2752539 00:11:25.216 17:23:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2752539 00:11:25.216 [2024-07-15 17:23:36.448852] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:25.216 17:23:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2752539 00:11:25.216 [2024-07-15 17:23:36.454364] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:25.477 17:23:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.5zKlKljDpz 00:11:25.477 17:23:36 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:25.477 17:23:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:25.477 17:23:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:11:25.477 17:23:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:11:25.477 17:23:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:25.477 17:23:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:25.477 17:23:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:11:25.477 00:11:25.477 real 0m5.321s 00:11:25.477 user 0m8.376s 00:11:25.477 sys 0m0.741s 00:11:25.477 17:23:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:25.477 17:23:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:25.477 ************************************ 00:11:25.478 END TEST raid_read_error_test 00:11:25.478 ************************************ 00:11:25.478 17:23:36 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:25.478 17:23:36 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:11:25.478 17:23:36 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:25.478 17:23:36 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:25.478 17:23:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:25.478 ************************************ 00:11:25.478 START TEST raid_write_error_test 00:11:25.478 ************************************ 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 write 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:25.478 17:23:36 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.cYNrpAClzp 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2753518 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2753518 /var/tmp/spdk-raid.sock 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2753518 ']' 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:25.478 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:25.478 17:23:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:25.478 [2024-07-15 17:23:36.731292] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:11:25.478 [2024-07-15 17:23:36.731349] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2753518 ] 00:11:25.739 [2024-07-15 17:23:36.820875] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:25.739 [2024-07-15 17:23:36.889903] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:25.739 [2024-07-15 17:23:36.935147] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:25.739 [2024-07-15 17:23:36.935173] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:26.309 17:23:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:26.309 17:23:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:26.309 17:23:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:26.309 17:23:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:26.569 BaseBdev1_malloc 00:11:26.569 17:23:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:26.830 true 00:11:26.830 17:23:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:26.830 [2024-07-15 17:23:38.110368] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:26.830 [2024-07-15 17:23:38.110401] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:26.830 [2024-07-15 17:23:38.110415] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xab5b50 00:11:26.830 [2024-07-15 17:23:38.110421] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:26.830 [2024-07-15 17:23:38.111763] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:26.830 [2024-07-15 17:23:38.111783] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:26.830 BaseBdev1 00:11:26.830 17:23:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:26.830 17:23:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:27.091 BaseBdev2_malloc 00:11:27.091 17:23:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:27.388 true 00:11:27.388 17:23:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:27.388 [2024-07-15 17:23:38.677756] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:27.388 [2024-07-15 17:23:38.677784] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:27.388 [2024-07-15 17:23:38.677795] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa99ea0 00:11:27.388 [2024-07-15 17:23:38.677801] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:27.388 [2024-07-15 17:23:38.678991] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:27.388 [2024-07-15 17:23:38.679011] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:27.388 BaseBdev2 00:11:27.648 17:23:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:27.648 [2024-07-15 17:23:38.866251] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:27.648 [2024-07-15 17:23:38.867270] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:27.648 [2024-07-15 17:23:38.867415] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x903360 00:11:27.648 [2024-07-15 17:23:38.867423] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:27.648 [2024-07-15 17:23:38.867569] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa9e090 00:11:27.648 [2024-07-15 17:23:38.867683] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x903360 00:11:27.648 [2024-07-15 17:23:38.867689] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x903360 00:11:27.648 [2024-07-15 17:23:38.867774] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:27.648 17:23:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:27.648 17:23:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:27.648 17:23:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:27.648 17:23:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:27.648 17:23:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:27.648 17:23:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:27.648 17:23:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:27.648 17:23:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:27.648 17:23:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:27.648 17:23:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:27.648 17:23:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:27.648 17:23:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:27.908 17:23:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:27.908 "name": "raid_bdev1", 00:11:27.908 "uuid": "c907f5b2-94db-42ff-aec6-edbb8db1c5ab", 00:11:27.908 "strip_size_kb": 64, 00:11:27.908 "state": "online", 00:11:27.908 
"raid_level": "concat", 00:11:27.908 "superblock": true, 00:11:27.908 "num_base_bdevs": 2, 00:11:27.908 "num_base_bdevs_discovered": 2, 00:11:27.908 "num_base_bdevs_operational": 2, 00:11:27.908 "base_bdevs_list": [ 00:11:27.908 { 00:11:27.908 "name": "BaseBdev1", 00:11:27.908 "uuid": "19d21181-26b1-5157-8756-dac6e9747055", 00:11:27.908 "is_configured": true, 00:11:27.908 "data_offset": 2048, 00:11:27.908 "data_size": 63488 00:11:27.908 }, 00:11:27.908 { 00:11:27.908 "name": "BaseBdev2", 00:11:27.908 "uuid": "24b0e018-6ee4-5d77-bd56-1433a019cb56", 00:11:27.908 "is_configured": true, 00:11:27.908 "data_offset": 2048, 00:11:27.908 "data_size": 63488 00:11:27.908 } 00:11:27.908 ] 00:11:27.908 }' 00:11:27.908 17:23:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:27.908 17:23:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:28.478 17:23:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:28.478 17:23:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:28.478 [2024-07-15 17:23:39.700557] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa9dfd0 00:11:29.420 17:23:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:11:29.681 17:23:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:29.681 17:23:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:11:29.681 17:23:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:29.681 17:23:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:29.681 17:23:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:29.681 17:23:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:29.681 17:23:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:29.681 17:23:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:29.681 17:23:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:29.681 17:23:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:29.681 17:23:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:29.681 17:23:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:29.681 17:23:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:29.681 17:23:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:29.681 17:23:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:29.940 17:23:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:29.940 "name": "raid_bdev1", 00:11:29.940 "uuid": "c907f5b2-94db-42ff-aec6-edbb8db1c5ab", 00:11:29.940 "strip_size_kb": 
64, 00:11:29.940 "state": "online", 00:11:29.940 "raid_level": "concat", 00:11:29.940 "superblock": true, 00:11:29.940 "num_base_bdevs": 2, 00:11:29.940 "num_base_bdevs_discovered": 2, 00:11:29.940 "num_base_bdevs_operational": 2, 00:11:29.940 "base_bdevs_list": [ 00:11:29.940 { 00:11:29.940 "name": "BaseBdev1", 00:11:29.940 "uuid": "19d21181-26b1-5157-8756-dac6e9747055", 00:11:29.940 "is_configured": true, 00:11:29.940 "data_offset": 2048, 00:11:29.940 "data_size": 63488 00:11:29.940 }, 00:11:29.940 { 00:11:29.940 "name": "BaseBdev2", 00:11:29.940 "uuid": "24b0e018-6ee4-5d77-bd56-1433a019cb56", 00:11:29.940 "is_configured": true, 00:11:29.940 "data_offset": 2048, 00:11:29.940 "data_size": 63488 00:11:29.940 } 00:11:29.940 ] 00:11:29.940 }' 00:11:29.941 17:23:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:29.941 17:23:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:30.509 17:23:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:30.509 [2024-07-15 17:23:41.725030] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:30.509 [2024-07-15 17:23:41.725066] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:30.509 [2024-07-15 17:23:41.727656] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:30.509 [2024-07-15 17:23:41.727680] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:30.509 [2024-07-15 17:23:41.727700] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:30.509 [2024-07-15 17:23:41.727706] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x903360 name raid_bdev1, state offline 00:11:30.509 0 00:11:30.509 17:23:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2753518 00:11:30.509 17:23:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2753518 ']' 00:11:30.509 17:23:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2753518 00:11:30.509 17:23:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:11:30.509 17:23:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:30.509 17:23:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2753518 00:11:30.509 17:23:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:30.509 17:23:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:30.509 17:23:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2753518' 00:11:30.509 killing process with pid 2753518 00:11:30.509 17:23:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2753518 00:11:30.509 [2024-07-15 17:23:41.795078] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:30.509 17:23:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2753518 00:11:30.509 [2024-07-15 17:23:41.800741] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:30.769 17:23:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.cYNrpAClzp 
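Taken together, the raid_write_error_test trace above reduces to a short RPC sequence: each base device is a malloc bdev wrapped in an error bdev (EE_*) and a passthru bdev, the two passthru bdevs are combined into a concat raid with a superblock, a write failure is injected on the first error bdev, bdevperf traffic is run, and the raid is torn down. A minimal sketch of that sequence, assuming a bdevperf instance is already listening on /var/tmp/spdk-raid.sock and with $SPDK_DIR standing in for the /var/jenkins/workspace/crypto-phy-autotest/spdk checkout used in the trace:

  # build the error-injection stack for each base bdev: malloc -> error (EE_*) -> passthru
  $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
  $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc
  $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
  $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc
  $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc
  $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2
  # assemble the concat raid with a superblock (-s) and strip size 64 (the strip_size_kb field above)
  $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s
  # inject write failures on the first base bdev, then drive I/O through bdevperf
  $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure
  $SPDK_DIR/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
  # tear down the raid bdev
  $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1

Because concat carries no redundancy, the test then expects a non-zero failure rate in the bdevperf output, which is the fail_per_s extraction and the [[ 0.50 != 0.00 ]] check that follow.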
00:11:30.769 17:23:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:30.769 17:23:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:30.769 17:23:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.50 00:11:30.769 17:23:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:11:30.769 17:23:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:30.769 17:23:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:30.769 17:23:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.50 != \0\.\0\0 ]] 00:11:30.769 00:11:30.769 real 0m5.273s 00:11:30.769 user 0m8.321s 00:11:30.769 sys 0m0.705s 00:11:30.769 17:23:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:30.769 17:23:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:30.769 ************************************ 00:11:30.769 END TEST raid_write_error_test 00:11:30.769 ************************************ 00:11:30.769 17:23:41 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:30.769 17:23:41 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:11:30.769 17:23:41 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:11:30.769 17:23:41 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:30.769 17:23:41 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:30.769 17:23:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:30.769 ************************************ 00:11:30.769 START TEST raid_state_function_test 00:11:30.769 ************************************ 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 false 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2754484 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2754484' 00:11:30.769 Process raid pid: 2754484 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2754484 /var/tmp/spdk-raid.sock 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2754484 ']' 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:30.769 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:30.769 17:23:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:31.028 [2024-07-15 17:23:42.079378] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
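The raid_state_function_test steps that follow are issued against a bare bdev_svc app rather than bdevperf. A minimal sketch of how that RPC target is brought up, using the paths and arguments from the trace; the socket-wait loop is only a rough stand-in for the waitforlisten helper from autotest_common.sh that the trace actually calls:

  # start the RPC target with bdev_raid debug logging enabled
  $SPDK_DIR/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
  raid_pid=$!
  # crude wait for the RPC socket (the real test uses: waitforlisten "$raid_pid" /var/tmp/spdk-raid.sock)
  while [ ! -S /var/tmp/spdk-raid.sock ]; do sleep 0.1; done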
00:11:31.028 [2024-07-15 17:23:42.079430] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:31.028 [2024-07-15 17:23:42.169556] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:31.028 [2024-07-15 17:23:42.234061] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:31.028 [2024-07-15 17:23:42.274444] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:31.028 [2024-07-15 17:23:42.274467] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:31.966 17:23:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:31.966 17:23:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:11:31.966 17:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:31.966 [2024-07-15 17:23:43.073693] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:31.966 [2024-07-15 17:23:43.073726] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:31.966 [2024-07-15 17:23:43.073732] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:31.966 [2024-07-15 17:23:43.073738] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:31.966 17:23:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:31.966 17:23:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:31.966 17:23:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:31.966 17:23:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:31.966 17:23:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:31.966 17:23:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:31.966 17:23:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:31.966 17:23:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:31.966 17:23:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:31.966 17:23:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:31.966 17:23:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:31.966 17:23:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:32.224 17:23:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:32.224 "name": "Existed_Raid", 00:11:32.224 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:32.224 "strip_size_kb": 0, 00:11:32.224 "state": "configuring", 00:11:32.224 "raid_level": "raid1", 00:11:32.224 "superblock": false, 00:11:32.224 
"num_base_bdevs": 2, 00:11:32.224 "num_base_bdevs_discovered": 0, 00:11:32.224 "num_base_bdevs_operational": 2, 00:11:32.224 "base_bdevs_list": [ 00:11:32.224 { 00:11:32.224 "name": "BaseBdev1", 00:11:32.224 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:32.224 "is_configured": false, 00:11:32.224 "data_offset": 0, 00:11:32.224 "data_size": 0 00:11:32.224 }, 00:11:32.224 { 00:11:32.224 "name": "BaseBdev2", 00:11:32.224 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:32.224 "is_configured": false, 00:11:32.224 "data_offset": 0, 00:11:32.224 "data_size": 0 00:11:32.224 } 00:11:32.224 ] 00:11:32.224 }' 00:11:32.224 17:23:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:32.224 17:23:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:32.794 17:23:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:32.794 [2024-07-15 17:23:43.987918] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:32.794 [2024-07-15 17:23:43.987935] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ca66b0 name Existed_Raid, state configuring 00:11:32.794 17:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:33.053 [2024-07-15 17:23:44.148334] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:33.053 [2024-07-15 17:23:44.148355] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:33.053 [2024-07-15 17:23:44.148360] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:33.053 [2024-07-15 17:23:44.148366] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:33.053 17:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:33.053 [2024-07-15 17:23:44.347447] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:33.053 BaseBdev1 00:11:33.324 17:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:33.324 17:23:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:33.324 17:23:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:33.324 17:23:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:33.324 17:23:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:33.324 17:23:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:33.324 17:23:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:33.324 17:23:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:33.584 [ 
00:11:33.584 { 00:11:33.584 "name": "BaseBdev1", 00:11:33.584 "aliases": [ 00:11:33.584 "4a7cfcae-8d67-48e4-8456-9159d40367f7" 00:11:33.584 ], 00:11:33.584 "product_name": "Malloc disk", 00:11:33.584 "block_size": 512, 00:11:33.584 "num_blocks": 65536, 00:11:33.584 "uuid": "4a7cfcae-8d67-48e4-8456-9159d40367f7", 00:11:33.584 "assigned_rate_limits": { 00:11:33.584 "rw_ios_per_sec": 0, 00:11:33.584 "rw_mbytes_per_sec": 0, 00:11:33.584 "r_mbytes_per_sec": 0, 00:11:33.584 "w_mbytes_per_sec": 0 00:11:33.584 }, 00:11:33.584 "claimed": true, 00:11:33.584 "claim_type": "exclusive_write", 00:11:33.584 "zoned": false, 00:11:33.584 "supported_io_types": { 00:11:33.584 "read": true, 00:11:33.584 "write": true, 00:11:33.584 "unmap": true, 00:11:33.584 "flush": true, 00:11:33.584 "reset": true, 00:11:33.584 "nvme_admin": false, 00:11:33.584 "nvme_io": false, 00:11:33.584 "nvme_io_md": false, 00:11:33.584 "write_zeroes": true, 00:11:33.584 "zcopy": true, 00:11:33.584 "get_zone_info": false, 00:11:33.584 "zone_management": false, 00:11:33.584 "zone_append": false, 00:11:33.584 "compare": false, 00:11:33.584 "compare_and_write": false, 00:11:33.584 "abort": true, 00:11:33.584 "seek_hole": false, 00:11:33.584 "seek_data": false, 00:11:33.584 "copy": true, 00:11:33.584 "nvme_iov_md": false 00:11:33.584 }, 00:11:33.584 "memory_domains": [ 00:11:33.584 { 00:11:33.584 "dma_device_id": "system", 00:11:33.584 "dma_device_type": 1 00:11:33.584 }, 00:11:33.584 { 00:11:33.584 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:33.584 "dma_device_type": 2 00:11:33.584 } 00:11:33.584 ], 00:11:33.584 "driver_specific": {} 00:11:33.584 } 00:11:33.584 ] 00:11:33.584 17:23:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:33.584 17:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:33.584 17:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:33.584 17:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:33.584 17:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:33.584 17:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:33.584 17:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:33.584 17:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:33.584 17:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:33.584 17:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:33.584 17:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:33.584 17:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:33.584 17:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:33.844 17:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:33.844 "name": "Existed_Raid", 00:11:33.844 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:33.844 "strip_size_kb": 0, 00:11:33.844 "state": "configuring", 00:11:33.844 "raid_level": "raid1", 
00:11:33.844 "superblock": false, 00:11:33.844 "num_base_bdevs": 2, 00:11:33.844 "num_base_bdevs_discovered": 1, 00:11:33.844 "num_base_bdevs_operational": 2, 00:11:33.844 "base_bdevs_list": [ 00:11:33.844 { 00:11:33.844 "name": "BaseBdev1", 00:11:33.844 "uuid": "4a7cfcae-8d67-48e4-8456-9159d40367f7", 00:11:33.844 "is_configured": true, 00:11:33.845 "data_offset": 0, 00:11:33.845 "data_size": 65536 00:11:33.845 }, 00:11:33.845 { 00:11:33.845 "name": "BaseBdev2", 00:11:33.845 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:33.845 "is_configured": false, 00:11:33.845 "data_offset": 0, 00:11:33.845 "data_size": 0 00:11:33.845 } 00:11:33.845 ] 00:11:33.845 }' 00:11:33.845 17:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:33.845 17:23:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:34.414 17:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:34.414 [2024-07-15 17:23:45.670789] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:34.414 [2024-07-15 17:23:45.670813] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ca5fa0 name Existed_Raid, state configuring 00:11:34.414 17:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:34.673 [2024-07-15 17:23:45.863305] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:34.673 [2024-07-15 17:23:45.864453] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:34.673 [2024-07-15 17:23:45.864478] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:34.673 17:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:34.673 17:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:34.673 17:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:34.673 17:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:34.673 17:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:34.673 17:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:34.673 17:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:34.673 17:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:34.673 17:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:34.673 17:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:34.673 17:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:34.673 17:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:34.673 17:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:11:34.673 17:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:34.932 17:23:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:34.932 "name": "Existed_Raid", 00:11:34.932 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:34.932 "strip_size_kb": 0, 00:11:34.932 "state": "configuring", 00:11:34.932 "raid_level": "raid1", 00:11:34.932 "superblock": false, 00:11:34.932 "num_base_bdevs": 2, 00:11:34.932 "num_base_bdevs_discovered": 1, 00:11:34.932 "num_base_bdevs_operational": 2, 00:11:34.932 "base_bdevs_list": [ 00:11:34.932 { 00:11:34.932 "name": "BaseBdev1", 00:11:34.932 "uuid": "4a7cfcae-8d67-48e4-8456-9159d40367f7", 00:11:34.932 "is_configured": true, 00:11:34.932 "data_offset": 0, 00:11:34.932 "data_size": 65536 00:11:34.932 }, 00:11:34.932 { 00:11:34.932 "name": "BaseBdev2", 00:11:34.932 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:34.932 "is_configured": false, 00:11:34.932 "data_offset": 0, 00:11:34.932 "data_size": 0 00:11:34.932 } 00:11:34.932 ] 00:11:34.932 }' 00:11:34.932 17:23:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:34.932 17:23:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:35.500 17:23:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:35.759 [2024-07-15 17:23:46.826604] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:35.759 [2024-07-15 17:23:46.826633] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ca6d90 00:11:35.759 [2024-07-15 17:23:46.826637] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:11:35.759 [2024-07-15 17:23:46.826788] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e4a850 00:11:35.759 [2024-07-15 17:23:46.826883] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ca6d90 00:11:35.759 [2024-07-15 17:23:46.826889] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1ca6d90 00:11:35.759 [2024-07-15 17:23:46.827014] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:35.759 BaseBdev2 00:11:35.759 17:23:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:35.759 17:23:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:35.759 17:23:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:35.759 17:23:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:35.759 17:23:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:35.759 17:23:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:35.759 17:23:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:35.759 17:23:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:36.018 [ 
00:11:36.018 { 00:11:36.018 "name": "BaseBdev2", 00:11:36.018 "aliases": [ 00:11:36.018 "a9f83a76-0570-4dbd-9df9-104a7c714e98" 00:11:36.018 ], 00:11:36.018 "product_name": "Malloc disk", 00:11:36.018 "block_size": 512, 00:11:36.018 "num_blocks": 65536, 00:11:36.018 "uuid": "a9f83a76-0570-4dbd-9df9-104a7c714e98", 00:11:36.018 "assigned_rate_limits": { 00:11:36.018 "rw_ios_per_sec": 0, 00:11:36.018 "rw_mbytes_per_sec": 0, 00:11:36.018 "r_mbytes_per_sec": 0, 00:11:36.018 "w_mbytes_per_sec": 0 00:11:36.018 }, 00:11:36.018 "claimed": true, 00:11:36.018 "claim_type": "exclusive_write", 00:11:36.018 "zoned": false, 00:11:36.018 "supported_io_types": { 00:11:36.018 "read": true, 00:11:36.018 "write": true, 00:11:36.018 "unmap": true, 00:11:36.018 "flush": true, 00:11:36.018 "reset": true, 00:11:36.018 "nvme_admin": false, 00:11:36.018 "nvme_io": false, 00:11:36.018 "nvme_io_md": false, 00:11:36.018 "write_zeroes": true, 00:11:36.018 "zcopy": true, 00:11:36.018 "get_zone_info": false, 00:11:36.018 "zone_management": false, 00:11:36.018 "zone_append": false, 00:11:36.018 "compare": false, 00:11:36.018 "compare_and_write": false, 00:11:36.018 "abort": true, 00:11:36.018 "seek_hole": false, 00:11:36.018 "seek_data": false, 00:11:36.018 "copy": true, 00:11:36.018 "nvme_iov_md": false 00:11:36.018 }, 00:11:36.018 "memory_domains": [ 00:11:36.018 { 00:11:36.018 "dma_device_id": "system", 00:11:36.018 "dma_device_type": 1 00:11:36.018 }, 00:11:36.018 { 00:11:36.018 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:36.018 "dma_device_type": 2 00:11:36.018 } 00:11:36.018 ], 00:11:36.018 "driver_specific": {} 00:11:36.018 } 00:11:36.018 ] 00:11:36.018 17:23:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:36.018 17:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:36.018 17:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:36.018 17:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:11:36.018 17:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:36.018 17:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:36.018 17:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:36.018 17:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:36.018 17:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:36.018 17:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:36.018 17:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:36.018 17:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:36.018 17:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:36.018 17:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:36.018 17:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:36.277 17:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:11:36.277 "name": "Existed_Raid", 00:11:36.277 "uuid": "9098afbf-94d4-44b5-a49c-6b103a6b83ff", 00:11:36.277 "strip_size_kb": 0, 00:11:36.277 "state": "online", 00:11:36.277 "raid_level": "raid1", 00:11:36.277 "superblock": false, 00:11:36.277 "num_base_bdevs": 2, 00:11:36.277 "num_base_bdevs_discovered": 2, 00:11:36.277 "num_base_bdevs_operational": 2, 00:11:36.277 "base_bdevs_list": [ 00:11:36.277 { 00:11:36.277 "name": "BaseBdev1", 00:11:36.277 "uuid": "4a7cfcae-8d67-48e4-8456-9159d40367f7", 00:11:36.277 "is_configured": true, 00:11:36.277 "data_offset": 0, 00:11:36.277 "data_size": 65536 00:11:36.277 }, 00:11:36.277 { 00:11:36.277 "name": "BaseBdev2", 00:11:36.277 "uuid": "a9f83a76-0570-4dbd-9df9-104a7c714e98", 00:11:36.277 "is_configured": true, 00:11:36.277 "data_offset": 0, 00:11:36.277 "data_size": 65536 00:11:36.277 } 00:11:36.277 ] 00:11:36.277 }' 00:11:36.277 17:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:36.277 17:23:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:36.849 17:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:36.849 17:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:36.849 17:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:36.849 17:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:36.849 17:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:36.849 17:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:36.850 17:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:36.850 17:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:37.419 [2024-07-15 17:23:48.462978] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:37.420 17:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:37.420 "name": "Existed_Raid", 00:11:37.420 "aliases": [ 00:11:37.420 "9098afbf-94d4-44b5-a49c-6b103a6b83ff" 00:11:37.420 ], 00:11:37.420 "product_name": "Raid Volume", 00:11:37.420 "block_size": 512, 00:11:37.420 "num_blocks": 65536, 00:11:37.420 "uuid": "9098afbf-94d4-44b5-a49c-6b103a6b83ff", 00:11:37.420 "assigned_rate_limits": { 00:11:37.420 "rw_ios_per_sec": 0, 00:11:37.420 "rw_mbytes_per_sec": 0, 00:11:37.420 "r_mbytes_per_sec": 0, 00:11:37.420 "w_mbytes_per_sec": 0 00:11:37.420 }, 00:11:37.420 "claimed": false, 00:11:37.420 "zoned": false, 00:11:37.420 "supported_io_types": { 00:11:37.420 "read": true, 00:11:37.420 "write": true, 00:11:37.420 "unmap": false, 00:11:37.420 "flush": false, 00:11:37.420 "reset": true, 00:11:37.420 "nvme_admin": false, 00:11:37.420 "nvme_io": false, 00:11:37.420 "nvme_io_md": false, 00:11:37.420 "write_zeroes": true, 00:11:37.420 "zcopy": false, 00:11:37.420 "get_zone_info": false, 00:11:37.420 "zone_management": false, 00:11:37.420 "zone_append": false, 00:11:37.420 "compare": false, 00:11:37.420 "compare_and_write": false, 00:11:37.420 "abort": false, 00:11:37.420 "seek_hole": false, 00:11:37.420 "seek_data": false, 00:11:37.420 "copy": false, 00:11:37.420 "nvme_iov_md": false 00:11:37.420 }, 00:11:37.420 
"memory_domains": [ 00:11:37.420 { 00:11:37.420 "dma_device_id": "system", 00:11:37.420 "dma_device_type": 1 00:11:37.420 }, 00:11:37.420 { 00:11:37.420 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:37.420 "dma_device_type": 2 00:11:37.420 }, 00:11:37.420 { 00:11:37.420 "dma_device_id": "system", 00:11:37.420 "dma_device_type": 1 00:11:37.420 }, 00:11:37.420 { 00:11:37.420 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:37.420 "dma_device_type": 2 00:11:37.420 } 00:11:37.420 ], 00:11:37.420 "driver_specific": { 00:11:37.420 "raid": { 00:11:37.420 "uuid": "9098afbf-94d4-44b5-a49c-6b103a6b83ff", 00:11:37.420 "strip_size_kb": 0, 00:11:37.420 "state": "online", 00:11:37.420 "raid_level": "raid1", 00:11:37.420 "superblock": false, 00:11:37.420 "num_base_bdevs": 2, 00:11:37.420 "num_base_bdevs_discovered": 2, 00:11:37.420 "num_base_bdevs_operational": 2, 00:11:37.420 "base_bdevs_list": [ 00:11:37.420 { 00:11:37.420 "name": "BaseBdev1", 00:11:37.420 "uuid": "4a7cfcae-8d67-48e4-8456-9159d40367f7", 00:11:37.420 "is_configured": true, 00:11:37.420 "data_offset": 0, 00:11:37.420 "data_size": 65536 00:11:37.420 }, 00:11:37.420 { 00:11:37.420 "name": "BaseBdev2", 00:11:37.420 "uuid": "a9f83a76-0570-4dbd-9df9-104a7c714e98", 00:11:37.420 "is_configured": true, 00:11:37.420 "data_offset": 0, 00:11:37.420 "data_size": 65536 00:11:37.420 } 00:11:37.420 ] 00:11:37.420 } 00:11:37.420 } 00:11:37.420 }' 00:11:37.420 17:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:37.420 17:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:37.420 BaseBdev2' 00:11:37.420 17:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:37.420 17:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:37.420 17:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:37.681 17:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:37.681 "name": "BaseBdev1", 00:11:37.681 "aliases": [ 00:11:37.681 "4a7cfcae-8d67-48e4-8456-9159d40367f7" 00:11:37.681 ], 00:11:37.681 "product_name": "Malloc disk", 00:11:37.681 "block_size": 512, 00:11:37.681 "num_blocks": 65536, 00:11:37.681 "uuid": "4a7cfcae-8d67-48e4-8456-9159d40367f7", 00:11:37.681 "assigned_rate_limits": { 00:11:37.681 "rw_ios_per_sec": 0, 00:11:37.681 "rw_mbytes_per_sec": 0, 00:11:37.681 "r_mbytes_per_sec": 0, 00:11:37.681 "w_mbytes_per_sec": 0 00:11:37.681 }, 00:11:37.681 "claimed": true, 00:11:37.681 "claim_type": "exclusive_write", 00:11:37.681 "zoned": false, 00:11:37.681 "supported_io_types": { 00:11:37.681 "read": true, 00:11:37.681 "write": true, 00:11:37.681 "unmap": true, 00:11:37.681 "flush": true, 00:11:37.681 "reset": true, 00:11:37.681 "nvme_admin": false, 00:11:37.681 "nvme_io": false, 00:11:37.681 "nvme_io_md": false, 00:11:37.681 "write_zeroes": true, 00:11:37.681 "zcopy": true, 00:11:37.681 "get_zone_info": false, 00:11:37.681 "zone_management": false, 00:11:37.681 "zone_append": false, 00:11:37.681 "compare": false, 00:11:37.681 "compare_and_write": false, 00:11:37.681 "abort": true, 00:11:37.681 "seek_hole": false, 00:11:37.681 "seek_data": false, 00:11:37.681 "copy": true, 00:11:37.681 "nvme_iov_md": false 00:11:37.681 }, 00:11:37.681 
"memory_domains": [ 00:11:37.681 { 00:11:37.681 "dma_device_id": "system", 00:11:37.681 "dma_device_type": 1 00:11:37.681 }, 00:11:37.681 { 00:11:37.681 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:37.681 "dma_device_type": 2 00:11:37.681 } 00:11:37.681 ], 00:11:37.681 "driver_specific": {} 00:11:37.681 }' 00:11:37.681 17:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:37.681 17:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:37.681 17:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:37.681 17:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:37.681 17:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:37.681 17:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:37.681 17:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:37.942 17:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:37.942 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:37.942 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:37.942 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:37.942 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:37.942 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:37.942 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:37.942 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:38.202 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:38.202 "name": "BaseBdev2", 00:11:38.202 "aliases": [ 00:11:38.202 "a9f83a76-0570-4dbd-9df9-104a7c714e98" 00:11:38.202 ], 00:11:38.202 "product_name": "Malloc disk", 00:11:38.202 "block_size": 512, 00:11:38.202 "num_blocks": 65536, 00:11:38.202 "uuid": "a9f83a76-0570-4dbd-9df9-104a7c714e98", 00:11:38.202 "assigned_rate_limits": { 00:11:38.202 "rw_ios_per_sec": 0, 00:11:38.202 "rw_mbytes_per_sec": 0, 00:11:38.202 "r_mbytes_per_sec": 0, 00:11:38.202 "w_mbytes_per_sec": 0 00:11:38.202 }, 00:11:38.202 "claimed": true, 00:11:38.202 "claim_type": "exclusive_write", 00:11:38.202 "zoned": false, 00:11:38.202 "supported_io_types": { 00:11:38.202 "read": true, 00:11:38.202 "write": true, 00:11:38.202 "unmap": true, 00:11:38.202 "flush": true, 00:11:38.202 "reset": true, 00:11:38.202 "nvme_admin": false, 00:11:38.202 "nvme_io": false, 00:11:38.202 "nvme_io_md": false, 00:11:38.202 "write_zeroes": true, 00:11:38.202 "zcopy": true, 00:11:38.202 "get_zone_info": false, 00:11:38.202 "zone_management": false, 00:11:38.202 "zone_append": false, 00:11:38.202 "compare": false, 00:11:38.202 "compare_and_write": false, 00:11:38.202 "abort": true, 00:11:38.202 "seek_hole": false, 00:11:38.202 "seek_data": false, 00:11:38.202 "copy": true, 00:11:38.202 "nvme_iov_md": false 00:11:38.202 }, 00:11:38.202 "memory_domains": [ 00:11:38.202 { 00:11:38.202 "dma_device_id": "system", 00:11:38.202 "dma_device_type": 1 00:11:38.202 }, 00:11:38.202 { 00:11:38.202 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:11:38.202 "dma_device_type": 2 00:11:38.202 } 00:11:38.202 ], 00:11:38.202 "driver_specific": {} 00:11:38.202 }' 00:11:38.202 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:38.202 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:38.202 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:38.202 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:38.202 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:38.202 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:38.202 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:38.462 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:38.462 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:38.462 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:38.463 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:38.463 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:38.463 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:38.722 [2024-07-15 17:23:49.826239] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:38.722 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:38.722 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:11:38.722 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:38.722 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:11:38.722 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:11:38.722 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:11:38.722 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:38.722 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:38.722 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:38.722 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:38.722 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:38.722 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:38.722 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:38.722 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:38.722 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:38.722 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:11:38.722 17:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:38.981 17:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:38.981 "name": "Existed_Raid", 00:11:38.981 "uuid": "9098afbf-94d4-44b5-a49c-6b103a6b83ff", 00:11:38.981 "strip_size_kb": 0, 00:11:38.981 "state": "online", 00:11:38.981 "raid_level": "raid1", 00:11:38.981 "superblock": false, 00:11:38.981 "num_base_bdevs": 2, 00:11:38.981 "num_base_bdevs_discovered": 1, 00:11:38.981 "num_base_bdevs_operational": 1, 00:11:38.981 "base_bdevs_list": [ 00:11:38.981 { 00:11:38.981 "name": null, 00:11:38.981 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:38.981 "is_configured": false, 00:11:38.981 "data_offset": 0, 00:11:38.981 "data_size": 65536 00:11:38.981 }, 00:11:38.982 { 00:11:38.982 "name": "BaseBdev2", 00:11:38.982 "uuid": "a9f83a76-0570-4dbd-9df9-104a7c714e98", 00:11:38.982 "is_configured": true, 00:11:38.982 "data_offset": 0, 00:11:38.982 "data_size": 65536 00:11:38.982 } 00:11:38.982 ] 00:11:38.982 }' 00:11:38.982 17:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:38.982 17:23:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:39.552 17:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:39.552 17:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:39.552 17:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:39.552 17:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:39.552 17:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:39.552 17:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:39.552 17:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:39.813 [2024-07-15 17:23:50.933051] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:39.813 [2024-07-15 17:23:50.933115] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:39.813 [2024-07-15 17:23:50.939067] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:39.813 [2024-07-15 17:23:50.939092] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:39.813 [2024-07-15 17:23:50.939098] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ca6d90 name Existed_Raid, state offline 00:11:39.813 17:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:39.813 17:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:39.813 17:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:39.813 17:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:40.073 17:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # 
raid_bdev= 00:11:40.073 17:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:40.073 17:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:40.073 17:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2754484 00:11:40.073 17:23:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2754484 ']' 00:11:40.073 17:23:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2754484 00:11:40.073 17:23:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:11:40.073 17:23:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:40.073 17:23:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2754484 00:11:40.073 17:23:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:40.073 17:23:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:40.073 17:23:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2754484' 00:11:40.073 killing process with pid 2754484 00:11:40.073 17:23:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2754484 00:11:40.073 [2024-07-15 17:23:51.197081] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:40.073 17:23:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2754484 00:11:40.073 [2024-07-15 17:23:51.197688] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:40.073 17:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:11:40.073 00:11:40.073 real 0m9.301s 00:11:40.073 user 0m16.916s 00:11:40.073 sys 0m1.395s 00:11:40.073 17:23:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:40.073 17:23:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:40.073 ************************************ 00:11:40.073 END TEST raid_state_function_test 00:11:40.073 ************************************ 00:11:40.073 17:23:51 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:40.073 17:23:51 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 2 true 00:11:40.073 17:23:51 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:40.073 17:23:51 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:40.073 17:23:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:40.334 ************************************ 00:11:40.334 START TEST raid_state_function_test_sb 00:11:40.334 ************************************ 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2756400 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2756400' 00:11:40.334 Process raid pid: 2756400 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2756400 /var/tmp/spdk-raid.sock 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2756400 ']' 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:40.334 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
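Both the plain raid_state_function_test above and the superblock variant starting here lean on the same verify_raid_bdev_state pattern: dump all raid bdevs over RPC, pick out the one under test with jq, and compare its fields against the expected state. A sketch of the manual equivalent, using only the commands and JSON fields visible in the trace (the helper's full set of assertions lives in test/bdev/bdev_raid.sh and is not reproduced here):

  raid_bdev_info=$($SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
      | jq -r '.[] | select(.name == "Existed_Raid")')
  # compare a few of the fields shown above against the expected values,
  # e.g. once both base bdevs have been added:
  [ "$(echo "$raid_bdev_info" | jq -r '.state')" = "online" ]
  [ "$(echo "$raid_bdev_info" | jq -r '.raid_level')" = "raid1" ]
  [ "$(echo "$raid_bdev_info" | jq -r '.num_base_bdevs_discovered')" -eq 2 ]

For raid1 the state is expected to remain online even after one base bdev is deleted (has_redundancy raid1 returns 0), which is exactly what the earlier block with num_base_bdevs_discovered 1 and state "online" shows.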
00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:40.334 17:23:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:40.334 [2024-07-15 17:23:51.454729] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:11:40.334 [2024-07-15 17:23:51.454780] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:40.334 [2024-07-15 17:23:51.544567] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:40.334 [2024-07-15 17:23:51.612889] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:40.594 [2024-07-15 17:23:51.658987] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:40.594 [2024-07-15 17:23:51.659009] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:41.180 17:23:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:41.180 17:23:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:11:41.180 17:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:41.180 [2024-07-15 17:23:52.470170] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:41.180 [2024-07-15 17:23:52.470198] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:41.180 [2024-07-15 17:23:52.470204] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:41.180 [2024-07-15 17:23:52.470210] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:41.440 17:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:41.440 17:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:41.440 17:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:41.440 17:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:41.440 17:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:41.440 17:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:41.440 17:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:41.440 17:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:41.440 17:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:41.440 17:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:41.440 17:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:41.440 17:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name 
== "Existed_Raid")' 00:11:41.440 17:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:41.440 "name": "Existed_Raid", 00:11:41.440 "uuid": "f62f3f85-1db6-4773-aebc-5e1820f8b296", 00:11:41.440 "strip_size_kb": 0, 00:11:41.440 "state": "configuring", 00:11:41.440 "raid_level": "raid1", 00:11:41.440 "superblock": true, 00:11:41.440 "num_base_bdevs": 2, 00:11:41.440 "num_base_bdevs_discovered": 0, 00:11:41.440 "num_base_bdevs_operational": 2, 00:11:41.440 "base_bdevs_list": [ 00:11:41.440 { 00:11:41.440 "name": "BaseBdev1", 00:11:41.440 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:41.440 "is_configured": false, 00:11:41.440 "data_offset": 0, 00:11:41.440 "data_size": 0 00:11:41.440 }, 00:11:41.440 { 00:11:41.440 "name": "BaseBdev2", 00:11:41.440 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:41.440 "is_configured": false, 00:11:41.440 "data_offset": 0, 00:11:41.440 "data_size": 0 00:11:41.440 } 00:11:41.440 ] 00:11:41.440 }' 00:11:41.440 17:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:41.440 17:23:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:42.047 17:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:42.306 [2024-07-15 17:23:53.396387] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:42.306 [2024-07-15 17:23:53.396403] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdff6b0 name Existed_Raid, state configuring 00:11:42.306 17:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:42.306 [2024-07-15 17:23:53.588895] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:42.306 [2024-07-15 17:23:53.588913] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:42.306 [2024-07-15 17:23:53.588922] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:42.306 [2024-07-15 17:23:53.588927] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:42.307 17:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:42.567 [2024-07-15 17:23:53.772050] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:42.567 BaseBdev1 00:11:42.567 17:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:42.567 17:23:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:42.567 17:23:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:42.567 17:23:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:42.567 17:23:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:42.567 17:23:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:42.567 17:23:53 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:42.827 17:23:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:43.086 [ 00:11:43.086 { 00:11:43.086 "name": "BaseBdev1", 00:11:43.086 "aliases": [ 00:11:43.086 "f8be4f7b-e0f3-442e-978d-657f0f23c72c" 00:11:43.086 ], 00:11:43.086 "product_name": "Malloc disk", 00:11:43.086 "block_size": 512, 00:11:43.086 "num_blocks": 65536, 00:11:43.086 "uuid": "f8be4f7b-e0f3-442e-978d-657f0f23c72c", 00:11:43.086 "assigned_rate_limits": { 00:11:43.086 "rw_ios_per_sec": 0, 00:11:43.086 "rw_mbytes_per_sec": 0, 00:11:43.086 "r_mbytes_per_sec": 0, 00:11:43.086 "w_mbytes_per_sec": 0 00:11:43.086 }, 00:11:43.086 "claimed": true, 00:11:43.086 "claim_type": "exclusive_write", 00:11:43.087 "zoned": false, 00:11:43.087 "supported_io_types": { 00:11:43.087 "read": true, 00:11:43.087 "write": true, 00:11:43.087 "unmap": true, 00:11:43.087 "flush": true, 00:11:43.087 "reset": true, 00:11:43.087 "nvme_admin": false, 00:11:43.087 "nvme_io": false, 00:11:43.087 "nvme_io_md": false, 00:11:43.087 "write_zeroes": true, 00:11:43.087 "zcopy": true, 00:11:43.087 "get_zone_info": false, 00:11:43.087 "zone_management": false, 00:11:43.087 "zone_append": false, 00:11:43.087 "compare": false, 00:11:43.087 "compare_and_write": false, 00:11:43.087 "abort": true, 00:11:43.087 "seek_hole": false, 00:11:43.087 "seek_data": false, 00:11:43.087 "copy": true, 00:11:43.087 "nvme_iov_md": false 00:11:43.087 }, 00:11:43.087 "memory_domains": [ 00:11:43.087 { 00:11:43.087 "dma_device_id": "system", 00:11:43.087 "dma_device_type": 1 00:11:43.087 }, 00:11:43.087 { 00:11:43.087 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:43.087 "dma_device_type": 2 00:11:43.087 } 00:11:43.087 ], 00:11:43.087 "driver_specific": {} 00:11:43.087 } 00:11:43.087 ] 00:11:43.087 17:23:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:43.087 17:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:43.087 17:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:43.087 17:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:43.087 17:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:43.087 17:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:43.087 17:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:43.087 17:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:43.087 17:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:43.087 17:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:43.087 17:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:43.087 17:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:11:43.087 17:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:43.087 17:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:43.087 "name": "Existed_Raid", 00:11:43.087 "uuid": "552b8f81-f55f-430f-ab7a-e774176be8c3", 00:11:43.087 "strip_size_kb": 0, 00:11:43.087 "state": "configuring", 00:11:43.087 "raid_level": "raid1", 00:11:43.087 "superblock": true, 00:11:43.087 "num_base_bdevs": 2, 00:11:43.087 "num_base_bdevs_discovered": 1, 00:11:43.087 "num_base_bdevs_operational": 2, 00:11:43.087 "base_bdevs_list": [ 00:11:43.087 { 00:11:43.087 "name": "BaseBdev1", 00:11:43.087 "uuid": "f8be4f7b-e0f3-442e-978d-657f0f23c72c", 00:11:43.087 "is_configured": true, 00:11:43.087 "data_offset": 2048, 00:11:43.087 "data_size": 63488 00:11:43.087 }, 00:11:43.087 { 00:11:43.087 "name": "BaseBdev2", 00:11:43.087 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:43.087 "is_configured": false, 00:11:43.087 "data_offset": 0, 00:11:43.087 "data_size": 0 00:11:43.087 } 00:11:43.087 ] 00:11:43.087 }' 00:11:43.087 17:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:43.087 17:23:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:43.658 17:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:43.918 [2024-07-15 17:23:55.067310] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:43.918 [2024-07-15 17:23:55.067336] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdfefa0 name Existed_Raid, state configuring 00:11:43.918 17:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:44.178 [2024-07-15 17:23:55.259824] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:44.178 [2024-07-15 17:23:55.260947] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:44.178 [2024-07-15 17:23:55.260972] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:44.178 17:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:44.178 17:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:44.178 17:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:44.178 17:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:44.178 17:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:44.178 17:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:44.178 17:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:44.178 17:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:44.178 17:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:44.178 
17:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:44.178 17:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:44.178 17:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:44.178 17:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:44.178 17:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:44.178 17:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:44.178 "name": "Existed_Raid", 00:11:44.178 "uuid": "952f3d63-1ba0-4dd0-a48e-93f92069825e", 00:11:44.178 "strip_size_kb": 0, 00:11:44.178 "state": "configuring", 00:11:44.178 "raid_level": "raid1", 00:11:44.178 "superblock": true, 00:11:44.178 "num_base_bdevs": 2, 00:11:44.178 "num_base_bdevs_discovered": 1, 00:11:44.178 "num_base_bdevs_operational": 2, 00:11:44.178 "base_bdevs_list": [ 00:11:44.178 { 00:11:44.178 "name": "BaseBdev1", 00:11:44.178 "uuid": "f8be4f7b-e0f3-442e-978d-657f0f23c72c", 00:11:44.178 "is_configured": true, 00:11:44.178 "data_offset": 2048, 00:11:44.178 "data_size": 63488 00:11:44.178 }, 00:11:44.178 { 00:11:44.178 "name": "BaseBdev2", 00:11:44.178 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:44.178 "is_configured": false, 00:11:44.178 "data_offset": 0, 00:11:44.178 "data_size": 0 00:11:44.178 } 00:11:44.178 ] 00:11:44.178 }' 00:11:44.178 17:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:44.179 17:23:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:44.748 17:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:45.008 [2024-07-15 17:23:56.155036] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:45.008 [2024-07-15 17:23:56.155145] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xdffd90 00:11:45.008 [2024-07-15 17:23:56.155154] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:45.008 [2024-07-15 17:23:56.155292] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfb38d0 00:11:45.008 [2024-07-15 17:23:56.155383] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xdffd90 00:11:45.008 [2024-07-15 17:23:56.155389] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xdffd90 00:11:45.008 [2024-07-15 17:23:56.155456] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:45.008 BaseBdev2 00:11:45.008 17:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:45.008 17:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:45.008 17:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:45.008 17:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:45.008 17:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 
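With both base bdevs present the array has just been configured, and the state check that follows boils down to the RPC plus jq filter already visible in the trace. A condensed sketch of that check; the expected values are the ones the trace prints a few records further on:

$ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
    | jq -r '.[] | select(.name == "Existed_Raid")'
$ # expect "state": "online", "raid_level": "raid1", "num_base_bdevs_discovered": 2

The same query is reused after BaseBdev1 is deleted later in this test, where a raid1 array is expected to stay online with a single operational base bdev.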
00:11:45.008 17:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:45.008 17:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:45.268 17:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:45.268 [ 00:11:45.268 { 00:11:45.268 "name": "BaseBdev2", 00:11:45.268 "aliases": [ 00:11:45.268 "530fb089-715a-485a-9d69-f2830f758f78" 00:11:45.268 ], 00:11:45.268 "product_name": "Malloc disk", 00:11:45.268 "block_size": 512, 00:11:45.268 "num_blocks": 65536, 00:11:45.268 "uuid": "530fb089-715a-485a-9d69-f2830f758f78", 00:11:45.268 "assigned_rate_limits": { 00:11:45.268 "rw_ios_per_sec": 0, 00:11:45.268 "rw_mbytes_per_sec": 0, 00:11:45.268 "r_mbytes_per_sec": 0, 00:11:45.268 "w_mbytes_per_sec": 0 00:11:45.268 }, 00:11:45.268 "claimed": true, 00:11:45.268 "claim_type": "exclusive_write", 00:11:45.268 "zoned": false, 00:11:45.268 "supported_io_types": { 00:11:45.268 "read": true, 00:11:45.268 "write": true, 00:11:45.268 "unmap": true, 00:11:45.268 "flush": true, 00:11:45.268 "reset": true, 00:11:45.268 "nvme_admin": false, 00:11:45.268 "nvme_io": false, 00:11:45.268 "nvme_io_md": false, 00:11:45.268 "write_zeroes": true, 00:11:45.268 "zcopy": true, 00:11:45.268 "get_zone_info": false, 00:11:45.268 "zone_management": false, 00:11:45.268 "zone_append": false, 00:11:45.268 "compare": false, 00:11:45.268 "compare_and_write": false, 00:11:45.268 "abort": true, 00:11:45.268 "seek_hole": false, 00:11:45.268 "seek_data": false, 00:11:45.268 "copy": true, 00:11:45.268 "nvme_iov_md": false 00:11:45.268 }, 00:11:45.268 "memory_domains": [ 00:11:45.268 { 00:11:45.268 "dma_device_id": "system", 00:11:45.268 "dma_device_type": 1 00:11:45.268 }, 00:11:45.268 { 00:11:45.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:45.268 "dma_device_type": 2 00:11:45.268 } 00:11:45.268 ], 00:11:45.268 "driver_specific": {} 00:11:45.268 } 00:11:45.268 ] 00:11:45.268 17:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:45.268 17:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:45.268 17:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:45.268 17:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:11:45.268 17:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:45.268 17:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:45.268 17:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:45.268 17:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:45.268 17:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:45.268 17:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:45.268 17:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:45.268 17:23:56 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:45.268 17:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:45.268 17:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:45.268 17:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:45.528 17:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:45.528 "name": "Existed_Raid", 00:11:45.528 "uuid": "952f3d63-1ba0-4dd0-a48e-93f92069825e", 00:11:45.528 "strip_size_kb": 0, 00:11:45.528 "state": "online", 00:11:45.528 "raid_level": "raid1", 00:11:45.528 "superblock": true, 00:11:45.528 "num_base_bdevs": 2, 00:11:45.528 "num_base_bdevs_discovered": 2, 00:11:45.528 "num_base_bdevs_operational": 2, 00:11:45.528 "base_bdevs_list": [ 00:11:45.528 { 00:11:45.528 "name": "BaseBdev1", 00:11:45.528 "uuid": "f8be4f7b-e0f3-442e-978d-657f0f23c72c", 00:11:45.528 "is_configured": true, 00:11:45.528 "data_offset": 2048, 00:11:45.528 "data_size": 63488 00:11:45.529 }, 00:11:45.529 { 00:11:45.529 "name": "BaseBdev2", 00:11:45.529 "uuid": "530fb089-715a-485a-9d69-f2830f758f78", 00:11:45.529 "is_configured": true, 00:11:45.529 "data_offset": 2048, 00:11:45.529 "data_size": 63488 00:11:45.529 } 00:11:45.529 ] 00:11:45.529 }' 00:11:45.529 17:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:45.529 17:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:46.100 17:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:46.100 17:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:46.100 17:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:46.100 17:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:46.100 17:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:46.100 17:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:46.100 17:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:46.100 17:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:46.361 [2024-07-15 17:23:57.446518] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:46.361 17:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:46.361 "name": "Existed_Raid", 00:11:46.361 "aliases": [ 00:11:46.361 "952f3d63-1ba0-4dd0-a48e-93f92069825e" 00:11:46.361 ], 00:11:46.361 "product_name": "Raid Volume", 00:11:46.361 "block_size": 512, 00:11:46.361 "num_blocks": 63488, 00:11:46.361 "uuid": "952f3d63-1ba0-4dd0-a48e-93f92069825e", 00:11:46.361 "assigned_rate_limits": { 00:11:46.361 "rw_ios_per_sec": 0, 00:11:46.361 "rw_mbytes_per_sec": 0, 00:11:46.361 "r_mbytes_per_sec": 0, 00:11:46.361 "w_mbytes_per_sec": 0 00:11:46.361 }, 00:11:46.361 "claimed": false, 00:11:46.361 "zoned": false, 00:11:46.361 "supported_io_types": { 00:11:46.361 "read": true, 
00:11:46.361 "write": true, 00:11:46.361 "unmap": false, 00:11:46.361 "flush": false, 00:11:46.361 "reset": true, 00:11:46.361 "nvme_admin": false, 00:11:46.361 "nvme_io": false, 00:11:46.361 "nvme_io_md": false, 00:11:46.361 "write_zeroes": true, 00:11:46.361 "zcopy": false, 00:11:46.361 "get_zone_info": false, 00:11:46.361 "zone_management": false, 00:11:46.361 "zone_append": false, 00:11:46.361 "compare": false, 00:11:46.361 "compare_and_write": false, 00:11:46.361 "abort": false, 00:11:46.361 "seek_hole": false, 00:11:46.361 "seek_data": false, 00:11:46.361 "copy": false, 00:11:46.361 "nvme_iov_md": false 00:11:46.361 }, 00:11:46.361 "memory_domains": [ 00:11:46.361 { 00:11:46.361 "dma_device_id": "system", 00:11:46.361 "dma_device_type": 1 00:11:46.361 }, 00:11:46.361 { 00:11:46.361 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:46.361 "dma_device_type": 2 00:11:46.361 }, 00:11:46.361 { 00:11:46.361 "dma_device_id": "system", 00:11:46.361 "dma_device_type": 1 00:11:46.361 }, 00:11:46.361 { 00:11:46.361 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:46.361 "dma_device_type": 2 00:11:46.361 } 00:11:46.361 ], 00:11:46.361 "driver_specific": { 00:11:46.361 "raid": { 00:11:46.361 "uuid": "952f3d63-1ba0-4dd0-a48e-93f92069825e", 00:11:46.361 "strip_size_kb": 0, 00:11:46.361 "state": "online", 00:11:46.361 "raid_level": "raid1", 00:11:46.361 "superblock": true, 00:11:46.361 "num_base_bdevs": 2, 00:11:46.361 "num_base_bdevs_discovered": 2, 00:11:46.361 "num_base_bdevs_operational": 2, 00:11:46.361 "base_bdevs_list": [ 00:11:46.361 { 00:11:46.361 "name": "BaseBdev1", 00:11:46.361 "uuid": "f8be4f7b-e0f3-442e-978d-657f0f23c72c", 00:11:46.361 "is_configured": true, 00:11:46.361 "data_offset": 2048, 00:11:46.361 "data_size": 63488 00:11:46.361 }, 00:11:46.361 { 00:11:46.361 "name": "BaseBdev2", 00:11:46.361 "uuid": "530fb089-715a-485a-9d69-f2830f758f78", 00:11:46.361 "is_configured": true, 00:11:46.361 "data_offset": 2048, 00:11:46.361 "data_size": 63488 00:11:46.361 } 00:11:46.361 ] 00:11:46.361 } 00:11:46.361 } 00:11:46.361 }' 00:11:46.361 17:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:46.361 17:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:46.361 BaseBdev2' 00:11:46.361 17:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:46.361 17:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:46.361 17:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:46.622 17:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:46.622 "name": "BaseBdev1", 00:11:46.622 "aliases": [ 00:11:46.622 "f8be4f7b-e0f3-442e-978d-657f0f23c72c" 00:11:46.622 ], 00:11:46.622 "product_name": "Malloc disk", 00:11:46.622 "block_size": 512, 00:11:46.622 "num_blocks": 65536, 00:11:46.622 "uuid": "f8be4f7b-e0f3-442e-978d-657f0f23c72c", 00:11:46.622 "assigned_rate_limits": { 00:11:46.622 "rw_ios_per_sec": 0, 00:11:46.622 "rw_mbytes_per_sec": 0, 00:11:46.622 "r_mbytes_per_sec": 0, 00:11:46.622 "w_mbytes_per_sec": 0 00:11:46.622 }, 00:11:46.622 "claimed": true, 00:11:46.622 "claim_type": "exclusive_write", 00:11:46.622 "zoned": false, 00:11:46.622 "supported_io_types": { 
00:11:46.622 "read": true, 00:11:46.622 "write": true, 00:11:46.622 "unmap": true, 00:11:46.622 "flush": true, 00:11:46.622 "reset": true, 00:11:46.622 "nvme_admin": false, 00:11:46.622 "nvme_io": false, 00:11:46.622 "nvme_io_md": false, 00:11:46.622 "write_zeroes": true, 00:11:46.622 "zcopy": true, 00:11:46.622 "get_zone_info": false, 00:11:46.622 "zone_management": false, 00:11:46.622 "zone_append": false, 00:11:46.622 "compare": false, 00:11:46.622 "compare_and_write": false, 00:11:46.622 "abort": true, 00:11:46.622 "seek_hole": false, 00:11:46.622 "seek_data": false, 00:11:46.622 "copy": true, 00:11:46.622 "nvme_iov_md": false 00:11:46.622 }, 00:11:46.622 "memory_domains": [ 00:11:46.622 { 00:11:46.622 "dma_device_id": "system", 00:11:46.622 "dma_device_type": 1 00:11:46.622 }, 00:11:46.622 { 00:11:46.622 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:46.622 "dma_device_type": 2 00:11:46.622 } 00:11:46.622 ], 00:11:46.622 "driver_specific": {} 00:11:46.622 }' 00:11:46.622 17:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:46.622 17:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:46.622 17:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:46.622 17:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:46.622 17:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:46.622 17:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:46.622 17:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:46.882 17:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:46.882 17:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:46.882 17:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:46.882 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:46.882 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:46.882 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:46.882 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:46.882 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:47.142 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:47.142 "name": "BaseBdev2", 00:11:47.142 "aliases": [ 00:11:47.142 "530fb089-715a-485a-9d69-f2830f758f78" 00:11:47.142 ], 00:11:47.142 "product_name": "Malloc disk", 00:11:47.142 "block_size": 512, 00:11:47.142 "num_blocks": 65536, 00:11:47.142 "uuid": "530fb089-715a-485a-9d69-f2830f758f78", 00:11:47.142 "assigned_rate_limits": { 00:11:47.142 "rw_ios_per_sec": 0, 00:11:47.142 "rw_mbytes_per_sec": 0, 00:11:47.142 "r_mbytes_per_sec": 0, 00:11:47.142 "w_mbytes_per_sec": 0 00:11:47.142 }, 00:11:47.142 "claimed": true, 00:11:47.142 "claim_type": "exclusive_write", 00:11:47.142 "zoned": false, 00:11:47.142 "supported_io_types": { 00:11:47.142 "read": true, 00:11:47.142 "write": true, 00:11:47.142 "unmap": true, 00:11:47.142 "flush": true, 00:11:47.142 "reset": 
true, 00:11:47.142 "nvme_admin": false, 00:11:47.142 "nvme_io": false, 00:11:47.142 "nvme_io_md": false, 00:11:47.142 "write_zeroes": true, 00:11:47.142 "zcopy": true, 00:11:47.142 "get_zone_info": false, 00:11:47.142 "zone_management": false, 00:11:47.142 "zone_append": false, 00:11:47.142 "compare": false, 00:11:47.142 "compare_and_write": false, 00:11:47.142 "abort": true, 00:11:47.142 "seek_hole": false, 00:11:47.142 "seek_data": false, 00:11:47.142 "copy": true, 00:11:47.142 "nvme_iov_md": false 00:11:47.142 }, 00:11:47.142 "memory_domains": [ 00:11:47.142 { 00:11:47.142 "dma_device_id": "system", 00:11:47.142 "dma_device_type": 1 00:11:47.142 }, 00:11:47.142 { 00:11:47.142 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:47.142 "dma_device_type": 2 00:11:47.142 } 00:11:47.142 ], 00:11:47.142 "driver_specific": {} 00:11:47.142 }' 00:11:47.142 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:47.142 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:47.142 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:47.142 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:47.142 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:47.142 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:47.142 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:47.142 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:47.142 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:47.142 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:47.402 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:47.402 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:47.402 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:47.402 [2024-07-15 17:23:58.677457] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:47.402 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:47.402 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:11:47.402 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:47.402 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:11:47.402 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:11:47.402 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:11:47.402 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:47.402 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:47.402 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:47.402 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:11:47.402 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:47.402 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:47.402 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:47.403 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:47.403 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:47.403 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:47.403 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:47.663 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:47.663 "name": "Existed_Raid", 00:11:47.663 "uuid": "952f3d63-1ba0-4dd0-a48e-93f92069825e", 00:11:47.663 "strip_size_kb": 0, 00:11:47.663 "state": "online", 00:11:47.663 "raid_level": "raid1", 00:11:47.663 "superblock": true, 00:11:47.663 "num_base_bdevs": 2, 00:11:47.663 "num_base_bdevs_discovered": 1, 00:11:47.663 "num_base_bdevs_operational": 1, 00:11:47.663 "base_bdevs_list": [ 00:11:47.663 { 00:11:47.663 "name": null, 00:11:47.663 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:47.663 "is_configured": false, 00:11:47.663 "data_offset": 2048, 00:11:47.663 "data_size": 63488 00:11:47.663 }, 00:11:47.663 { 00:11:47.663 "name": "BaseBdev2", 00:11:47.663 "uuid": "530fb089-715a-485a-9d69-f2830f758f78", 00:11:47.663 "is_configured": true, 00:11:47.663 "data_offset": 2048, 00:11:47.663 "data_size": 63488 00:11:47.663 } 00:11:47.663 ] 00:11:47.663 }' 00:11:47.663 17:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:47.663 17:23:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:48.234 17:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:48.234 17:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:48.234 17:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:48.234 17:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:48.495 17:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:48.495 17:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:48.495 17:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:48.756 [2024-07-15 17:23:59.808320] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:48.756 [2024-07-15 17:23:59.808384] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:48.756 [2024-07-15 17:23:59.814440] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:48.756 [2024-07-15 17:23:59.814463] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: 
*DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:48.756 [2024-07-15 17:23:59.814469] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdffd90 name Existed_Raid, state offline 00:11:48.756 17:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:48.756 17:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:48.756 17:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:48.756 17:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:48.756 17:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:48.756 17:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:48.756 17:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:48.756 17:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2756400 00:11:48.756 17:24:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2756400 ']' 00:11:48.756 17:24:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2756400 00:11:48.756 17:24:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:11:48.756 17:24:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:48.756 17:24:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2756400 00:11:49.017 17:24:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:49.017 17:24:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:49.017 17:24:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2756400' 00:11:49.017 killing process with pid 2756400 00:11:49.017 17:24:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2756400 00:11:49.017 [2024-07-15 17:24:00.071120] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:49.017 17:24:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2756400 00:11:49.017 [2024-07-15 17:24:00.071744] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:49.017 17:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:11:49.017 00:11:49.017 real 0m8.798s 00:11:49.017 user 0m15.978s 00:11:49.017 sys 0m1.332s 00:11:49.017 17:24:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:49.017 17:24:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:49.017 ************************************ 00:11:49.017 END TEST raid_state_function_test_sb 00:11:49.017 ************************************ 00:11:49.017 17:24:00 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:49.017 17:24:00 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:11:49.017 17:24:00 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:11:49.017 17:24:00 bdev_raid -- common/autotest_common.sh@1105 -- # 
xtrace_disable 00:11:49.017 17:24:00 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:49.017 ************************************ 00:11:49.017 START TEST raid_superblock_test 00:11:49.017 ************************************ 00:11:49.017 17:24:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:11:49.017 17:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:11:49.017 17:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:11:49.017 17:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:11:49.017 17:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:11:49.017 17:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:11:49.017 17:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:11:49.017 17:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:11:49.017 17:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:11:49.017 17:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:11:49.017 17:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:11:49.018 17:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:11:49.018 17:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:11:49.018 17:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:11:49.018 17:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:11:49.018 17:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:11:49.018 17:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2758019 00:11:49.018 17:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2758019 /var/tmp/spdk-raid.sock 00:11:49.018 17:24:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2758019 ']' 00:11:49.018 17:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:49.018 17:24:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:49.018 17:24:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:49.018 17:24:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:49.018 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:49.018 17:24:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:49.018 17:24:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:49.278 [2024-07-15 17:24:00.319133] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
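raid_superblock_test drives the same RPC socket path, building two passthru bdevs on top of malloc devices and then creating raid_bdev1 with a superblock. A condensed sketch of that setup, using the sizes, names and UUIDs that appear in the trace (the rpc shell function is only shorthand introduced for this sketch):

$ rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
$ rpc bdev_malloc_create 32 512 -b malloc1   # 32 MiB backing device, 512-byte blocks (65536 blocks in the trace)
$ rpc bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
$ rpc bdev_malloc_create 32 512 -b malloc2
$ rpc bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
$ rpc bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s   # -s requests an on-disk superblock

After the create call the trace below shows raid_bdev1 going online immediately, since both base bdevs already exist when the array is configured.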
00:11:49.278 [2024-07-15 17:24:00.319181] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2758019 ] 00:11:49.278 [2024-07-15 17:24:00.408149] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:49.278 [2024-07-15 17:24:00.473879] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:49.278 [2024-07-15 17:24:00.519708] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:49.278 [2024-07-15 17:24:00.519734] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:50.219 17:24:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:50.219 17:24:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:11:50.219 17:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:11:50.219 17:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:50.219 17:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:11:50.219 17:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:11:50.219 17:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:50.219 17:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:50.219 17:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:50.219 17:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:50.219 17:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:50.219 malloc1 00:11:50.219 17:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:50.479 [2024-07-15 17:24:01.542458] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:50.479 [2024-07-15 17:24:01.542493] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:50.479 [2024-07-15 17:24:01.542506] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1473a20 00:11:50.479 [2024-07-15 17:24:01.542513] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:50.479 [2024-07-15 17:24:01.543836] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:50.479 [2024-07-15 17:24:01.543855] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:50.479 pt1 00:11:50.479 17:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:50.479 17:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:50.479 17:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:11:50.479 17:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:11:50.479 17:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:50.479 17:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:50.479 17:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:50.479 17:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:50.479 17:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:50.479 malloc2 00:11:50.479 17:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:50.738 [2024-07-15 17:24:01.925549] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:50.738 [2024-07-15 17:24:01.925582] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:50.738 [2024-07-15 17:24:01.925595] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1474040 00:11:50.738 [2024-07-15 17:24:01.925601] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:50.738 [2024-07-15 17:24:01.926820] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:50.738 [2024-07-15 17:24:01.926838] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:50.738 pt2 00:11:50.738 17:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:50.738 17:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:50.738 17:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:11:50.998 [2024-07-15 17:24:02.114039] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:50.998 [2024-07-15 17:24:02.115056] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:50.998 [2024-07-15 17:24:02.115165] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16203d0 00:11:50.998 [2024-07-15 17:24:02.115173] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:50.998 [2024-07-15 17:24:02.115322] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x148a910 00:11:50.998 [2024-07-15 17:24:02.115434] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16203d0 00:11:50.998 [2024-07-15 17:24:02.115439] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16203d0 00:11:50.998 [2024-07-15 17:24:02.115511] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:50.999 17:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:11:50.999 17:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:50.999 17:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:50.999 17:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:50.999 17:24:02 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:50.999 17:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:50.999 17:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:50.999 17:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:50.999 17:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:50.999 17:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:50.999 17:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:50.999 17:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:51.258 17:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:51.258 "name": "raid_bdev1", 00:11:51.258 "uuid": "773bdf98-240e-436c-bd47-592fae02c20c", 00:11:51.258 "strip_size_kb": 0, 00:11:51.258 "state": "online", 00:11:51.258 "raid_level": "raid1", 00:11:51.258 "superblock": true, 00:11:51.258 "num_base_bdevs": 2, 00:11:51.258 "num_base_bdevs_discovered": 2, 00:11:51.258 "num_base_bdevs_operational": 2, 00:11:51.258 "base_bdevs_list": [ 00:11:51.258 { 00:11:51.258 "name": "pt1", 00:11:51.258 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:51.258 "is_configured": true, 00:11:51.258 "data_offset": 2048, 00:11:51.258 "data_size": 63488 00:11:51.258 }, 00:11:51.258 { 00:11:51.258 "name": "pt2", 00:11:51.258 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:51.258 "is_configured": true, 00:11:51.258 "data_offset": 2048, 00:11:51.258 "data_size": 63488 00:11:51.258 } 00:11:51.258 ] 00:11:51.258 }' 00:11:51.258 17:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:51.258 17:24:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:51.829 17:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:11:51.829 17:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:51.829 17:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:51.829 17:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:51.829 17:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:51.829 17:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:51.829 17:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:51.829 17:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:51.829 [2024-07-15 17:24:03.040561] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:51.829 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:51.829 "name": "raid_bdev1", 00:11:51.829 "aliases": [ 00:11:51.829 "773bdf98-240e-436c-bd47-592fae02c20c" 00:11:51.829 ], 00:11:51.829 "product_name": "Raid Volume", 00:11:51.829 "block_size": 512, 00:11:51.829 "num_blocks": 63488, 00:11:51.829 "uuid": 
"773bdf98-240e-436c-bd47-592fae02c20c", 00:11:51.829 "assigned_rate_limits": { 00:11:51.829 "rw_ios_per_sec": 0, 00:11:51.829 "rw_mbytes_per_sec": 0, 00:11:51.829 "r_mbytes_per_sec": 0, 00:11:51.829 "w_mbytes_per_sec": 0 00:11:51.829 }, 00:11:51.829 "claimed": false, 00:11:51.829 "zoned": false, 00:11:51.829 "supported_io_types": { 00:11:51.829 "read": true, 00:11:51.829 "write": true, 00:11:51.829 "unmap": false, 00:11:51.829 "flush": false, 00:11:51.829 "reset": true, 00:11:51.829 "nvme_admin": false, 00:11:51.829 "nvme_io": false, 00:11:51.829 "nvme_io_md": false, 00:11:51.829 "write_zeroes": true, 00:11:51.829 "zcopy": false, 00:11:51.829 "get_zone_info": false, 00:11:51.829 "zone_management": false, 00:11:51.829 "zone_append": false, 00:11:51.829 "compare": false, 00:11:51.829 "compare_and_write": false, 00:11:51.829 "abort": false, 00:11:51.829 "seek_hole": false, 00:11:51.829 "seek_data": false, 00:11:51.829 "copy": false, 00:11:51.829 "nvme_iov_md": false 00:11:51.829 }, 00:11:51.829 "memory_domains": [ 00:11:51.829 { 00:11:51.829 "dma_device_id": "system", 00:11:51.829 "dma_device_type": 1 00:11:51.829 }, 00:11:51.829 { 00:11:51.829 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:51.829 "dma_device_type": 2 00:11:51.829 }, 00:11:51.829 { 00:11:51.829 "dma_device_id": "system", 00:11:51.829 "dma_device_type": 1 00:11:51.829 }, 00:11:51.829 { 00:11:51.829 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:51.829 "dma_device_type": 2 00:11:51.829 } 00:11:51.829 ], 00:11:51.829 "driver_specific": { 00:11:51.829 "raid": { 00:11:51.829 "uuid": "773bdf98-240e-436c-bd47-592fae02c20c", 00:11:51.829 "strip_size_kb": 0, 00:11:51.829 "state": "online", 00:11:51.829 "raid_level": "raid1", 00:11:51.829 "superblock": true, 00:11:51.829 "num_base_bdevs": 2, 00:11:51.829 "num_base_bdevs_discovered": 2, 00:11:51.829 "num_base_bdevs_operational": 2, 00:11:51.829 "base_bdevs_list": [ 00:11:51.829 { 00:11:51.829 "name": "pt1", 00:11:51.829 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:51.829 "is_configured": true, 00:11:51.829 "data_offset": 2048, 00:11:51.829 "data_size": 63488 00:11:51.829 }, 00:11:51.829 { 00:11:51.829 "name": "pt2", 00:11:51.829 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:51.829 "is_configured": true, 00:11:51.829 "data_offset": 2048, 00:11:51.829 "data_size": 63488 00:11:51.829 } 00:11:51.829 ] 00:11:51.829 } 00:11:51.829 } 00:11:51.829 }' 00:11:51.829 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:51.829 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:51.829 pt2' 00:11:51.829 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:51.829 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:51.829 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:52.089 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:52.089 "name": "pt1", 00:11:52.089 "aliases": [ 00:11:52.089 "00000000-0000-0000-0000-000000000001" 00:11:52.089 ], 00:11:52.089 "product_name": "passthru", 00:11:52.089 "block_size": 512, 00:11:52.089 "num_blocks": 65536, 00:11:52.089 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:52.089 "assigned_rate_limits": { 00:11:52.089 
"rw_ios_per_sec": 0, 00:11:52.089 "rw_mbytes_per_sec": 0, 00:11:52.089 "r_mbytes_per_sec": 0, 00:11:52.089 "w_mbytes_per_sec": 0 00:11:52.089 }, 00:11:52.089 "claimed": true, 00:11:52.089 "claim_type": "exclusive_write", 00:11:52.089 "zoned": false, 00:11:52.089 "supported_io_types": { 00:11:52.089 "read": true, 00:11:52.089 "write": true, 00:11:52.089 "unmap": true, 00:11:52.089 "flush": true, 00:11:52.089 "reset": true, 00:11:52.089 "nvme_admin": false, 00:11:52.089 "nvme_io": false, 00:11:52.089 "nvme_io_md": false, 00:11:52.089 "write_zeroes": true, 00:11:52.089 "zcopy": true, 00:11:52.089 "get_zone_info": false, 00:11:52.089 "zone_management": false, 00:11:52.089 "zone_append": false, 00:11:52.089 "compare": false, 00:11:52.089 "compare_and_write": false, 00:11:52.089 "abort": true, 00:11:52.089 "seek_hole": false, 00:11:52.089 "seek_data": false, 00:11:52.089 "copy": true, 00:11:52.089 "nvme_iov_md": false 00:11:52.089 }, 00:11:52.089 "memory_domains": [ 00:11:52.089 { 00:11:52.089 "dma_device_id": "system", 00:11:52.089 "dma_device_type": 1 00:11:52.089 }, 00:11:52.089 { 00:11:52.089 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:52.089 "dma_device_type": 2 00:11:52.089 } 00:11:52.089 ], 00:11:52.089 "driver_specific": { 00:11:52.089 "passthru": { 00:11:52.089 "name": "pt1", 00:11:52.089 "base_bdev_name": "malloc1" 00:11:52.089 } 00:11:52.089 } 00:11:52.089 }' 00:11:52.089 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:52.089 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:52.348 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:52.348 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:52.348 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:52.348 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:52.348 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:52.348 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:52.348 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:52.348 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:52.348 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:52.348 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:52.348 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:52.348 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:52.348 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:52.608 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:52.608 "name": "pt2", 00:11:52.608 "aliases": [ 00:11:52.608 "00000000-0000-0000-0000-000000000002" 00:11:52.608 ], 00:11:52.608 "product_name": "passthru", 00:11:52.608 "block_size": 512, 00:11:52.608 "num_blocks": 65536, 00:11:52.608 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:52.608 "assigned_rate_limits": { 00:11:52.608 "rw_ios_per_sec": 0, 00:11:52.608 "rw_mbytes_per_sec": 0, 00:11:52.608 "r_mbytes_per_sec": 0, 00:11:52.608 "w_mbytes_per_sec": 0 
00:11:52.608 }, 00:11:52.608 "claimed": true, 00:11:52.608 "claim_type": "exclusive_write", 00:11:52.608 "zoned": false, 00:11:52.608 "supported_io_types": { 00:11:52.608 "read": true, 00:11:52.608 "write": true, 00:11:52.608 "unmap": true, 00:11:52.608 "flush": true, 00:11:52.608 "reset": true, 00:11:52.608 "nvme_admin": false, 00:11:52.608 "nvme_io": false, 00:11:52.608 "nvme_io_md": false, 00:11:52.608 "write_zeroes": true, 00:11:52.608 "zcopy": true, 00:11:52.608 "get_zone_info": false, 00:11:52.608 "zone_management": false, 00:11:52.608 "zone_append": false, 00:11:52.608 "compare": false, 00:11:52.608 "compare_and_write": false, 00:11:52.608 "abort": true, 00:11:52.608 "seek_hole": false, 00:11:52.608 "seek_data": false, 00:11:52.608 "copy": true, 00:11:52.608 "nvme_iov_md": false 00:11:52.608 }, 00:11:52.608 "memory_domains": [ 00:11:52.608 { 00:11:52.608 "dma_device_id": "system", 00:11:52.608 "dma_device_type": 1 00:11:52.608 }, 00:11:52.608 { 00:11:52.608 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:52.608 "dma_device_type": 2 00:11:52.608 } 00:11:52.608 ], 00:11:52.608 "driver_specific": { 00:11:52.608 "passthru": { 00:11:52.608 "name": "pt2", 00:11:52.608 "base_bdev_name": "malloc2" 00:11:52.608 } 00:11:52.608 } 00:11:52.608 }' 00:11:52.608 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:52.608 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:52.608 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:52.608 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:52.868 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:52.868 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:52.868 17:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:52.868 17:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:52.868 17:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:52.868 17:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:52.868 17:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:52.868 17:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:53.128 17:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:53.128 17:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:11:53.128 [2024-07-15 17:24:04.343858] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:53.128 17:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=773bdf98-240e-436c-bd47-592fae02c20c 00:11:53.128 17:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 773bdf98-240e-436c-bd47-592fae02c20c ']' 00:11:53.128 17:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:53.388 [2024-07-15 17:24:04.536134] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:53.389 [2024-07-15 17:24:04.536145] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:11:53.389 [2024-07-15 17:24:04.536181] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:53.389 [2024-07-15 17:24:04.536221] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:53.389 [2024-07-15 17:24:04.536227] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16203d0 name raid_bdev1, state offline 00:11:53.389 17:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:53.389 17:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:11:53.648 17:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:11:53.648 17:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:11:53.648 17:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:53.648 17:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:53.648 17:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:53.648 17:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:53.908 17:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:11:53.908 17:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:11:54.168 17:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:11:54.168 17:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:11:54.168 17:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:11:54.168 17:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:11:54.168 17:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:54.168 17:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:54.168 17:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:54.168 17:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:54.168 17:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:54.168 17:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:54.168 17:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:54.168 17:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:11:54.168 17:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:11:54.428 [2024-07-15 17:24:05.486518] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:11:54.428 [2024-07-15 17:24:05.487586] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:11:54.428 [2024-07-15 17:24:05.487629] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:11:54.428 [2024-07-15 17:24:05.487658] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:11:54.428 [2024-07-15 17:24:05.487668] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:54.428 [2024-07-15 17:24:05.487674] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x161e960 name raid_bdev1, state configuring 00:11:54.428 request: 00:11:54.428 { 00:11:54.428 "name": "raid_bdev1", 00:11:54.428 "raid_level": "raid1", 00:11:54.428 "base_bdevs": [ 00:11:54.428 "malloc1", 00:11:54.428 "malloc2" 00:11:54.428 ], 00:11:54.428 "superblock": false, 00:11:54.428 "method": "bdev_raid_create", 00:11:54.428 "req_id": 1 00:11:54.428 } 00:11:54.428 Got JSON-RPC error response 00:11:54.428 response: 00:11:54.428 { 00:11:54.428 "code": -17, 00:11:54.428 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:11:54.428 } 00:11:54.428 17:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:11:54.428 17:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:11:54.428 17:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:11:54.428 17:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:11:54.428 17:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:54.428 17:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:11:54.428 17:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:11:54.428 17:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:11:54.428 17:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:54.689 [2024-07-15 17:24:05.871449] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:54.689 [2024-07-15 17:24:05.871478] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:54.689 [2024-07-15 17:24:05.871491] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x161fba0 00:11:54.689 [2024-07-15 17:24:05.871497] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:54.689 [2024-07-15 17:24:05.872762] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:11:54.689 [2024-07-15 17:24:05.872782] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:54.689 [2024-07-15 17:24:05.872827] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:11:54.689 [2024-07-15 17:24:05.872844] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:54.689 pt1 00:11:54.689 17:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:11:54.689 17:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:54.689 17:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:54.689 17:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:54.689 17:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:54.689 17:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:54.689 17:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:54.689 17:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:54.689 17:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:54.689 17:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:54.689 17:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:54.689 17:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:54.949 17:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:54.949 "name": "raid_bdev1", 00:11:54.949 "uuid": "773bdf98-240e-436c-bd47-592fae02c20c", 00:11:54.949 "strip_size_kb": 0, 00:11:54.949 "state": "configuring", 00:11:54.949 "raid_level": "raid1", 00:11:54.949 "superblock": true, 00:11:54.949 "num_base_bdevs": 2, 00:11:54.949 "num_base_bdevs_discovered": 1, 00:11:54.949 "num_base_bdevs_operational": 2, 00:11:54.949 "base_bdevs_list": [ 00:11:54.949 { 00:11:54.949 "name": "pt1", 00:11:54.949 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:54.949 "is_configured": true, 00:11:54.949 "data_offset": 2048, 00:11:54.949 "data_size": 63488 00:11:54.949 }, 00:11:54.949 { 00:11:54.949 "name": null, 00:11:54.949 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:54.949 "is_configured": false, 00:11:54.949 "data_offset": 2048, 00:11:54.949 "data_size": 63488 00:11:54.949 } 00:11:54.949 ] 00:11:54.949 }' 00:11:54.949 17:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:54.949 17:24:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:55.519 17:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:11:55.519 17:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:11:55.519 17:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:55.519 17:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 
00000000-0000-0000-0000-000000000002 00:11:55.519 [2024-07-15 17:24:06.789787] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:55.519 [2024-07-15 17:24:06.789820] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:55.519 [2024-07-15 17:24:06.789832] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1474e00 00:11:55.519 [2024-07-15 17:24:06.789838] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:55.519 [2024-07-15 17:24:06.790103] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:55.519 [2024-07-15 17:24:06.790114] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:55.519 [2024-07-15 17:24:06.790155] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:55.519 [2024-07-15 17:24:06.790167] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:55.519 [2024-07-15 17:24:06.790242] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1472820 00:11:55.519 [2024-07-15 17:24:06.790249] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:55.519 [2024-07-15 17:24:06.790384] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1621e90 00:11:55.519 [2024-07-15 17:24:06.790483] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1472820 00:11:55.519 [2024-07-15 17:24:06.790489] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1472820 00:11:55.519 [2024-07-15 17:24:06.790563] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:55.519 pt2 00:11:55.519 17:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:11:55.519 17:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:55.519 17:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:11:55.519 17:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:55.519 17:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:55.519 17:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:55.519 17:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:55.519 17:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:55.519 17:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:55.519 17:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:55.519 17:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:55.519 17:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:55.519 17:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:55.519 17:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:55.780 17:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:55.780 "name": "raid_bdev1", 00:11:55.780 
"uuid": "773bdf98-240e-436c-bd47-592fae02c20c", 00:11:55.780 "strip_size_kb": 0, 00:11:55.780 "state": "online", 00:11:55.780 "raid_level": "raid1", 00:11:55.780 "superblock": true, 00:11:55.780 "num_base_bdevs": 2, 00:11:55.780 "num_base_bdevs_discovered": 2, 00:11:55.780 "num_base_bdevs_operational": 2, 00:11:55.780 "base_bdevs_list": [ 00:11:55.780 { 00:11:55.780 "name": "pt1", 00:11:55.780 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:55.780 "is_configured": true, 00:11:55.780 "data_offset": 2048, 00:11:55.780 "data_size": 63488 00:11:55.780 }, 00:11:55.780 { 00:11:55.780 "name": "pt2", 00:11:55.780 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:55.780 "is_configured": true, 00:11:55.780 "data_offset": 2048, 00:11:55.780 "data_size": 63488 00:11:55.780 } 00:11:55.780 ] 00:11:55.780 }' 00:11:55.780 17:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:55.780 17:24:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:56.375 17:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:11:56.375 17:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:56.375 17:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:56.375 17:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:56.375 17:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:56.375 17:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:56.375 17:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:56.375 17:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:56.636 [2024-07-15 17:24:07.720351] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:56.636 17:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:56.636 "name": "raid_bdev1", 00:11:56.636 "aliases": [ 00:11:56.636 "773bdf98-240e-436c-bd47-592fae02c20c" 00:11:56.636 ], 00:11:56.636 "product_name": "Raid Volume", 00:11:56.636 "block_size": 512, 00:11:56.636 "num_blocks": 63488, 00:11:56.636 "uuid": "773bdf98-240e-436c-bd47-592fae02c20c", 00:11:56.636 "assigned_rate_limits": { 00:11:56.636 "rw_ios_per_sec": 0, 00:11:56.636 "rw_mbytes_per_sec": 0, 00:11:56.636 "r_mbytes_per_sec": 0, 00:11:56.636 "w_mbytes_per_sec": 0 00:11:56.636 }, 00:11:56.636 "claimed": false, 00:11:56.636 "zoned": false, 00:11:56.636 "supported_io_types": { 00:11:56.636 "read": true, 00:11:56.636 "write": true, 00:11:56.636 "unmap": false, 00:11:56.636 "flush": false, 00:11:56.636 "reset": true, 00:11:56.636 "nvme_admin": false, 00:11:56.636 "nvme_io": false, 00:11:56.636 "nvme_io_md": false, 00:11:56.636 "write_zeroes": true, 00:11:56.636 "zcopy": false, 00:11:56.636 "get_zone_info": false, 00:11:56.636 "zone_management": false, 00:11:56.636 "zone_append": false, 00:11:56.636 "compare": false, 00:11:56.636 "compare_and_write": false, 00:11:56.636 "abort": false, 00:11:56.636 "seek_hole": false, 00:11:56.636 "seek_data": false, 00:11:56.636 "copy": false, 00:11:56.636 "nvme_iov_md": false 00:11:56.636 }, 00:11:56.636 "memory_domains": [ 00:11:56.636 { 00:11:56.636 "dma_device_id": "system", 00:11:56.636 "dma_device_type": 1 
00:11:56.636 }, 00:11:56.636 { 00:11:56.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:56.636 "dma_device_type": 2 00:11:56.636 }, 00:11:56.636 { 00:11:56.636 "dma_device_id": "system", 00:11:56.636 "dma_device_type": 1 00:11:56.636 }, 00:11:56.636 { 00:11:56.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:56.636 "dma_device_type": 2 00:11:56.636 } 00:11:56.636 ], 00:11:56.636 "driver_specific": { 00:11:56.636 "raid": { 00:11:56.636 "uuid": "773bdf98-240e-436c-bd47-592fae02c20c", 00:11:56.636 "strip_size_kb": 0, 00:11:56.636 "state": "online", 00:11:56.636 "raid_level": "raid1", 00:11:56.636 "superblock": true, 00:11:56.636 "num_base_bdevs": 2, 00:11:56.636 "num_base_bdevs_discovered": 2, 00:11:56.636 "num_base_bdevs_operational": 2, 00:11:56.636 "base_bdevs_list": [ 00:11:56.636 { 00:11:56.636 "name": "pt1", 00:11:56.636 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:56.636 "is_configured": true, 00:11:56.636 "data_offset": 2048, 00:11:56.636 "data_size": 63488 00:11:56.636 }, 00:11:56.636 { 00:11:56.636 "name": "pt2", 00:11:56.636 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:56.636 "is_configured": true, 00:11:56.636 "data_offset": 2048, 00:11:56.636 "data_size": 63488 00:11:56.636 } 00:11:56.636 ] 00:11:56.636 } 00:11:56.636 } 00:11:56.636 }' 00:11:56.636 17:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:56.636 17:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:56.636 pt2' 00:11:56.636 17:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:56.636 17:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:56.636 17:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:56.896 17:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:56.896 "name": "pt1", 00:11:56.896 "aliases": [ 00:11:56.896 "00000000-0000-0000-0000-000000000001" 00:11:56.896 ], 00:11:56.896 "product_name": "passthru", 00:11:56.896 "block_size": 512, 00:11:56.896 "num_blocks": 65536, 00:11:56.896 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:56.896 "assigned_rate_limits": { 00:11:56.896 "rw_ios_per_sec": 0, 00:11:56.896 "rw_mbytes_per_sec": 0, 00:11:56.896 "r_mbytes_per_sec": 0, 00:11:56.896 "w_mbytes_per_sec": 0 00:11:56.896 }, 00:11:56.896 "claimed": true, 00:11:56.896 "claim_type": "exclusive_write", 00:11:56.896 "zoned": false, 00:11:56.896 "supported_io_types": { 00:11:56.896 "read": true, 00:11:56.896 "write": true, 00:11:56.896 "unmap": true, 00:11:56.896 "flush": true, 00:11:56.896 "reset": true, 00:11:56.896 "nvme_admin": false, 00:11:56.896 "nvme_io": false, 00:11:56.896 "nvme_io_md": false, 00:11:56.896 "write_zeroes": true, 00:11:56.896 "zcopy": true, 00:11:56.896 "get_zone_info": false, 00:11:56.896 "zone_management": false, 00:11:56.896 "zone_append": false, 00:11:56.896 "compare": false, 00:11:56.896 "compare_and_write": false, 00:11:56.896 "abort": true, 00:11:56.896 "seek_hole": false, 00:11:56.896 "seek_data": false, 00:11:56.896 "copy": true, 00:11:56.896 "nvme_iov_md": false 00:11:56.896 }, 00:11:56.896 "memory_domains": [ 00:11:56.896 { 00:11:56.896 "dma_device_id": "system", 00:11:56.896 "dma_device_type": 1 00:11:56.896 }, 00:11:56.896 { 00:11:56.896 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:11:56.896 "dma_device_type": 2 00:11:56.896 } 00:11:56.896 ], 00:11:56.896 "driver_specific": { 00:11:56.896 "passthru": { 00:11:56.896 "name": "pt1", 00:11:56.896 "base_bdev_name": "malloc1" 00:11:56.896 } 00:11:56.896 } 00:11:56.896 }' 00:11:56.896 17:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:56.896 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:56.896 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:56.896 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:56.896 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:56.896 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:56.896 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:57.156 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:57.156 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:57.156 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:57.156 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:57.156 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:57.156 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:57.156 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:57.156 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:57.415 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:57.415 "name": "pt2", 00:11:57.415 "aliases": [ 00:11:57.415 "00000000-0000-0000-0000-000000000002" 00:11:57.415 ], 00:11:57.415 "product_name": "passthru", 00:11:57.415 "block_size": 512, 00:11:57.415 "num_blocks": 65536, 00:11:57.415 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:57.415 "assigned_rate_limits": { 00:11:57.415 "rw_ios_per_sec": 0, 00:11:57.415 "rw_mbytes_per_sec": 0, 00:11:57.415 "r_mbytes_per_sec": 0, 00:11:57.415 "w_mbytes_per_sec": 0 00:11:57.415 }, 00:11:57.415 "claimed": true, 00:11:57.415 "claim_type": "exclusive_write", 00:11:57.415 "zoned": false, 00:11:57.415 "supported_io_types": { 00:11:57.415 "read": true, 00:11:57.415 "write": true, 00:11:57.415 "unmap": true, 00:11:57.415 "flush": true, 00:11:57.415 "reset": true, 00:11:57.415 "nvme_admin": false, 00:11:57.415 "nvme_io": false, 00:11:57.415 "nvme_io_md": false, 00:11:57.415 "write_zeroes": true, 00:11:57.415 "zcopy": true, 00:11:57.415 "get_zone_info": false, 00:11:57.415 "zone_management": false, 00:11:57.415 "zone_append": false, 00:11:57.415 "compare": false, 00:11:57.415 "compare_and_write": false, 00:11:57.415 "abort": true, 00:11:57.415 "seek_hole": false, 00:11:57.415 "seek_data": false, 00:11:57.415 "copy": true, 00:11:57.415 "nvme_iov_md": false 00:11:57.415 }, 00:11:57.415 "memory_domains": [ 00:11:57.415 { 00:11:57.415 "dma_device_id": "system", 00:11:57.415 "dma_device_type": 1 00:11:57.415 }, 00:11:57.415 { 00:11:57.415 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:57.415 "dma_device_type": 2 00:11:57.415 } 00:11:57.415 ], 00:11:57.415 "driver_specific": { 
00:11:57.415 "passthru": { 00:11:57.415 "name": "pt2", 00:11:57.415 "base_bdev_name": "malloc2" 00:11:57.415 } 00:11:57.415 } 00:11:57.415 }' 00:11:57.415 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:57.415 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:57.415 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:57.415 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:57.415 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:57.415 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:57.675 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:57.675 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:57.675 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:57.675 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:57.675 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:57.675 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:57.675 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:57.675 17:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:11:57.935 [2024-07-15 17:24:09.055740] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:57.935 17:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 773bdf98-240e-436c-bd47-592fae02c20c '!=' 773bdf98-240e-436c-bd47-592fae02c20c ']' 00:11:57.935 17:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:11:57.935 17:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:57.935 17:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:11:57.935 17:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:58.195 [2024-07-15 17:24:09.248029] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:11:58.195 17:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:11:58.195 17:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:58.195 17:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:58.195 17:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:58.195 17:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:58.195 17:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:58.195 17:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:58.195 17:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:58.195 17:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:58.195 17:24:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:58.195 17:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:58.195 17:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:58.195 17:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:58.195 "name": "raid_bdev1", 00:11:58.195 "uuid": "773bdf98-240e-436c-bd47-592fae02c20c", 00:11:58.195 "strip_size_kb": 0, 00:11:58.195 "state": "online", 00:11:58.195 "raid_level": "raid1", 00:11:58.195 "superblock": true, 00:11:58.195 "num_base_bdevs": 2, 00:11:58.195 "num_base_bdevs_discovered": 1, 00:11:58.195 "num_base_bdevs_operational": 1, 00:11:58.195 "base_bdevs_list": [ 00:11:58.195 { 00:11:58.195 "name": null, 00:11:58.195 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:58.195 "is_configured": false, 00:11:58.195 "data_offset": 2048, 00:11:58.195 "data_size": 63488 00:11:58.195 }, 00:11:58.195 { 00:11:58.195 "name": "pt2", 00:11:58.195 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:58.195 "is_configured": true, 00:11:58.195 "data_offset": 2048, 00:11:58.195 "data_size": 63488 00:11:58.195 } 00:11:58.195 ] 00:11:58.195 }' 00:11:58.195 17:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:58.195 17:24:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:58.766 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:59.025 [2024-07-15 17:24:10.198415] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:59.026 [2024-07-15 17:24:10.198439] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:59.026 [2024-07-15 17:24:10.198480] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:59.026 [2024-07-15 17:24:10.198511] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:59.026 [2024-07-15 17:24:10.198518] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1472820 name raid_bdev1, state offline 00:11:59.026 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:59.026 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:11:59.285 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:11:59.285 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:11:59.285 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:11:59.285 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:11:59.285 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:59.544 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:11:59.544 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:11:59.544 17:24:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:11:59.544 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:11:59.544 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=1 00:11:59.544 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:59.544 [2024-07-15 17:24:10.775845] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:59.544 [2024-07-15 17:24:10.775880] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:59.544 [2024-07-15 17:24:10.775893] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x161cd70 00:11:59.544 [2024-07-15 17:24:10.775900] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:59.544 [2024-07-15 17:24:10.777177] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:59.544 [2024-07-15 17:24:10.777197] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:59.544 [2024-07-15 17:24:10.777242] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:59.544 [2024-07-15 17:24:10.777261] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:59.544 [2024-07-15 17:24:10.777327] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1621ae0 00:11:59.544 [2024-07-15 17:24:10.777333] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:59.544 [2024-07-15 17:24:10.777473] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x161f160 00:11:59.544 [2024-07-15 17:24:10.777568] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1621ae0 00:11:59.544 [2024-07-15 17:24:10.777573] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1621ae0 00:11:59.544 [2024-07-15 17:24:10.777644] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:59.544 pt2 00:11:59.544 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:11:59.544 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:59.544 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:59.544 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:59.544 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:59.544 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:59.544 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:59.544 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:59.544 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:59.544 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:59.544 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:11:59.544 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:59.803 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:59.803 "name": "raid_bdev1", 00:11:59.803 "uuid": "773bdf98-240e-436c-bd47-592fae02c20c", 00:11:59.803 "strip_size_kb": 0, 00:11:59.803 "state": "online", 00:11:59.803 "raid_level": "raid1", 00:11:59.803 "superblock": true, 00:11:59.803 "num_base_bdevs": 2, 00:11:59.803 "num_base_bdevs_discovered": 1, 00:11:59.803 "num_base_bdevs_operational": 1, 00:11:59.803 "base_bdevs_list": [ 00:11:59.803 { 00:11:59.803 "name": null, 00:11:59.803 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:59.803 "is_configured": false, 00:11:59.803 "data_offset": 2048, 00:11:59.803 "data_size": 63488 00:11:59.803 }, 00:11:59.803 { 00:11:59.803 "name": "pt2", 00:11:59.803 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:59.803 "is_configured": true, 00:11:59.803 "data_offset": 2048, 00:11:59.803 "data_size": 63488 00:11:59.803 } 00:11:59.803 ] 00:11:59.803 }' 00:11:59.803 17:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:59.803 17:24:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:00.370 17:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:00.630 [2024-07-15 17:24:11.710200] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:00.630 [2024-07-15 17:24:11.710219] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:00.630 [2024-07-15 17:24:11.710258] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:00.630 [2024-07-15 17:24:11.710288] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:00.630 [2024-07-15 17:24:11.710295] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1621ae0 name raid_bdev1, state offline 00:12:00.630 17:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:00.630 17:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:12:00.630 17:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:12:00.630 17:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:12:00.630 17:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:12:00.630 17:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:00.889 [2024-07-15 17:24:12.095166] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:00.889 [2024-07-15 17:24:12.095200] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:00.889 [2024-07-15 17:24:12.095211] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14743e0 00:12:00.889 [2024-07-15 17:24:12.095218] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:00.889 [2024-07-15 17:24:12.096497] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:00.889 [2024-07-15 17:24:12.096517] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:00.889 [2024-07-15 17:24:12.096564] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:00.889 [2024-07-15 17:24:12.096582] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:00.889 [2024-07-15 17:24:12.096659] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:12:00.889 [2024-07-15 17:24:12.096667] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:00.889 [2024-07-15 17:24:12.096676] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1621fa0 name raid_bdev1, state configuring 00:12:00.889 [2024-07-15 17:24:12.096690] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:00.889 [2024-07-15 17:24:12.096742] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1621fa0 00:12:00.889 [2024-07-15 17:24:12.096748] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:00.889 [2024-07-15 17:24:12.096893] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x161e850 00:12:00.889 [2024-07-15 17:24:12.096989] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1621fa0 00:12:00.889 [2024-07-15 17:24:12.096994] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1621fa0 00:12:00.889 [2024-07-15 17:24:12.097068] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:00.889 pt1 00:12:00.889 17:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:12:00.889 17:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:00.889 17:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:00.889 17:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:00.889 17:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:00.889 17:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:00.889 17:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:00.889 17:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:00.889 17:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:00.889 17:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:00.889 17:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:00.889 17:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:00.889 17:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:01.148 17:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:01.148 "name": "raid_bdev1", 00:12:01.148 "uuid": "773bdf98-240e-436c-bd47-592fae02c20c", 00:12:01.148 "strip_size_kb": 0, 00:12:01.148 "state": "online", 
00:12:01.148 "raid_level": "raid1", 00:12:01.148 "superblock": true, 00:12:01.148 "num_base_bdevs": 2, 00:12:01.148 "num_base_bdevs_discovered": 1, 00:12:01.148 "num_base_bdevs_operational": 1, 00:12:01.148 "base_bdevs_list": [ 00:12:01.148 { 00:12:01.148 "name": null, 00:12:01.148 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:01.148 "is_configured": false, 00:12:01.148 "data_offset": 2048, 00:12:01.148 "data_size": 63488 00:12:01.148 }, 00:12:01.148 { 00:12:01.148 "name": "pt2", 00:12:01.148 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:01.148 "is_configured": true, 00:12:01.148 "data_offset": 2048, 00:12:01.148 "data_size": 63488 00:12:01.148 } 00:12:01.148 ] 00:12:01.148 }' 00:12:01.148 17:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:01.148 17:24:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:01.717 17:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:12:01.717 17:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:12:01.976 17:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:12:01.976 17:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:01.976 17:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:12:01.976 [2024-07-15 17:24:13.242243] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:01.976 17:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 773bdf98-240e-436c-bd47-592fae02c20c '!=' 773bdf98-240e-436c-bd47-592fae02c20c ']' 00:12:01.976 17:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2758019 00:12:01.976 17:24:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2758019 ']' 00:12:01.976 17:24:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2758019 00:12:01.976 17:24:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:12:01.976 17:24:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:01.976 17:24:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2758019 00:12:02.237 17:24:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:02.237 17:24:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:02.237 17:24:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2758019' 00:12:02.237 killing process with pid 2758019 00:12:02.237 17:24:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2758019 00:12:02.237 [2024-07-15 17:24:13.311098] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:02.237 [2024-07-15 17:24:13.311137] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:02.237 [2024-07-15 17:24:13.311167] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:02.237 [2024-07-15 17:24:13.311173] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x1621fa0 name raid_bdev1, state offline 00:12:02.237 17:24:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2758019 00:12:02.237 [2024-07-15 17:24:13.320374] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:02.237 17:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:02.237 00:12:02.237 real 0m13.174s 00:12:02.237 user 0m24.472s 00:12:02.237 sys 0m1.949s 00:12:02.237 17:24:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:02.237 17:24:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:02.237 ************************************ 00:12:02.237 END TEST raid_superblock_test 00:12:02.237 ************************************ 00:12:02.237 17:24:13 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:02.237 17:24:13 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:12:02.237 17:24:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:02.237 17:24:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:02.237 17:24:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:02.237 ************************************ 00:12:02.237 START TEST raid_read_error_test 00:12:02.237 ************************************ 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 read 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@802 -- # strip_size=0 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.whYqXbWPAM 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2761257 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2761257 /var/tmp/spdk-raid.sock 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2761257 ']' 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:02.237 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:02.237 17:24:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:02.498 [2024-07-15 17:24:13.593491] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:12:02.498 [2024-07-15 17:24:13.593548] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2761257 ] 00:12:02.498 [2024-07-15 17:24:13.683798] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:02.498 [2024-07-15 17:24:13.751698] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:02.498 [2024-07-15 17:24:13.790730] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:02.498 [2024-07-15 17:24:13.790752] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:03.438 17:24:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:03.438 17:24:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:03.438 17:24:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:03.439 17:24:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:03.439 BaseBdev1_malloc 00:12:03.439 17:24:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:03.699 true 00:12:03.699 17:24:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:03.699 [2024-07-15 17:24:14.965230] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:03.699 [2024-07-15 17:24:14.965260] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:03.699 [2024-07-15 17:24:14.965273] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2529b50 00:12:03.699 [2024-07-15 17:24:14.965279] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:03.699 [2024-07-15 17:24:14.966580] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:03.699 [2024-07-15 17:24:14.966600] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:03.699 BaseBdev1 00:12:03.699 17:24:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:03.699 17:24:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:03.958 BaseBdev2_malloc 00:12:03.958 17:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:04.219 true 00:12:04.219 17:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:04.480 [2024-07-15 17:24:15.540673] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:04.480 [2024-07-15 17:24:15.540699] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:04.480 [2024-07-15 17:24:15.540715] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x250dea0 00:12:04.480 [2024-07-15 17:24:15.540722] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:04.480 [2024-07-15 17:24:15.541900] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:04.480 [2024-07-15 17:24:15.541917] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:04.480 BaseBdev2 00:12:04.480 17:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:04.480 [2024-07-15 17:24:15.729169] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:04.480 [2024-07-15 17:24:15.730182] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:04.480 [2024-07-15 17:24:15.730323] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2377360 00:12:04.480 [2024-07-15 17:24:15.730331] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:04.480 [2024-07-15 17:24:15.730472] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2374a00 00:12:04.480 [2024-07-15 17:24:15.730587] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2377360 00:12:04.480 [2024-07-15 17:24:15.730592] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2377360 00:12:04.480 [2024-07-15 17:24:15.730667] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:04.480 17:24:15 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:04.480 17:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:04.480 17:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:04.480 17:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:04.480 17:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:04.480 17:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:04.480 17:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:04.480 17:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:04.480 17:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:04.480 17:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:04.480 17:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:04.480 17:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:04.740 17:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:04.740 "name": "raid_bdev1", 00:12:04.740 "uuid": "7d27a494-d0c4-4674-b1a3-3f91432f411c", 00:12:04.740 "strip_size_kb": 0, 00:12:04.740 "state": "online", 00:12:04.740 "raid_level": "raid1", 00:12:04.740 "superblock": true, 00:12:04.740 "num_base_bdevs": 2, 00:12:04.740 "num_base_bdevs_discovered": 2, 00:12:04.740 "num_base_bdevs_operational": 2, 00:12:04.740 "base_bdevs_list": [ 00:12:04.740 { 00:12:04.740 "name": "BaseBdev1", 00:12:04.740 "uuid": "98445dc3-6dc7-5ecf-8707-9677af1ef8f6", 00:12:04.740 "is_configured": true, 00:12:04.740 "data_offset": 2048, 00:12:04.740 "data_size": 63488 00:12:04.740 }, 00:12:04.740 { 00:12:04.740 "name": "BaseBdev2", 00:12:04.740 "uuid": "137addd4-960c-5e11-8371-e131650baae8", 00:12:04.740 "is_configured": true, 00:12:04.740 "data_offset": 2048, 00:12:04.740 "data_size": 63488 00:12:04.740 } 00:12:04.740 ] 00:12:04.740 }' 00:12:04.740 17:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:04.740 17:24:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:05.309 17:24:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:05.309 17:24:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:05.309 [2024-07-15 17:24:16.555469] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x250f2a0 00:12:06.249 17:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:06.508 17:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:06.508 17:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:12:06.508 17:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 
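The read-error pass above is driven entirely over the bdevperf RPC socket: two malloc bdevs are wrapped in error bdevs, exposed through passthru bdevs, assembled into a raid1 volume, and a read failure is armed on the first member. A condensed sketch of that sequence, using the same RPC calls and names that appear in the trace (the RPC shell variable and the loop are shorthand, not part of the test script):

  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  for i in 1 2; do
      # 32 MiB malloc bdev with 512-byte blocks (65536 blocks, as listed above)
      $RPC bdev_malloc_create 32 512 -b "BaseBdev${i}_malloc"
      # error bdev wraps it and is exposed as EE_BaseBdev<i>_malloc
      $RPC bdev_error_create "BaseBdev${i}_malloc"
      # passthru bdev on top provides the name the raid layer consumes
      $RPC bdev_passthru_create -b "EE_BaseBdev${i}_malloc" -p "BaseBdev${i}"
  done
  # assemble both members into a raid1 volume with an on-disk superblock (-s)
  $RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s
  # arm read failures on the error bdev underneath BaseBdev1
  $RPC bdev_error_inject_error EE_BaseBdev1_malloc read failure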
00:12:06.508 17:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:06.508 17:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:06.508 17:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:06.508 17:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:06.508 17:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:06.508 17:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:06.508 17:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:06.508 17:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:06.509 17:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:06.509 17:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:06.509 17:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:06.509 17:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:06.509 17:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:06.767 17:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:06.767 "name": "raid_bdev1", 00:12:06.767 "uuid": "7d27a494-d0c4-4674-b1a3-3f91432f411c", 00:12:06.767 "strip_size_kb": 0, 00:12:06.767 "state": "online", 00:12:06.767 "raid_level": "raid1", 00:12:06.767 "superblock": true, 00:12:06.767 "num_base_bdevs": 2, 00:12:06.767 "num_base_bdevs_discovered": 2, 00:12:06.767 "num_base_bdevs_operational": 2, 00:12:06.767 "base_bdevs_list": [ 00:12:06.767 { 00:12:06.767 "name": "BaseBdev1", 00:12:06.767 "uuid": "98445dc3-6dc7-5ecf-8707-9677af1ef8f6", 00:12:06.767 "is_configured": true, 00:12:06.767 "data_offset": 2048, 00:12:06.767 "data_size": 63488 00:12:06.767 }, 00:12:06.767 { 00:12:06.767 "name": "BaseBdev2", 00:12:06.767 "uuid": "137addd4-960c-5e11-8371-e131650baae8", 00:12:06.767 "is_configured": true, 00:12:06.767 "data_offset": 2048, 00:12:06.767 "data_size": 63488 00:12:06.767 } 00:12:06.767 ] 00:12:06.767 }' 00:12:06.767 17:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:06.767 17:24:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:07.334 17:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:07.334 [2024-07-15 17:24:18.579559] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:07.334 [2024-07-15 17:24:18.579588] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:07.334 [2024-07-15 17:24:18.582161] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:07.334 [2024-07-15 17:24:18.582183] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:07.334 [2024-07-15 17:24:18.582246] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:07.334 
[2024-07-15 17:24:18.582253] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2377360 name raid_bdev1, state offline 00:12:07.334 0 00:12:07.334 17:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2761257 00:12:07.334 17:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2761257 ']' 00:12:07.334 17:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2761257 00:12:07.334 17:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:12:07.334 17:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:07.334 17:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2761257 00:12:07.594 17:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:07.594 17:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:07.594 17:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2761257' 00:12:07.594 killing process with pid 2761257 00:12:07.594 17:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2761257 00:12:07.594 [2024-07-15 17:24:18.666532] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:07.594 17:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2761257 00:12:07.594 [2024-07-15 17:24:18.672445] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:07.594 17:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.whYqXbWPAM 00:12:07.594 17:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:07.594 17:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:07.594 17:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:12:07.594 17:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:12:07.594 17:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:07.594 17:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:07.594 17:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:12:07.594 00:12:07.594 real 0m5.289s 00:12:07.594 user 0m8.293s 00:12:07.594 sys 0m0.757s 00:12:07.594 17:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:07.594 17:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:07.594 ************************************ 00:12:07.594 END TEST raid_read_error_test 00:12:07.594 ************************************ 00:12:07.594 17:24:18 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:07.594 17:24:18 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:12:07.594 17:24:18 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:07.594 17:24:18 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:07.594 17:24:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:07.594 ************************************ 00:12:07.594 START TEST raid_write_error_test 00:12:07.594 ************************************ 00:12:07.594 17:24:18 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 write 00:12:07.594 17:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:12:07.594 17:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:07.594 17:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:12:07.594 17:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:07.594 17:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:07.594 17:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:07.594 17:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:07.594 17:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:07.594 17:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:07.594 17:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:07.594 17:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:07.594 17:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:07.594 17:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:07.594 17:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:07.594 17:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:07.594 17:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:07.594 17:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:07.594 17:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:07.594 17:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:12:07.594 17:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:12:07.594 17:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:07.853 17:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.T9qWqvvlxQ 00:12:07.853 17:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2762181 00:12:07.853 17:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2762181 /var/tmp/spdk-raid.sock 00:12:07.853 17:24:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2762181 ']' 00:12:07.853 17:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:07.853 17:24:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:07.853 17:24:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:07.853 17:24:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:07.853 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
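Before the write-error pass gets going, note how the read pass above was graded: the raid state is pulled out of bdev_raid_get_bdevs with jq, and the failure rate is scraped from the bdevperf output captured in the mktemp log. A rough equivalent of those two checks (variable names and assertion style are illustrative; the log path is the one printed above):

  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # state check: the raid must still be online with both members discovered
  info=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
  [ "$(echo "$info" | jq -r '.state')" = online ]
  [ "$(echo "$info" | jq -r '.num_base_bdevs_discovered')" -eq 2 ]
  # I/O check: the test takes field 6 of the raid_bdev1 row in the bdevperf
  # output as fail_per_s; raid1 has redundancy, so injected read errors on one
  # member must not surface as failed I/O on the array
  fail_per_s=$(grep -v Job /raidtest/tmp.whYqXbWPAM | grep raid_bdev1 | awk '{print $6}')
  [[ $fail_per_s = \0\.\0\0 ]]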
00:12:07.853 17:24:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:07.853 17:24:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:07.853 [2024-07-15 17:24:18.945561] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:12:07.853 [2024-07-15 17:24:18.945615] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2762181 ] 00:12:07.853 [2024-07-15 17:24:19.036029] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:07.853 [2024-07-15 17:24:19.103368] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:08.112 [2024-07-15 17:24:19.159627] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:08.112 [2024-07-15 17:24:19.159653] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:08.682 17:24:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:08.682 17:24:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:08.682 17:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:08.682 17:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:08.682 BaseBdev1_malloc 00:12:08.682 17:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:08.941 true 00:12:08.941 17:24:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:09.201 [2024-07-15 17:24:20.322576] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:09.201 [2024-07-15 17:24:20.322607] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:09.201 [2024-07-15 17:24:20.322618] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1af2b50 00:12:09.201 [2024-07-15 17:24:20.322625] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:09.201 [2024-07-15 17:24:20.324007] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:09.201 [2024-07-15 17:24:20.324026] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:09.201 BaseBdev1 00:12:09.201 17:24:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:09.201 17:24:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:09.460 BaseBdev2_malloc 00:12:09.460 17:24:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:09.460 true 00:12:09.460 17:24:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:09.719 [2024-07-15 17:24:20.893981] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:09.719 [2024-07-15 17:24:20.894007] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:09.719 [2024-07-15 17:24:20.894017] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ad6ea0 00:12:09.719 [2024-07-15 17:24:20.894023] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:09.719 [2024-07-15 17:24:20.895261] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:09.719 [2024-07-15 17:24:20.895281] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:09.719 BaseBdev2 00:12:09.719 17:24:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:09.978 [2024-07-15 17:24:21.086491] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:09.978 [2024-07-15 17:24:21.087503] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:09.978 [2024-07-15 17:24:21.087650] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1940360 00:12:09.978 [2024-07-15 17:24:21.087658] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:09.978 [2024-07-15 17:24:21.087807] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x193da00 00:12:09.978 [2024-07-15 17:24:21.087923] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1940360 00:12:09.978 [2024-07-15 17:24:21.087928] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1940360 00:12:09.978 [2024-07-15 17:24:21.088004] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:09.978 17:24:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:09.978 17:24:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:09.978 17:24:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:09.978 17:24:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:09.978 17:24:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:09.978 17:24:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:09.978 17:24:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:09.978 17:24:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:09.978 17:24:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:09.978 17:24:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:09.978 17:24:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:09.978 17:24:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r 
'.[] | select(.name == "raid_bdev1")' 00:12:10.238 17:24:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:10.238 "name": "raid_bdev1", 00:12:10.238 "uuid": "b2d05a29-5b81-4f8a-a34c-f710bbb2653f", 00:12:10.238 "strip_size_kb": 0, 00:12:10.238 "state": "online", 00:12:10.238 "raid_level": "raid1", 00:12:10.238 "superblock": true, 00:12:10.238 "num_base_bdevs": 2, 00:12:10.238 "num_base_bdevs_discovered": 2, 00:12:10.238 "num_base_bdevs_operational": 2, 00:12:10.238 "base_bdevs_list": [ 00:12:10.238 { 00:12:10.238 "name": "BaseBdev1", 00:12:10.238 "uuid": "037d6bd4-565f-5a0a-b74b-14adb2377f57", 00:12:10.238 "is_configured": true, 00:12:10.238 "data_offset": 2048, 00:12:10.238 "data_size": 63488 00:12:10.238 }, 00:12:10.238 { 00:12:10.238 "name": "BaseBdev2", 00:12:10.238 "uuid": "4422908b-ba81-54b7-9e72-67eb7ef53523", 00:12:10.238 "is_configured": true, 00:12:10.238 "data_offset": 2048, 00:12:10.238 "data_size": 63488 00:12:10.238 } 00:12:10.238 ] 00:12:10.238 }' 00:12:10.238 17:24:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:10.238 17:24:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:10.828 17:24:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:10.828 17:24:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:10.828 [2024-07-15 17:24:21.948888] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ad82a0 00:12:11.787 17:24:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:12:11.787 [2024-07-15 17:24:23.041621] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:12:11.787 [2024-07-15 17:24:23.041667] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:11.787 [2024-07-15 17:24:23.041823] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1ad82a0 00:12:11.787 17:24:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:11.787 17:24:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:12:11.787 17:24:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:12:11.787 17:24:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=1 00:12:11.787 17:24:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:11.787 17:24:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:11.787 17:24:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:11.787 17:24:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:11.787 17:24:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:11.787 17:24:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:11.787 17:24:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:11.787 17:24:23 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:11.787 17:24:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:11.787 17:24:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:11.787 17:24:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:11.787 17:24:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:12.046 17:24:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:12.046 "name": "raid_bdev1", 00:12:12.046 "uuid": "b2d05a29-5b81-4f8a-a34c-f710bbb2653f", 00:12:12.046 "strip_size_kb": 0, 00:12:12.046 "state": "online", 00:12:12.046 "raid_level": "raid1", 00:12:12.046 "superblock": true, 00:12:12.046 "num_base_bdevs": 2, 00:12:12.046 "num_base_bdevs_discovered": 1, 00:12:12.046 "num_base_bdevs_operational": 1, 00:12:12.046 "base_bdevs_list": [ 00:12:12.046 { 00:12:12.046 "name": null, 00:12:12.046 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:12.046 "is_configured": false, 00:12:12.046 "data_offset": 2048, 00:12:12.046 "data_size": 63488 00:12:12.046 }, 00:12:12.046 { 00:12:12.046 "name": "BaseBdev2", 00:12:12.046 "uuid": "4422908b-ba81-54b7-9e72-67eb7ef53523", 00:12:12.046 "is_configured": true, 00:12:12.046 "data_offset": 2048, 00:12:12.046 "data_size": 63488 00:12:12.046 } 00:12:12.046 ] 00:12:12.046 }' 00:12:12.046 17:24:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:12.046 17:24:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:12.615 17:24:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:12.875 [2024-07-15 17:24:23.935742] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:12.875 [2024-07-15 17:24:23.935765] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:12.875 [2024-07-15 17:24:23.938307] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:12.875 [2024-07-15 17:24:23.938326] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:12.875 [2024-07-15 17:24:23.938362] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:12.875 [2024-07-15 17:24:23.938368] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1940360 name raid_bdev1, state offline 00:12:12.875 0 00:12:12.875 17:24:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2762181 00:12:12.875 17:24:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2762181 ']' 00:12:12.875 17:24:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2762181 00:12:12.875 17:24:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:12:12.875 17:24:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:12.875 17:24:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2762181 00:12:12.875 17:24:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:12:12.875 17:24:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:12.875 17:24:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2762181' 00:12:12.875 killing process with pid 2762181 00:12:12.875 17:24:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2762181 00:12:12.875 [2024-07-15 17:24:24.021050] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:12.875 17:24:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2762181 00:12:12.875 [2024-07-15 17:24:24.026675] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:12.875 17:24:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.T9qWqvvlxQ 00:12:12.875 17:24:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:12.875 17:24:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:12.875 17:24:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:12:12.875 17:24:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:12:12.875 17:24:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:12.875 17:24:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:12.875 17:24:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:12:12.876 00:12:12.876 real 0m5.281s 00:12:12.876 user 0m8.273s 00:12:12.876 sys 0m0.744s 00:12:12.876 17:24:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:12.876 17:24:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:12.876 ************************************ 00:12:12.876 END TEST raid_write_error_test 00:12:12.876 ************************************ 00:12:13.136 17:24:24 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:13.136 17:24:24 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:12:13.136 17:24:24 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:12:13.136 17:24:24 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:12:13.136 17:24:24 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:13.136 17:24:24 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:13.136 17:24:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:13.136 ************************************ 00:12:13.136 START TEST raid_state_function_test 00:12:13.136 ************************************ 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 false 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= 
num_base_bdevs )) 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2763077 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2763077' 00:12:13.136 Process raid pid: 2763077 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2763077 /var/tmp/spdk-raid.sock 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2763077 ']' 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:13.136 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
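One behavioural difference between the write pass just completed and the earlier read pass: raid1 can absorb a failed read from one member, but a failed write fails that member out of the array, which is why the trace shows BaseBdev1 being failed out of slot 0 and the expected base-bdev count dropping to 1. A hedged sketch of the corresponding check (names as in the trace; assertion style is illustrative):

  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # arm write failures on the error bdev under BaseBdev1, let bdevperf run,
  # then confirm the array is still online but degraded to a single member
  $RPC bdev_error_inject_error EE_BaseBdev1_malloc write failure
  info=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
  [ "$(echo "$info" | jq -r '.state')" = online ]
  [ "$(echo "$info" | jq -r '.num_base_bdevs_discovered')" -eq 1 ]  # BaseBdev1 slot is now null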
00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:13.136 17:24:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:13.136 [2024-07-15 17:24:24.298164] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:12:13.136 [2024-07-15 17:24:24.298217] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:13.136 [2024-07-15 17:24:24.388389] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:13.397 [2024-07-15 17:24:24.456985] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:13.397 [2024-07-15 17:24:24.502887] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:13.397 [2024-07-15 17:24:24.502909] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:13.966 17:24:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:13.966 17:24:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:12:13.966 17:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:14.227 [2024-07-15 17:24:25.314114] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:14.227 [2024-07-15 17:24:25.314143] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:14.227 [2024-07-15 17:24:25.314149] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:14.227 [2024-07-15 17:24:25.314155] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:14.227 [2024-07-15 17:24:25.314160] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:14.227 [2024-07-15 17:24:25.314165] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:14.227 17:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:14.227 17:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:14.227 17:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:14.227 17:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:14.227 17:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:14.227 17:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:14.227 17:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:14.227 17:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:14.227 17:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:14.227 17:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:14.228 17:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:14.228 17:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:14.228 17:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:14.228 "name": "Existed_Raid", 00:12:14.228 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:14.228 "strip_size_kb": 64, 00:12:14.228 "state": "configuring", 00:12:14.228 "raid_level": "raid0", 00:12:14.228 "superblock": false, 00:12:14.228 "num_base_bdevs": 3, 00:12:14.228 "num_base_bdevs_discovered": 0, 00:12:14.228 "num_base_bdevs_operational": 3, 00:12:14.228 "base_bdevs_list": [ 00:12:14.228 { 00:12:14.228 "name": "BaseBdev1", 00:12:14.228 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:14.228 "is_configured": false, 00:12:14.228 "data_offset": 0, 00:12:14.228 "data_size": 0 00:12:14.228 }, 00:12:14.228 { 00:12:14.228 "name": "BaseBdev2", 00:12:14.228 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:14.228 "is_configured": false, 00:12:14.228 "data_offset": 0, 00:12:14.228 "data_size": 0 00:12:14.228 }, 00:12:14.228 { 00:12:14.228 "name": "BaseBdev3", 00:12:14.228 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:14.228 "is_configured": false, 00:12:14.228 "data_offset": 0, 00:12:14.228 "data_size": 0 00:12:14.228 } 00:12:14.228 ] 00:12:14.228 }' 00:12:14.228 17:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:14.228 17:24:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:14.799 17:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:15.060 [2024-07-15 17:24:26.236353] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:15.060 [2024-07-15 17:24:26.236369] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb3b6d0 name Existed_Raid, state configuring 00:12:15.060 17:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:15.320 [2024-07-15 17:24:26.424845] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:15.320 [2024-07-15 17:24:26.424864] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:15.320 [2024-07-15 17:24:26.424869] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:15.320 [2024-07-15 17:24:26.424875] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:15.320 [2024-07-15 17:24:26.424879] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:15.320 [2024-07-15 17:24:26.424885] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:15.320 17:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:15.580 [2024-07-15 17:24:26.619951] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:15.580 BaseBdev1 00:12:15.580 
17:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:15.580 17:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:15.580 17:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:15.580 17:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:15.580 17:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:15.580 17:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:15.580 17:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:15.580 17:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:15.841 [ 00:12:15.841 { 00:12:15.841 "name": "BaseBdev1", 00:12:15.841 "aliases": [ 00:12:15.841 "97a481ae-6087-4f57-8242-5b558cb9cea6" 00:12:15.841 ], 00:12:15.841 "product_name": "Malloc disk", 00:12:15.841 "block_size": 512, 00:12:15.841 "num_blocks": 65536, 00:12:15.841 "uuid": "97a481ae-6087-4f57-8242-5b558cb9cea6", 00:12:15.841 "assigned_rate_limits": { 00:12:15.841 "rw_ios_per_sec": 0, 00:12:15.841 "rw_mbytes_per_sec": 0, 00:12:15.841 "r_mbytes_per_sec": 0, 00:12:15.841 "w_mbytes_per_sec": 0 00:12:15.841 }, 00:12:15.841 "claimed": true, 00:12:15.841 "claim_type": "exclusive_write", 00:12:15.841 "zoned": false, 00:12:15.841 "supported_io_types": { 00:12:15.841 "read": true, 00:12:15.841 "write": true, 00:12:15.841 "unmap": true, 00:12:15.841 "flush": true, 00:12:15.841 "reset": true, 00:12:15.841 "nvme_admin": false, 00:12:15.841 "nvme_io": false, 00:12:15.841 "nvme_io_md": false, 00:12:15.841 "write_zeroes": true, 00:12:15.841 "zcopy": true, 00:12:15.841 "get_zone_info": false, 00:12:15.841 "zone_management": false, 00:12:15.841 "zone_append": false, 00:12:15.841 "compare": false, 00:12:15.841 "compare_and_write": false, 00:12:15.841 "abort": true, 00:12:15.841 "seek_hole": false, 00:12:15.841 "seek_data": false, 00:12:15.841 "copy": true, 00:12:15.841 "nvme_iov_md": false 00:12:15.841 }, 00:12:15.841 "memory_domains": [ 00:12:15.841 { 00:12:15.841 "dma_device_id": "system", 00:12:15.841 "dma_device_type": 1 00:12:15.841 }, 00:12:15.841 { 00:12:15.841 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:15.841 "dma_device_type": 2 00:12:15.841 } 00:12:15.841 ], 00:12:15.841 "driver_specific": {} 00:12:15.841 } 00:12:15.841 ] 00:12:15.841 17:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:15.841 17:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:15.841 17:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:15.841 17:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:15.841 17:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:15.841 17:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:15.841 17:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:12:15.841 17:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:15.841 17:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:15.841 17:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:15.841 17:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:15.841 17:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:15.841 17:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:16.103 17:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:16.103 "name": "Existed_Raid", 00:12:16.103 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:16.103 "strip_size_kb": 64, 00:12:16.103 "state": "configuring", 00:12:16.103 "raid_level": "raid0", 00:12:16.103 "superblock": false, 00:12:16.103 "num_base_bdevs": 3, 00:12:16.103 "num_base_bdevs_discovered": 1, 00:12:16.103 "num_base_bdevs_operational": 3, 00:12:16.103 "base_bdevs_list": [ 00:12:16.103 { 00:12:16.103 "name": "BaseBdev1", 00:12:16.103 "uuid": "97a481ae-6087-4f57-8242-5b558cb9cea6", 00:12:16.103 "is_configured": true, 00:12:16.103 "data_offset": 0, 00:12:16.103 "data_size": 65536 00:12:16.103 }, 00:12:16.103 { 00:12:16.103 "name": "BaseBdev2", 00:12:16.103 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:16.103 "is_configured": false, 00:12:16.103 "data_offset": 0, 00:12:16.103 "data_size": 0 00:12:16.103 }, 00:12:16.103 { 00:12:16.103 "name": "BaseBdev3", 00:12:16.103 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:16.103 "is_configured": false, 00:12:16.103 "data_offset": 0, 00:12:16.103 "data_size": 0 00:12:16.103 } 00:12:16.103 ] 00:12:16.103 }' 00:12:16.103 17:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:16.103 17:24:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:16.674 17:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:16.674 [2024-07-15 17:24:27.927247] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:16.674 [2024-07-15 17:24:27.927272] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb3afa0 name Existed_Raid, state configuring 00:12:16.674 17:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:16.935 [2024-07-15 17:24:28.123770] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:16.935 [2024-07-15 17:24:28.124887] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:16.935 [2024-07-15 17:24:28.124912] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:16.935 [2024-07-15 17:24:28.124922] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:16.935 [2024-07-15 17:24:28.124928] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base 
bdev BaseBdev3 doesn't exist now 00:12:16.935 17:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:16.935 17:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:16.935 17:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:16.935 17:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:16.935 17:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:16.935 17:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:16.935 17:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:16.935 17:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:16.935 17:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:16.935 17:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:16.935 17:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:16.935 17:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:16.935 17:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:16.935 17:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:17.196 17:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:17.196 "name": "Existed_Raid", 00:12:17.196 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:17.196 "strip_size_kb": 64, 00:12:17.196 "state": "configuring", 00:12:17.196 "raid_level": "raid0", 00:12:17.196 "superblock": false, 00:12:17.196 "num_base_bdevs": 3, 00:12:17.196 "num_base_bdevs_discovered": 1, 00:12:17.196 "num_base_bdevs_operational": 3, 00:12:17.196 "base_bdevs_list": [ 00:12:17.196 { 00:12:17.196 "name": "BaseBdev1", 00:12:17.196 "uuid": "97a481ae-6087-4f57-8242-5b558cb9cea6", 00:12:17.196 "is_configured": true, 00:12:17.196 "data_offset": 0, 00:12:17.196 "data_size": 65536 00:12:17.196 }, 00:12:17.196 { 00:12:17.196 "name": "BaseBdev2", 00:12:17.196 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:17.196 "is_configured": false, 00:12:17.196 "data_offset": 0, 00:12:17.196 "data_size": 0 00:12:17.196 }, 00:12:17.196 { 00:12:17.196 "name": "BaseBdev3", 00:12:17.196 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:17.196 "is_configured": false, 00:12:17.196 "data_offset": 0, 00:12:17.196 "data_size": 0 00:12:17.196 } 00:12:17.196 ] 00:12:17.196 }' 00:12:17.196 17:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:17.196 17:24:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:17.769 17:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:18.028 [2024-07-15 17:24:29.078872] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:18.028 BaseBdev2 00:12:18.028 17:24:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:18.028 17:24:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:18.028 17:24:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:18.028 17:24:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:18.028 17:24:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:18.028 17:24:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:18.028 17:24:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:18.028 17:24:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:18.288 [ 00:12:18.288 { 00:12:18.288 "name": "BaseBdev2", 00:12:18.288 "aliases": [ 00:12:18.288 "723b3e85-0958-4453-92d0-682bf6599941" 00:12:18.288 ], 00:12:18.288 "product_name": "Malloc disk", 00:12:18.288 "block_size": 512, 00:12:18.288 "num_blocks": 65536, 00:12:18.288 "uuid": "723b3e85-0958-4453-92d0-682bf6599941", 00:12:18.288 "assigned_rate_limits": { 00:12:18.288 "rw_ios_per_sec": 0, 00:12:18.288 "rw_mbytes_per_sec": 0, 00:12:18.288 "r_mbytes_per_sec": 0, 00:12:18.288 "w_mbytes_per_sec": 0 00:12:18.288 }, 00:12:18.288 "claimed": true, 00:12:18.288 "claim_type": "exclusive_write", 00:12:18.288 "zoned": false, 00:12:18.288 "supported_io_types": { 00:12:18.288 "read": true, 00:12:18.288 "write": true, 00:12:18.288 "unmap": true, 00:12:18.288 "flush": true, 00:12:18.288 "reset": true, 00:12:18.288 "nvme_admin": false, 00:12:18.288 "nvme_io": false, 00:12:18.288 "nvme_io_md": false, 00:12:18.288 "write_zeroes": true, 00:12:18.288 "zcopy": true, 00:12:18.288 "get_zone_info": false, 00:12:18.288 "zone_management": false, 00:12:18.288 "zone_append": false, 00:12:18.288 "compare": false, 00:12:18.288 "compare_and_write": false, 00:12:18.288 "abort": true, 00:12:18.288 "seek_hole": false, 00:12:18.288 "seek_data": false, 00:12:18.288 "copy": true, 00:12:18.288 "nvme_iov_md": false 00:12:18.288 }, 00:12:18.288 "memory_domains": [ 00:12:18.288 { 00:12:18.288 "dma_device_id": "system", 00:12:18.288 "dma_device_type": 1 00:12:18.288 }, 00:12:18.288 { 00:12:18.288 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:18.288 "dma_device_type": 2 00:12:18.288 } 00:12:18.288 ], 00:12:18.288 "driver_specific": {} 00:12:18.288 } 00:12:18.288 ] 00:12:18.288 17:24:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:18.288 17:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:18.288 17:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:18.288 17:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:18.288 17:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:18.288 17:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:18.288 17:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
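The state-function pass above builds the array in the opposite order from the error tests: bdev_raid_create is issued while none of the members exist (each is logged as "doesn't exist now"), so Existed_Raid sits in the "configuring" state, and every malloc bdev created afterwards is claimed on arrival and bumps num_base_bdevs_discovered. A condensed sketch of that sequence with the same commands and names as in the trace (the real test also deletes and re-creates the array between steps):

  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # register a 3-member raid0 with a 64 KiB strip before any member exists
  $RPC bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
  # state is "configuring", 0 of 3 members discovered
  $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'
  # creating the backing malloc bdevs lets the raid module claim them one by one
  $RPC bdev_malloc_create 32 512 -b BaseBdev1
  $RPC bdev_malloc_create 32 512 -b BaseBdev2
  # still "configuring": 2 of 3 members discovered until BaseBdev3 appears
  $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .num_base_bdevs_discovered'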
00:12:18.288 17:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:18.288 17:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:18.288 17:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:18.288 17:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:18.288 17:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:18.288 17:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:18.288 17:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:18.288 17:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:18.547 17:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:18.547 "name": "Existed_Raid", 00:12:18.547 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:18.547 "strip_size_kb": 64, 00:12:18.547 "state": "configuring", 00:12:18.547 "raid_level": "raid0", 00:12:18.547 "superblock": false, 00:12:18.547 "num_base_bdevs": 3, 00:12:18.547 "num_base_bdevs_discovered": 2, 00:12:18.547 "num_base_bdevs_operational": 3, 00:12:18.547 "base_bdevs_list": [ 00:12:18.547 { 00:12:18.547 "name": "BaseBdev1", 00:12:18.547 "uuid": "97a481ae-6087-4f57-8242-5b558cb9cea6", 00:12:18.547 "is_configured": true, 00:12:18.547 "data_offset": 0, 00:12:18.547 "data_size": 65536 00:12:18.547 }, 00:12:18.547 { 00:12:18.547 "name": "BaseBdev2", 00:12:18.547 "uuid": "723b3e85-0958-4453-92d0-682bf6599941", 00:12:18.547 "is_configured": true, 00:12:18.547 "data_offset": 0, 00:12:18.547 "data_size": 65536 00:12:18.547 }, 00:12:18.547 { 00:12:18.547 "name": "BaseBdev3", 00:12:18.547 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:18.547 "is_configured": false, 00:12:18.547 "data_offset": 0, 00:12:18.547 "data_size": 0 00:12:18.547 } 00:12:18.547 ] 00:12:18.547 }' 00:12:18.547 17:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:18.547 17:24:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:19.116 17:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:19.116 [2024-07-15 17:24:30.346810] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:19.116 [2024-07-15 17:24:30.346835] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb3be90 00:12:19.116 [2024-07-15 17:24:30.346840] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:12:19.116 [2024-07-15 17:24:30.346980] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb3bb60 00:12:19.116 [2024-07-15 17:24:30.347074] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb3be90 00:12:19.116 [2024-07-15 17:24:30.347079] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xb3be90 00:12:19.116 [2024-07-15 17:24:30.347192] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:19.116 BaseBdev3 00:12:19.116 17:24:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:12:19.116 17:24:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:12:19.116 17:24:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:19.116 17:24:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:19.116 17:24:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:19.116 17:24:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:19.116 17:24:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:19.375 17:24:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:19.652 [ 00:12:19.652 { 00:12:19.652 "name": "BaseBdev3", 00:12:19.652 "aliases": [ 00:12:19.652 "3951de5e-c7c3-4c98-986d-806640f3e062" 00:12:19.652 ], 00:12:19.652 "product_name": "Malloc disk", 00:12:19.652 "block_size": 512, 00:12:19.652 "num_blocks": 65536, 00:12:19.652 "uuid": "3951de5e-c7c3-4c98-986d-806640f3e062", 00:12:19.652 "assigned_rate_limits": { 00:12:19.652 "rw_ios_per_sec": 0, 00:12:19.652 "rw_mbytes_per_sec": 0, 00:12:19.652 "r_mbytes_per_sec": 0, 00:12:19.652 "w_mbytes_per_sec": 0 00:12:19.652 }, 00:12:19.652 "claimed": true, 00:12:19.652 "claim_type": "exclusive_write", 00:12:19.652 "zoned": false, 00:12:19.652 "supported_io_types": { 00:12:19.652 "read": true, 00:12:19.652 "write": true, 00:12:19.652 "unmap": true, 00:12:19.652 "flush": true, 00:12:19.652 "reset": true, 00:12:19.652 "nvme_admin": false, 00:12:19.652 "nvme_io": false, 00:12:19.652 "nvme_io_md": false, 00:12:19.652 "write_zeroes": true, 00:12:19.652 "zcopy": true, 00:12:19.652 "get_zone_info": false, 00:12:19.652 "zone_management": false, 00:12:19.652 "zone_append": false, 00:12:19.652 "compare": false, 00:12:19.652 "compare_and_write": false, 00:12:19.652 "abort": true, 00:12:19.652 "seek_hole": false, 00:12:19.652 "seek_data": false, 00:12:19.652 "copy": true, 00:12:19.652 "nvme_iov_md": false 00:12:19.652 }, 00:12:19.652 "memory_domains": [ 00:12:19.652 { 00:12:19.652 "dma_device_id": "system", 00:12:19.652 "dma_device_type": 1 00:12:19.652 }, 00:12:19.652 { 00:12:19.652 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:19.652 "dma_device_type": 2 00:12:19.652 } 00:12:19.652 ], 00:12:19.652 "driver_specific": {} 00:12:19.652 } 00:12:19.652 ] 00:12:19.652 17:24:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:19.652 17:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:19.652 17:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:19.652 17:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:12:19.652 17:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:19.652 17:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:19.652 17:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:19.652 
17:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:19.652 17:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:19.652 17:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:19.652 17:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:19.652 17:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:19.652 17:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:19.652 17:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:19.652 17:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:19.652 17:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:19.652 "name": "Existed_Raid", 00:12:19.652 "uuid": "764b1e1a-6e84-489e-80e4-bfdc26b2f73c", 00:12:19.652 "strip_size_kb": 64, 00:12:19.652 "state": "online", 00:12:19.652 "raid_level": "raid0", 00:12:19.652 "superblock": false, 00:12:19.652 "num_base_bdevs": 3, 00:12:19.652 "num_base_bdevs_discovered": 3, 00:12:19.652 "num_base_bdevs_operational": 3, 00:12:19.652 "base_bdevs_list": [ 00:12:19.652 { 00:12:19.652 "name": "BaseBdev1", 00:12:19.652 "uuid": "97a481ae-6087-4f57-8242-5b558cb9cea6", 00:12:19.652 "is_configured": true, 00:12:19.652 "data_offset": 0, 00:12:19.652 "data_size": 65536 00:12:19.652 }, 00:12:19.652 { 00:12:19.652 "name": "BaseBdev2", 00:12:19.652 "uuid": "723b3e85-0958-4453-92d0-682bf6599941", 00:12:19.652 "is_configured": true, 00:12:19.652 "data_offset": 0, 00:12:19.652 "data_size": 65536 00:12:19.652 }, 00:12:19.652 { 00:12:19.652 "name": "BaseBdev3", 00:12:19.652 "uuid": "3951de5e-c7c3-4c98-986d-806640f3e062", 00:12:19.652 "is_configured": true, 00:12:19.652 "data_offset": 0, 00:12:19.652 "data_size": 65536 00:12:19.652 } 00:12:19.652 ] 00:12:19.652 }' 00:12:19.652 17:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:19.652 17:24:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:20.221 17:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:20.221 17:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:20.221 17:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:20.221 17:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:20.221 17:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:20.221 17:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:20.221 17:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:20.221 17:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:20.479 [2024-07-15 17:24:31.602219] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:20.480 17:24:31 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:20.480 "name": "Existed_Raid", 00:12:20.480 "aliases": [ 00:12:20.480 "764b1e1a-6e84-489e-80e4-bfdc26b2f73c" 00:12:20.480 ], 00:12:20.480 "product_name": "Raid Volume", 00:12:20.480 "block_size": 512, 00:12:20.480 "num_blocks": 196608, 00:12:20.480 "uuid": "764b1e1a-6e84-489e-80e4-bfdc26b2f73c", 00:12:20.480 "assigned_rate_limits": { 00:12:20.480 "rw_ios_per_sec": 0, 00:12:20.480 "rw_mbytes_per_sec": 0, 00:12:20.480 "r_mbytes_per_sec": 0, 00:12:20.480 "w_mbytes_per_sec": 0 00:12:20.480 }, 00:12:20.480 "claimed": false, 00:12:20.480 "zoned": false, 00:12:20.480 "supported_io_types": { 00:12:20.480 "read": true, 00:12:20.480 "write": true, 00:12:20.480 "unmap": true, 00:12:20.480 "flush": true, 00:12:20.480 "reset": true, 00:12:20.480 "nvme_admin": false, 00:12:20.480 "nvme_io": false, 00:12:20.480 "nvme_io_md": false, 00:12:20.480 "write_zeroes": true, 00:12:20.480 "zcopy": false, 00:12:20.480 "get_zone_info": false, 00:12:20.480 "zone_management": false, 00:12:20.480 "zone_append": false, 00:12:20.480 "compare": false, 00:12:20.480 "compare_and_write": false, 00:12:20.480 "abort": false, 00:12:20.480 "seek_hole": false, 00:12:20.480 "seek_data": false, 00:12:20.480 "copy": false, 00:12:20.480 "nvme_iov_md": false 00:12:20.480 }, 00:12:20.480 "memory_domains": [ 00:12:20.480 { 00:12:20.480 "dma_device_id": "system", 00:12:20.480 "dma_device_type": 1 00:12:20.480 }, 00:12:20.480 { 00:12:20.480 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:20.480 "dma_device_type": 2 00:12:20.480 }, 00:12:20.480 { 00:12:20.480 "dma_device_id": "system", 00:12:20.480 "dma_device_type": 1 00:12:20.480 }, 00:12:20.480 { 00:12:20.480 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:20.480 "dma_device_type": 2 00:12:20.480 }, 00:12:20.480 { 00:12:20.480 "dma_device_id": "system", 00:12:20.480 "dma_device_type": 1 00:12:20.480 }, 00:12:20.480 { 00:12:20.480 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:20.480 "dma_device_type": 2 00:12:20.480 } 00:12:20.480 ], 00:12:20.480 "driver_specific": { 00:12:20.480 "raid": { 00:12:20.480 "uuid": "764b1e1a-6e84-489e-80e4-bfdc26b2f73c", 00:12:20.480 "strip_size_kb": 64, 00:12:20.480 "state": "online", 00:12:20.480 "raid_level": "raid0", 00:12:20.480 "superblock": false, 00:12:20.480 "num_base_bdevs": 3, 00:12:20.480 "num_base_bdevs_discovered": 3, 00:12:20.480 "num_base_bdevs_operational": 3, 00:12:20.480 "base_bdevs_list": [ 00:12:20.480 { 00:12:20.480 "name": "BaseBdev1", 00:12:20.480 "uuid": "97a481ae-6087-4f57-8242-5b558cb9cea6", 00:12:20.480 "is_configured": true, 00:12:20.480 "data_offset": 0, 00:12:20.480 "data_size": 65536 00:12:20.480 }, 00:12:20.480 { 00:12:20.480 "name": "BaseBdev2", 00:12:20.480 "uuid": "723b3e85-0958-4453-92d0-682bf6599941", 00:12:20.480 "is_configured": true, 00:12:20.480 "data_offset": 0, 00:12:20.480 "data_size": 65536 00:12:20.480 }, 00:12:20.480 { 00:12:20.480 "name": "BaseBdev3", 00:12:20.480 "uuid": "3951de5e-c7c3-4c98-986d-806640f3e062", 00:12:20.480 "is_configured": true, 00:12:20.480 "data_offset": 0, 00:12:20.480 "data_size": 65536 00:12:20.480 } 00:12:20.480 ] 00:12:20.480 } 00:12:20.480 } 00:12:20.480 }' 00:12:20.480 17:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:20.480 17:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:20.480 BaseBdev2 00:12:20.480 BaseBdev3' 00:12:20.480 17:24:31 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:20.480 17:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:20.480 17:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:20.739 17:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:20.739 "name": "BaseBdev1", 00:12:20.739 "aliases": [ 00:12:20.739 "97a481ae-6087-4f57-8242-5b558cb9cea6" 00:12:20.739 ], 00:12:20.739 "product_name": "Malloc disk", 00:12:20.739 "block_size": 512, 00:12:20.739 "num_blocks": 65536, 00:12:20.739 "uuid": "97a481ae-6087-4f57-8242-5b558cb9cea6", 00:12:20.739 "assigned_rate_limits": { 00:12:20.739 "rw_ios_per_sec": 0, 00:12:20.739 "rw_mbytes_per_sec": 0, 00:12:20.739 "r_mbytes_per_sec": 0, 00:12:20.739 "w_mbytes_per_sec": 0 00:12:20.739 }, 00:12:20.739 "claimed": true, 00:12:20.739 "claim_type": "exclusive_write", 00:12:20.739 "zoned": false, 00:12:20.739 "supported_io_types": { 00:12:20.739 "read": true, 00:12:20.739 "write": true, 00:12:20.739 "unmap": true, 00:12:20.739 "flush": true, 00:12:20.739 "reset": true, 00:12:20.739 "nvme_admin": false, 00:12:20.739 "nvme_io": false, 00:12:20.739 "nvme_io_md": false, 00:12:20.739 "write_zeroes": true, 00:12:20.739 "zcopy": true, 00:12:20.739 "get_zone_info": false, 00:12:20.739 "zone_management": false, 00:12:20.739 "zone_append": false, 00:12:20.739 "compare": false, 00:12:20.739 "compare_and_write": false, 00:12:20.739 "abort": true, 00:12:20.739 "seek_hole": false, 00:12:20.739 "seek_data": false, 00:12:20.739 "copy": true, 00:12:20.739 "nvme_iov_md": false 00:12:20.739 }, 00:12:20.739 "memory_domains": [ 00:12:20.739 { 00:12:20.739 "dma_device_id": "system", 00:12:20.739 "dma_device_type": 1 00:12:20.739 }, 00:12:20.739 { 00:12:20.739 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:20.739 "dma_device_type": 2 00:12:20.739 } 00:12:20.739 ], 00:12:20.739 "driver_specific": {} 00:12:20.739 }' 00:12:20.739 17:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:20.739 17:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:20.739 17:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:20.739 17:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:20.739 17:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:20.739 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:20.739 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:20.998 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:20.998 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:20.998 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:20.998 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:20.998 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:20.998 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:20.998 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:20.998 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:21.259 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:21.259 "name": "BaseBdev2", 00:12:21.259 "aliases": [ 00:12:21.259 "723b3e85-0958-4453-92d0-682bf6599941" 00:12:21.259 ], 00:12:21.259 "product_name": "Malloc disk", 00:12:21.259 "block_size": 512, 00:12:21.259 "num_blocks": 65536, 00:12:21.259 "uuid": "723b3e85-0958-4453-92d0-682bf6599941", 00:12:21.259 "assigned_rate_limits": { 00:12:21.259 "rw_ios_per_sec": 0, 00:12:21.259 "rw_mbytes_per_sec": 0, 00:12:21.259 "r_mbytes_per_sec": 0, 00:12:21.259 "w_mbytes_per_sec": 0 00:12:21.259 }, 00:12:21.259 "claimed": true, 00:12:21.259 "claim_type": "exclusive_write", 00:12:21.259 "zoned": false, 00:12:21.259 "supported_io_types": { 00:12:21.259 "read": true, 00:12:21.259 "write": true, 00:12:21.259 "unmap": true, 00:12:21.259 "flush": true, 00:12:21.259 "reset": true, 00:12:21.259 "nvme_admin": false, 00:12:21.259 "nvme_io": false, 00:12:21.259 "nvme_io_md": false, 00:12:21.259 "write_zeroes": true, 00:12:21.259 "zcopy": true, 00:12:21.259 "get_zone_info": false, 00:12:21.259 "zone_management": false, 00:12:21.259 "zone_append": false, 00:12:21.259 "compare": false, 00:12:21.259 "compare_and_write": false, 00:12:21.259 "abort": true, 00:12:21.259 "seek_hole": false, 00:12:21.259 "seek_data": false, 00:12:21.259 "copy": true, 00:12:21.259 "nvme_iov_md": false 00:12:21.259 }, 00:12:21.259 "memory_domains": [ 00:12:21.259 { 00:12:21.259 "dma_device_id": "system", 00:12:21.259 "dma_device_type": 1 00:12:21.259 }, 00:12:21.259 { 00:12:21.259 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:21.259 "dma_device_type": 2 00:12:21.259 } 00:12:21.259 ], 00:12:21.259 "driver_specific": {} 00:12:21.259 }' 00:12:21.259 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:21.259 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:21.259 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:21.259 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:21.259 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:21.519 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:21.519 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:21.519 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:21.519 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:21.519 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:21.519 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:21.519 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:21.519 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:21.519 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:21.519 17:24:32 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:21.780 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:21.780 "name": "BaseBdev3", 00:12:21.780 "aliases": [ 00:12:21.780 "3951de5e-c7c3-4c98-986d-806640f3e062" 00:12:21.780 ], 00:12:21.780 "product_name": "Malloc disk", 00:12:21.780 "block_size": 512, 00:12:21.780 "num_blocks": 65536, 00:12:21.780 "uuid": "3951de5e-c7c3-4c98-986d-806640f3e062", 00:12:21.780 "assigned_rate_limits": { 00:12:21.780 "rw_ios_per_sec": 0, 00:12:21.780 "rw_mbytes_per_sec": 0, 00:12:21.780 "r_mbytes_per_sec": 0, 00:12:21.780 "w_mbytes_per_sec": 0 00:12:21.780 }, 00:12:21.780 "claimed": true, 00:12:21.780 "claim_type": "exclusive_write", 00:12:21.780 "zoned": false, 00:12:21.780 "supported_io_types": { 00:12:21.780 "read": true, 00:12:21.780 "write": true, 00:12:21.780 "unmap": true, 00:12:21.780 "flush": true, 00:12:21.780 "reset": true, 00:12:21.780 "nvme_admin": false, 00:12:21.780 "nvme_io": false, 00:12:21.780 "nvme_io_md": false, 00:12:21.780 "write_zeroes": true, 00:12:21.780 "zcopy": true, 00:12:21.780 "get_zone_info": false, 00:12:21.780 "zone_management": false, 00:12:21.780 "zone_append": false, 00:12:21.780 "compare": false, 00:12:21.780 "compare_and_write": false, 00:12:21.780 "abort": true, 00:12:21.780 "seek_hole": false, 00:12:21.780 "seek_data": false, 00:12:21.780 "copy": true, 00:12:21.780 "nvme_iov_md": false 00:12:21.780 }, 00:12:21.780 "memory_domains": [ 00:12:21.780 { 00:12:21.780 "dma_device_id": "system", 00:12:21.780 "dma_device_type": 1 00:12:21.780 }, 00:12:21.780 { 00:12:21.780 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:21.780 "dma_device_type": 2 00:12:21.780 } 00:12:21.780 ], 00:12:21.780 "driver_specific": {} 00:12:21.780 }' 00:12:21.780 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:21.780 17:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:21.780 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:21.780 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:21.780 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:22.041 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:22.041 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:22.041 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:22.041 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:22.041 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:22.041 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:22.041 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:22.041 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:22.301 [2024-07-15 17:24:33.466732] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:22.301 [2024-07-15 17:24:33.466748] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:22.301 [2024-07-15 17:24:33.466776] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 
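Illustrative aside, not part of the captured output: the step traced above (deleting BaseBdev1 and watching the raid0 volume, which has no redundancy, drop from online to offline) can be reproduced by hand with the same rpc.py client and UNIX socket the test uses. Paths and RPC names below are copied from the trace; the trailing jq field access is an assumed convenience filter on the same select() the verify helper runs.
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
# raid0 cannot tolerate losing a member, so deleting one base bdev forces the raid "offline".
"$rpc" -s "$sock" bdev_malloc_delete BaseBdev1
# Same query verify_raid_bdev_state issues, narrowed to the state field.
"$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'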
00:12:22.301 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:22.301 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:12:22.301 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:22.301 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:22.301 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:22.301 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:12:22.301 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:22.301 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:22.301 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:22.301 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:22.301 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:22.301 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:22.301 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:22.301 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:22.301 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:22.301 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:22.301 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:22.562 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:22.562 "name": "Existed_Raid", 00:12:22.562 "uuid": "764b1e1a-6e84-489e-80e4-bfdc26b2f73c", 00:12:22.562 "strip_size_kb": 64, 00:12:22.562 "state": "offline", 00:12:22.562 "raid_level": "raid0", 00:12:22.562 "superblock": false, 00:12:22.562 "num_base_bdevs": 3, 00:12:22.562 "num_base_bdevs_discovered": 2, 00:12:22.562 "num_base_bdevs_operational": 2, 00:12:22.562 "base_bdevs_list": [ 00:12:22.562 { 00:12:22.562 "name": null, 00:12:22.562 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:22.562 "is_configured": false, 00:12:22.562 "data_offset": 0, 00:12:22.562 "data_size": 65536 00:12:22.562 }, 00:12:22.562 { 00:12:22.562 "name": "BaseBdev2", 00:12:22.562 "uuid": "723b3e85-0958-4453-92d0-682bf6599941", 00:12:22.562 "is_configured": true, 00:12:22.562 "data_offset": 0, 00:12:22.562 "data_size": 65536 00:12:22.562 }, 00:12:22.562 { 00:12:22.562 "name": "BaseBdev3", 00:12:22.562 "uuid": "3951de5e-c7c3-4c98-986d-806640f3e062", 00:12:22.562 "is_configured": true, 00:12:22.562 "data_offset": 0, 00:12:22.562 "data_size": 65536 00:12:22.562 } 00:12:22.562 ] 00:12:22.562 }' 00:12:22.562 17:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:22.562 17:24:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:23.133 17:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:23.133 17:24:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:23.133 17:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:23.133 17:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:23.133 17:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:23.133 17:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:23.133 17:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:23.393 [2024-07-15 17:24:34.585575] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:23.393 17:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:23.393 17:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:23.393 17:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:23.393 17:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:23.653 17:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:23.653 17:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:23.653 17:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:12:23.913 [2024-07-15 17:24:34.960332] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:23.913 [2024-07-15 17:24:34.960358] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb3be90 name Existed_Raid, state offline 00:12:23.913 17:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:23.913 17:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:23.913 17:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:23.913 17:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:23.913 17:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:23.913 17:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:23.913 17:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:12:23.913 17:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:12:23.913 17:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:23.913 17:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:24.173 BaseBdev2 00:12:24.173 17:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 
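Illustrative aside, not part of the captured output: at this point the test recreates its member disks as plain malloc bdevs before building a new raid from them. A minimal sketch of that step, using only RPCs visible in the trace; 32 MiB at a 512-byte block size yields the 65536 blocks the bdev dumps above report.
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
"$rpc" -s "$sock" bdev_malloc_create 32 512 -b BaseBdev2
# waitforbdev: flush examine callbacks, then poll for the bdev with a 2000 ms timeout.
"$rpc" -s "$sock" bdev_wait_for_examine
"$rpc" -s "$sock" bdev_get_bdevs -b BaseBdev2 -t 2000 >/dev/null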
00:12:24.173 17:24:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:24.173 17:24:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:24.173 17:24:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:24.173 17:24:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:24.173 17:24:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:24.173 17:24:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:24.434 17:24:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:24.697 [ 00:12:24.697 { 00:12:24.697 "name": "BaseBdev2", 00:12:24.697 "aliases": [ 00:12:24.697 "e60e78a4-55d3-4aa7-8695-e52dc0449b61" 00:12:24.697 ], 00:12:24.697 "product_name": "Malloc disk", 00:12:24.697 "block_size": 512, 00:12:24.697 "num_blocks": 65536, 00:12:24.697 "uuid": "e60e78a4-55d3-4aa7-8695-e52dc0449b61", 00:12:24.697 "assigned_rate_limits": { 00:12:24.697 "rw_ios_per_sec": 0, 00:12:24.697 "rw_mbytes_per_sec": 0, 00:12:24.697 "r_mbytes_per_sec": 0, 00:12:24.697 "w_mbytes_per_sec": 0 00:12:24.697 }, 00:12:24.697 "claimed": false, 00:12:24.697 "zoned": false, 00:12:24.697 "supported_io_types": { 00:12:24.697 "read": true, 00:12:24.697 "write": true, 00:12:24.697 "unmap": true, 00:12:24.697 "flush": true, 00:12:24.697 "reset": true, 00:12:24.697 "nvme_admin": false, 00:12:24.697 "nvme_io": false, 00:12:24.697 "nvme_io_md": false, 00:12:24.697 "write_zeroes": true, 00:12:24.697 "zcopy": true, 00:12:24.697 "get_zone_info": false, 00:12:24.697 "zone_management": false, 00:12:24.697 "zone_append": false, 00:12:24.697 "compare": false, 00:12:24.697 "compare_and_write": false, 00:12:24.697 "abort": true, 00:12:24.697 "seek_hole": false, 00:12:24.697 "seek_data": false, 00:12:24.697 "copy": true, 00:12:24.697 "nvme_iov_md": false 00:12:24.697 }, 00:12:24.697 "memory_domains": [ 00:12:24.697 { 00:12:24.697 "dma_device_id": "system", 00:12:24.697 "dma_device_type": 1 00:12:24.697 }, 00:12:24.697 { 00:12:24.697 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:24.697 "dma_device_type": 2 00:12:24.697 } 00:12:24.697 ], 00:12:24.697 "driver_specific": {} 00:12:24.697 } 00:12:24.697 ] 00:12:24.697 17:24:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:24.697 17:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:24.697 17:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:24.697 17:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:24.697 BaseBdev3 00:12:24.697 17:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:12:24.697 17:24:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:12:24.697 17:24:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:24.697 17:24:35 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:24.697 17:24:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:24.697 17:24:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:24.697 17:24:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:24.958 17:24:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:25.263 [ 00:12:25.263 { 00:12:25.263 "name": "BaseBdev3", 00:12:25.263 "aliases": [ 00:12:25.263 "f786c743-01ac-45d5-9875-0eefb9222438" 00:12:25.263 ], 00:12:25.263 "product_name": "Malloc disk", 00:12:25.263 "block_size": 512, 00:12:25.263 "num_blocks": 65536, 00:12:25.263 "uuid": "f786c743-01ac-45d5-9875-0eefb9222438", 00:12:25.263 "assigned_rate_limits": { 00:12:25.263 "rw_ios_per_sec": 0, 00:12:25.263 "rw_mbytes_per_sec": 0, 00:12:25.263 "r_mbytes_per_sec": 0, 00:12:25.263 "w_mbytes_per_sec": 0 00:12:25.263 }, 00:12:25.263 "claimed": false, 00:12:25.263 "zoned": false, 00:12:25.263 "supported_io_types": { 00:12:25.263 "read": true, 00:12:25.263 "write": true, 00:12:25.263 "unmap": true, 00:12:25.263 "flush": true, 00:12:25.263 "reset": true, 00:12:25.263 "nvme_admin": false, 00:12:25.263 "nvme_io": false, 00:12:25.263 "nvme_io_md": false, 00:12:25.263 "write_zeroes": true, 00:12:25.263 "zcopy": true, 00:12:25.263 "get_zone_info": false, 00:12:25.263 "zone_management": false, 00:12:25.263 "zone_append": false, 00:12:25.263 "compare": false, 00:12:25.263 "compare_and_write": false, 00:12:25.263 "abort": true, 00:12:25.263 "seek_hole": false, 00:12:25.263 "seek_data": false, 00:12:25.263 "copy": true, 00:12:25.263 "nvme_iov_md": false 00:12:25.263 }, 00:12:25.263 "memory_domains": [ 00:12:25.263 { 00:12:25.263 "dma_device_id": "system", 00:12:25.263 "dma_device_type": 1 00:12:25.263 }, 00:12:25.263 { 00:12:25.263 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:25.263 "dma_device_type": 2 00:12:25.263 } 00:12:25.263 ], 00:12:25.263 "driver_specific": {} 00:12:25.263 } 00:12:25.263 ] 00:12:25.263 17:24:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:25.263 17:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:25.263 17:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:25.263 17:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:25.263 [2024-07-15 17:24:36.495921] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:25.263 [2024-07-15 17:24:36.495949] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:25.263 [2024-07-15 17:24:36.495961] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:25.263 [2024-07-15 17:24:36.496989] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:25.263 17:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid 
configuring raid0 64 3 00:12:25.263 17:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:25.263 17:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:25.263 17:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:25.263 17:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:25.263 17:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:25.263 17:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:25.263 17:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:25.263 17:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:25.263 17:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:25.263 17:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:25.264 17:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:25.524 17:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:25.524 "name": "Existed_Raid", 00:12:25.524 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:25.524 "strip_size_kb": 64, 00:12:25.524 "state": "configuring", 00:12:25.524 "raid_level": "raid0", 00:12:25.524 "superblock": false, 00:12:25.524 "num_base_bdevs": 3, 00:12:25.524 "num_base_bdevs_discovered": 2, 00:12:25.524 "num_base_bdevs_operational": 3, 00:12:25.524 "base_bdevs_list": [ 00:12:25.524 { 00:12:25.524 "name": "BaseBdev1", 00:12:25.524 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:25.524 "is_configured": false, 00:12:25.524 "data_offset": 0, 00:12:25.524 "data_size": 0 00:12:25.524 }, 00:12:25.524 { 00:12:25.524 "name": "BaseBdev2", 00:12:25.524 "uuid": "e60e78a4-55d3-4aa7-8695-e52dc0449b61", 00:12:25.524 "is_configured": true, 00:12:25.524 "data_offset": 0, 00:12:25.524 "data_size": 65536 00:12:25.524 }, 00:12:25.524 { 00:12:25.524 "name": "BaseBdev3", 00:12:25.524 "uuid": "f786c743-01ac-45d5-9875-0eefb9222438", 00:12:25.524 "is_configured": true, 00:12:25.524 "data_offset": 0, 00:12:25.524 "data_size": 65536 00:12:25.524 } 00:12:25.524 ] 00:12:25.524 }' 00:12:25.524 17:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:25.524 17:24:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:26.095 17:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:12:26.356 [2024-07-15 17:24:37.398189] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:26.356 17:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:26.356 17:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:26.356 17:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:26.356 17:24:37 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:26.356 17:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:26.356 17:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:26.356 17:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:26.356 17:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:26.356 17:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:26.356 17:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:26.356 17:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:26.356 17:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:26.356 17:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:26.356 "name": "Existed_Raid", 00:12:26.356 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:26.356 "strip_size_kb": 64, 00:12:26.356 "state": "configuring", 00:12:26.356 "raid_level": "raid0", 00:12:26.356 "superblock": false, 00:12:26.356 "num_base_bdevs": 3, 00:12:26.356 "num_base_bdevs_discovered": 1, 00:12:26.356 "num_base_bdevs_operational": 3, 00:12:26.356 "base_bdevs_list": [ 00:12:26.356 { 00:12:26.356 "name": "BaseBdev1", 00:12:26.356 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:26.356 "is_configured": false, 00:12:26.356 "data_offset": 0, 00:12:26.356 "data_size": 0 00:12:26.356 }, 00:12:26.356 { 00:12:26.356 "name": null, 00:12:26.356 "uuid": "e60e78a4-55d3-4aa7-8695-e52dc0449b61", 00:12:26.356 "is_configured": false, 00:12:26.356 "data_offset": 0, 00:12:26.356 "data_size": 65536 00:12:26.356 }, 00:12:26.356 { 00:12:26.356 "name": "BaseBdev3", 00:12:26.356 "uuid": "f786c743-01ac-45d5-9875-0eefb9222438", 00:12:26.356 "is_configured": true, 00:12:26.356 "data_offset": 0, 00:12:26.356 "data_size": 65536 00:12:26.356 } 00:12:26.356 ] 00:12:26.356 }' 00:12:26.356 17:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:26.356 17:24:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:26.927 17:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:26.927 17:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:27.187 17:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:12:27.187 17:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:27.187 [2024-07-15 17:24:38.437651] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:27.187 BaseBdev1 00:12:27.187 17:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:12:27.187 17:24:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:27.187 17:24:38 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:27.187 17:24:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:27.187 17:24:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:27.187 17:24:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:27.187 17:24:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:27.448 17:24:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:27.709 [ 00:12:27.709 { 00:12:27.709 "name": "BaseBdev1", 00:12:27.709 "aliases": [ 00:12:27.709 "41c60992-ef77-4612-9193-fff02696de44" 00:12:27.709 ], 00:12:27.709 "product_name": "Malloc disk", 00:12:27.709 "block_size": 512, 00:12:27.709 "num_blocks": 65536, 00:12:27.709 "uuid": "41c60992-ef77-4612-9193-fff02696de44", 00:12:27.709 "assigned_rate_limits": { 00:12:27.709 "rw_ios_per_sec": 0, 00:12:27.709 "rw_mbytes_per_sec": 0, 00:12:27.709 "r_mbytes_per_sec": 0, 00:12:27.709 "w_mbytes_per_sec": 0 00:12:27.709 }, 00:12:27.709 "claimed": true, 00:12:27.709 "claim_type": "exclusive_write", 00:12:27.709 "zoned": false, 00:12:27.709 "supported_io_types": { 00:12:27.709 "read": true, 00:12:27.709 "write": true, 00:12:27.709 "unmap": true, 00:12:27.709 "flush": true, 00:12:27.709 "reset": true, 00:12:27.709 "nvme_admin": false, 00:12:27.709 "nvme_io": false, 00:12:27.709 "nvme_io_md": false, 00:12:27.709 "write_zeroes": true, 00:12:27.709 "zcopy": true, 00:12:27.709 "get_zone_info": false, 00:12:27.709 "zone_management": false, 00:12:27.709 "zone_append": false, 00:12:27.709 "compare": false, 00:12:27.709 "compare_and_write": false, 00:12:27.709 "abort": true, 00:12:27.709 "seek_hole": false, 00:12:27.709 "seek_data": false, 00:12:27.709 "copy": true, 00:12:27.709 "nvme_iov_md": false 00:12:27.709 }, 00:12:27.709 "memory_domains": [ 00:12:27.709 { 00:12:27.709 "dma_device_id": "system", 00:12:27.709 "dma_device_type": 1 00:12:27.709 }, 00:12:27.709 { 00:12:27.709 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:27.709 "dma_device_type": 2 00:12:27.709 } 00:12:27.709 ], 00:12:27.709 "driver_specific": {} 00:12:27.709 } 00:12:27.709 ] 00:12:27.709 17:24:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:27.709 17:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:27.709 17:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:27.709 17:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:27.709 17:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:27.709 17:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:27.709 17:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:27.709 17:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:27.709 17:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:27.709 17:24:38 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:27.709 17:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:27.709 17:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:27.710 17:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:27.970 17:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:27.970 "name": "Existed_Raid", 00:12:27.970 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:27.970 "strip_size_kb": 64, 00:12:27.970 "state": "configuring", 00:12:27.970 "raid_level": "raid0", 00:12:27.970 "superblock": false, 00:12:27.970 "num_base_bdevs": 3, 00:12:27.970 "num_base_bdevs_discovered": 2, 00:12:27.970 "num_base_bdevs_operational": 3, 00:12:27.970 "base_bdevs_list": [ 00:12:27.970 { 00:12:27.970 "name": "BaseBdev1", 00:12:27.970 "uuid": "41c60992-ef77-4612-9193-fff02696de44", 00:12:27.970 "is_configured": true, 00:12:27.970 "data_offset": 0, 00:12:27.970 "data_size": 65536 00:12:27.970 }, 00:12:27.970 { 00:12:27.970 "name": null, 00:12:27.970 "uuid": "e60e78a4-55d3-4aa7-8695-e52dc0449b61", 00:12:27.970 "is_configured": false, 00:12:27.970 "data_offset": 0, 00:12:27.970 "data_size": 65536 00:12:27.970 }, 00:12:27.970 { 00:12:27.970 "name": "BaseBdev3", 00:12:27.970 "uuid": "f786c743-01ac-45d5-9875-0eefb9222438", 00:12:27.970 "is_configured": true, 00:12:27.970 "data_offset": 0, 00:12:27.970 "data_size": 65536 00:12:27.970 } 00:12:27.970 ] 00:12:27.970 }' 00:12:27.970 17:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:27.970 17:24:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:28.541 17:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:28.541 17:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:28.541 17:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:12:28.541 17:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:12:28.801 [2024-07-15 17:24:39.905376] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:28.801 17:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:28.801 17:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:28.801 17:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:28.801 17:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:28.801 17:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:28.801 17:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:28.801 17:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:28.801 17:24:39 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:28.801 17:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:28.801 17:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:28.801 17:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:28.801 17:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:29.061 17:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:29.061 "name": "Existed_Raid", 00:12:29.061 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:29.061 "strip_size_kb": 64, 00:12:29.061 "state": "configuring", 00:12:29.061 "raid_level": "raid0", 00:12:29.061 "superblock": false, 00:12:29.061 "num_base_bdevs": 3, 00:12:29.061 "num_base_bdevs_discovered": 1, 00:12:29.061 "num_base_bdevs_operational": 3, 00:12:29.061 "base_bdevs_list": [ 00:12:29.061 { 00:12:29.061 "name": "BaseBdev1", 00:12:29.061 "uuid": "41c60992-ef77-4612-9193-fff02696de44", 00:12:29.061 "is_configured": true, 00:12:29.061 "data_offset": 0, 00:12:29.061 "data_size": 65536 00:12:29.061 }, 00:12:29.061 { 00:12:29.061 "name": null, 00:12:29.061 "uuid": "e60e78a4-55d3-4aa7-8695-e52dc0449b61", 00:12:29.061 "is_configured": false, 00:12:29.061 "data_offset": 0, 00:12:29.062 "data_size": 65536 00:12:29.062 }, 00:12:29.062 { 00:12:29.062 "name": null, 00:12:29.062 "uuid": "f786c743-01ac-45d5-9875-0eefb9222438", 00:12:29.062 "is_configured": false, 00:12:29.062 "data_offset": 0, 00:12:29.062 "data_size": 65536 00:12:29.062 } 00:12:29.062 ] 00:12:29.062 }' 00:12:29.062 17:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:29.062 17:24:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:29.632 17:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:29.632 17:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:29.632 17:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:12:29.632 17:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:12:29.892 [2024-07-15 17:24:40.996152] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:29.892 17:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:29.892 17:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:29.892 17:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:29.892 17:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:29.892 17:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:29.892 17:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:12:29.892 17:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:29.892 17:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:29.892 17:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:29.892 17:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:29.892 17:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:29.892 17:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:29.892 17:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:29.892 "name": "Existed_Raid", 00:12:29.892 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:29.892 "strip_size_kb": 64, 00:12:29.892 "state": "configuring", 00:12:29.892 "raid_level": "raid0", 00:12:29.892 "superblock": false, 00:12:29.892 "num_base_bdevs": 3, 00:12:29.892 "num_base_bdevs_discovered": 2, 00:12:29.892 "num_base_bdevs_operational": 3, 00:12:29.892 "base_bdevs_list": [ 00:12:29.892 { 00:12:29.892 "name": "BaseBdev1", 00:12:29.892 "uuid": "41c60992-ef77-4612-9193-fff02696de44", 00:12:29.892 "is_configured": true, 00:12:29.892 "data_offset": 0, 00:12:29.892 "data_size": 65536 00:12:29.892 }, 00:12:29.892 { 00:12:29.892 "name": null, 00:12:29.892 "uuid": "e60e78a4-55d3-4aa7-8695-e52dc0449b61", 00:12:29.892 "is_configured": false, 00:12:29.892 "data_offset": 0, 00:12:29.892 "data_size": 65536 00:12:29.892 }, 00:12:29.892 { 00:12:29.892 "name": "BaseBdev3", 00:12:29.892 "uuid": "f786c743-01ac-45d5-9875-0eefb9222438", 00:12:29.892 "is_configured": true, 00:12:29.892 "data_offset": 0, 00:12:29.892 "data_size": 65536 00:12:29.892 } 00:12:29.892 ] 00:12:29.892 }' 00:12:29.892 17:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:29.892 17:24:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:30.461 17:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:30.461 17:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:30.722 17:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:12:30.722 17:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:30.982 [2024-07-15 17:24:42.094943] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:30.982 17:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:30.982 17:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:30.982 17:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:30.982 17:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:30.982 17:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:30.982 17:24:42 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:30.982 17:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:30.982 17:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:30.982 17:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:30.982 17:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:30.982 17:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:30.982 17:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:31.243 17:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:31.243 "name": "Existed_Raid", 00:12:31.243 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:31.243 "strip_size_kb": 64, 00:12:31.243 "state": "configuring", 00:12:31.243 "raid_level": "raid0", 00:12:31.243 "superblock": false, 00:12:31.243 "num_base_bdevs": 3, 00:12:31.243 "num_base_bdevs_discovered": 1, 00:12:31.243 "num_base_bdevs_operational": 3, 00:12:31.243 "base_bdevs_list": [ 00:12:31.243 { 00:12:31.243 "name": null, 00:12:31.243 "uuid": "41c60992-ef77-4612-9193-fff02696de44", 00:12:31.243 "is_configured": false, 00:12:31.243 "data_offset": 0, 00:12:31.243 "data_size": 65536 00:12:31.243 }, 00:12:31.243 { 00:12:31.243 "name": null, 00:12:31.243 "uuid": "e60e78a4-55d3-4aa7-8695-e52dc0449b61", 00:12:31.243 "is_configured": false, 00:12:31.243 "data_offset": 0, 00:12:31.243 "data_size": 65536 00:12:31.243 }, 00:12:31.243 { 00:12:31.243 "name": "BaseBdev3", 00:12:31.243 "uuid": "f786c743-01ac-45d5-9875-0eefb9222438", 00:12:31.243 "is_configured": true, 00:12:31.243 "data_offset": 0, 00:12:31.243 "data_size": 65536 00:12:31.243 } 00:12:31.243 ] 00:12:31.243 }' 00:12:31.243 17:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:31.243 17:24:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:31.814 17:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:31.814 17:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:31.814 17:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:12:31.814 17:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:12:32.077 [2024-07-15 17:24:43.203339] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:32.077 17:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:32.077 17:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:32.077 17:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:32.077 17:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
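The hot-removal check traced above (bdev_raid.sh@325 through @327) comes down to two RPC calls; a hedged sketch using only commands that appear in this trace:

    RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    SOCK=/var/tmp/spdk-raid.sock
    # Delete the malloc bdev backing slot 0; the raid drops back to "configuring".
    $RPC -s $SOCK bdev_malloc_delete BaseBdev1
    # Slot 0 should now report is_configured == false, matching the jq check above.
    $RPC -s $SOCK bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[0].is_configured'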
00:12:32.077 17:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:32.077 17:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:32.077 17:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:32.077 17:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:32.077 17:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:32.077 17:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:32.077 17:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:32.077 17:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:32.339 17:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:32.339 "name": "Existed_Raid", 00:12:32.339 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:32.339 "strip_size_kb": 64, 00:12:32.339 "state": "configuring", 00:12:32.339 "raid_level": "raid0", 00:12:32.339 "superblock": false, 00:12:32.339 "num_base_bdevs": 3, 00:12:32.339 "num_base_bdevs_discovered": 2, 00:12:32.339 "num_base_bdevs_operational": 3, 00:12:32.339 "base_bdevs_list": [ 00:12:32.339 { 00:12:32.339 "name": null, 00:12:32.339 "uuid": "41c60992-ef77-4612-9193-fff02696de44", 00:12:32.339 "is_configured": false, 00:12:32.339 "data_offset": 0, 00:12:32.339 "data_size": 65536 00:12:32.339 }, 00:12:32.339 { 00:12:32.339 "name": "BaseBdev2", 00:12:32.339 "uuid": "e60e78a4-55d3-4aa7-8695-e52dc0449b61", 00:12:32.339 "is_configured": true, 00:12:32.339 "data_offset": 0, 00:12:32.339 "data_size": 65536 00:12:32.339 }, 00:12:32.339 { 00:12:32.339 "name": "BaseBdev3", 00:12:32.339 "uuid": "f786c743-01ac-45d5-9875-0eefb9222438", 00:12:32.339 "is_configured": true, 00:12:32.339 "data_offset": 0, 00:12:32.339 "data_size": 65536 00:12:32.339 } 00:12:32.339 ] 00:12:32.339 }' 00:12:32.339 17:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:32.339 17:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:32.911 17:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:32.911 17:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:32.911 17:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:12:32.911 17:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:32.911 17:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:12:33.172 17:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 41c60992-ef77-4612-9193-fff02696de44 00:12:33.432 [2024-07-15 17:24:44.471377] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 
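The restore step traced above shows the pattern the test uses to bring the missing slot back: read the UUID the raid still remembers for slot 0, then create a fresh malloc bdev under a new name but with that same UUID, which raid_bdev_configure_base_bdev then claims automatically (the *DEBUG* line above). A minimal sketch, commands copied from the trace:

    RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    SOCK=/var/tmp/spdk-raid.sock
    # Recover the stale UUID from slot 0 (bdev_raid.sh@333 above).
    uuid=$($RPC -s $SOCK bdev_raid_get_bdevs all | jq -r '.[0].base_bdevs_list[0].uuid')
    # 32 MB malloc bdev with 512-byte blocks (65536 blocks, as the dump below shows),
    # reusing the old UUID so the raid module claims it as NewBaseBdev.
    $RPC -s $SOCK bdev_malloc_create 32 512 -b NewBaseBdev -u "$uuid"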
00:12:33.432 [2024-07-15 17:24:44.471401] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb3c780 00:12:33.432 [2024-07-15 17:24:44.471405] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:12:33.432 [2024-07-15 17:24:44.471548] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcee190 00:12:33.432 [2024-07-15 17:24:44.471635] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb3c780 00:12:33.432 [2024-07-15 17:24:44.471640] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xb3c780 00:12:33.432 [2024-07-15 17:24:44.471762] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:33.432 NewBaseBdev 00:12:33.432 17:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:12:33.432 17:24:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:12:33.432 17:24:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:33.432 17:24:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:33.432 17:24:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:33.432 17:24:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:33.432 17:24:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:33.432 17:24:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:12:33.693 [ 00:12:33.693 { 00:12:33.693 "name": "NewBaseBdev", 00:12:33.693 "aliases": [ 00:12:33.693 "41c60992-ef77-4612-9193-fff02696de44" 00:12:33.693 ], 00:12:33.693 "product_name": "Malloc disk", 00:12:33.693 "block_size": 512, 00:12:33.693 "num_blocks": 65536, 00:12:33.693 "uuid": "41c60992-ef77-4612-9193-fff02696de44", 00:12:33.693 "assigned_rate_limits": { 00:12:33.693 "rw_ios_per_sec": 0, 00:12:33.693 "rw_mbytes_per_sec": 0, 00:12:33.693 "r_mbytes_per_sec": 0, 00:12:33.693 "w_mbytes_per_sec": 0 00:12:33.693 }, 00:12:33.693 "claimed": true, 00:12:33.693 "claim_type": "exclusive_write", 00:12:33.693 "zoned": false, 00:12:33.693 "supported_io_types": { 00:12:33.693 "read": true, 00:12:33.693 "write": true, 00:12:33.693 "unmap": true, 00:12:33.693 "flush": true, 00:12:33.693 "reset": true, 00:12:33.693 "nvme_admin": false, 00:12:33.693 "nvme_io": false, 00:12:33.693 "nvme_io_md": false, 00:12:33.693 "write_zeroes": true, 00:12:33.693 "zcopy": true, 00:12:33.693 "get_zone_info": false, 00:12:33.693 "zone_management": false, 00:12:33.693 "zone_append": false, 00:12:33.693 "compare": false, 00:12:33.693 "compare_and_write": false, 00:12:33.693 "abort": true, 00:12:33.693 "seek_hole": false, 00:12:33.693 "seek_data": false, 00:12:33.693 "copy": true, 00:12:33.693 "nvme_iov_md": false 00:12:33.693 }, 00:12:33.693 "memory_domains": [ 00:12:33.693 { 00:12:33.693 "dma_device_id": "system", 00:12:33.693 "dma_device_type": 1 00:12:33.693 }, 00:12:33.693 { 00:12:33.693 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:33.693 "dma_device_type": 2 00:12:33.693 } 00:12:33.693 ], 00:12:33.693 "driver_specific": {} 00:12:33.693 } 00:12:33.693 ] 
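The waitforbdev helper traced above is just two RPCs: drain outstanding examine callbacks, then poll for the named bdev with a timeout (2000 ms is the helper's default, per the bdev_timeout assignment above). Sketch:

    RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    SOCK=/var/tmp/spdk-raid.sock
    # Block until every registered examine callback has completed.
    $RPC -s $SOCK bdev_wait_for_examine
    # Then wait up to 2000 ms for NewBaseBdev to be reported.
    $RPC -s $SOCK bdev_get_bdevs -b NewBaseBdev -t 2000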
00:12:33.693 17:24:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:33.693 17:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:12:33.693 17:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:33.693 17:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:33.693 17:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:33.693 17:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:33.693 17:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:33.693 17:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:33.693 17:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:33.693 17:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:33.693 17:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:33.693 17:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:33.693 17:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:33.953 17:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:33.953 "name": "Existed_Raid", 00:12:33.953 "uuid": "10ee7c12-ebdf-44f9-ae72-ea9a3f004d64", 00:12:33.953 "strip_size_kb": 64, 00:12:33.953 "state": "online", 00:12:33.953 "raid_level": "raid0", 00:12:33.953 "superblock": false, 00:12:33.953 "num_base_bdevs": 3, 00:12:33.953 "num_base_bdevs_discovered": 3, 00:12:33.953 "num_base_bdevs_operational": 3, 00:12:33.953 "base_bdevs_list": [ 00:12:33.953 { 00:12:33.953 "name": "NewBaseBdev", 00:12:33.953 "uuid": "41c60992-ef77-4612-9193-fff02696de44", 00:12:33.953 "is_configured": true, 00:12:33.953 "data_offset": 0, 00:12:33.953 "data_size": 65536 00:12:33.953 }, 00:12:33.953 { 00:12:33.953 "name": "BaseBdev2", 00:12:33.953 "uuid": "e60e78a4-55d3-4aa7-8695-e52dc0449b61", 00:12:33.953 "is_configured": true, 00:12:33.953 "data_offset": 0, 00:12:33.953 "data_size": 65536 00:12:33.953 }, 00:12:33.953 { 00:12:33.953 "name": "BaseBdev3", 00:12:33.953 "uuid": "f786c743-01ac-45d5-9875-0eefb9222438", 00:12:33.953 "is_configured": true, 00:12:33.953 "data_offset": 0, 00:12:33.953 "data_size": 65536 00:12:33.954 } 00:12:33.954 ] 00:12:33.954 }' 00:12:33.954 17:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:33.954 17:24:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:34.524 17:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:12:34.524 17:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:34.524 17:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:34.524 17:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:34.524 17:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:12:34.524 17:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:34.524 17:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:34.524 17:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:34.524 [2024-07-15 17:24:45.738821] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:34.524 17:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:34.524 "name": "Existed_Raid", 00:12:34.524 "aliases": [ 00:12:34.524 "10ee7c12-ebdf-44f9-ae72-ea9a3f004d64" 00:12:34.524 ], 00:12:34.524 "product_name": "Raid Volume", 00:12:34.524 "block_size": 512, 00:12:34.524 "num_blocks": 196608, 00:12:34.524 "uuid": "10ee7c12-ebdf-44f9-ae72-ea9a3f004d64", 00:12:34.524 "assigned_rate_limits": { 00:12:34.524 "rw_ios_per_sec": 0, 00:12:34.524 "rw_mbytes_per_sec": 0, 00:12:34.524 "r_mbytes_per_sec": 0, 00:12:34.524 "w_mbytes_per_sec": 0 00:12:34.524 }, 00:12:34.524 "claimed": false, 00:12:34.524 "zoned": false, 00:12:34.524 "supported_io_types": { 00:12:34.524 "read": true, 00:12:34.524 "write": true, 00:12:34.524 "unmap": true, 00:12:34.524 "flush": true, 00:12:34.524 "reset": true, 00:12:34.524 "nvme_admin": false, 00:12:34.524 "nvme_io": false, 00:12:34.524 "nvme_io_md": false, 00:12:34.524 "write_zeroes": true, 00:12:34.524 "zcopy": false, 00:12:34.524 "get_zone_info": false, 00:12:34.524 "zone_management": false, 00:12:34.524 "zone_append": false, 00:12:34.524 "compare": false, 00:12:34.524 "compare_and_write": false, 00:12:34.524 "abort": false, 00:12:34.524 "seek_hole": false, 00:12:34.524 "seek_data": false, 00:12:34.524 "copy": false, 00:12:34.524 "nvme_iov_md": false 00:12:34.524 }, 00:12:34.524 "memory_domains": [ 00:12:34.524 { 00:12:34.524 "dma_device_id": "system", 00:12:34.524 "dma_device_type": 1 00:12:34.524 }, 00:12:34.524 { 00:12:34.524 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:34.524 "dma_device_type": 2 00:12:34.524 }, 00:12:34.524 { 00:12:34.524 "dma_device_id": "system", 00:12:34.524 "dma_device_type": 1 00:12:34.524 }, 00:12:34.524 { 00:12:34.524 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:34.524 "dma_device_type": 2 00:12:34.524 }, 00:12:34.524 { 00:12:34.524 "dma_device_id": "system", 00:12:34.524 "dma_device_type": 1 00:12:34.524 }, 00:12:34.524 { 00:12:34.524 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:34.524 "dma_device_type": 2 00:12:34.524 } 00:12:34.524 ], 00:12:34.524 "driver_specific": { 00:12:34.524 "raid": { 00:12:34.524 "uuid": "10ee7c12-ebdf-44f9-ae72-ea9a3f004d64", 00:12:34.524 "strip_size_kb": 64, 00:12:34.524 "state": "online", 00:12:34.524 "raid_level": "raid0", 00:12:34.524 "superblock": false, 00:12:34.524 "num_base_bdevs": 3, 00:12:34.524 "num_base_bdevs_discovered": 3, 00:12:34.524 "num_base_bdevs_operational": 3, 00:12:34.524 "base_bdevs_list": [ 00:12:34.524 { 00:12:34.524 "name": "NewBaseBdev", 00:12:34.524 "uuid": "41c60992-ef77-4612-9193-fff02696de44", 00:12:34.524 "is_configured": true, 00:12:34.524 "data_offset": 0, 00:12:34.524 "data_size": 65536 00:12:34.524 }, 00:12:34.524 { 00:12:34.524 "name": "BaseBdev2", 00:12:34.524 "uuid": "e60e78a4-55d3-4aa7-8695-e52dc0449b61", 00:12:34.524 "is_configured": true, 00:12:34.524 "data_offset": 0, 00:12:34.524 "data_size": 65536 00:12:34.524 }, 00:12:34.524 { 00:12:34.524 "name": "BaseBdev3", 00:12:34.524 
"uuid": "f786c743-01ac-45d5-9875-0eefb9222438", 00:12:34.524 "is_configured": true, 00:12:34.524 "data_offset": 0, 00:12:34.524 "data_size": 65536 00:12:34.524 } 00:12:34.524 ] 00:12:34.524 } 00:12:34.524 } 00:12:34.524 }' 00:12:34.524 17:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:34.524 17:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:12:34.524 BaseBdev2 00:12:34.524 BaseBdev3' 00:12:34.524 17:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:34.524 17:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:12:34.524 17:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:34.784 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:34.784 "name": "NewBaseBdev", 00:12:34.784 "aliases": [ 00:12:34.784 "41c60992-ef77-4612-9193-fff02696de44" 00:12:34.784 ], 00:12:34.784 "product_name": "Malloc disk", 00:12:34.784 "block_size": 512, 00:12:34.784 "num_blocks": 65536, 00:12:34.784 "uuid": "41c60992-ef77-4612-9193-fff02696de44", 00:12:34.784 "assigned_rate_limits": { 00:12:34.784 "rw_ios_per_sec": 0, 00:12:34.784 "rw_mbytes_per_sec": 0, 00:12:34.784 "r_mbytes_per_sec": 0, 00:12:34.784 "w_mbytes_per_sec": 0 00:12:34.784 }, 00:12:34.784 "claimed": true, 00:12:34.784 "claim_type": "exclusive_write", 00:12:34.784 "zoned": false, 00:12:34.784 "supported_io_types": { 00:12:34.784 "read": true, 00:12:34.784 "write": true, 00:12:34.784 "unmap": true, 00:12:34.784 "flush": true, 00:12:34.784 "reset": true, 00:12:34.784 "nvme_admin": false, 00:12:34.784 "nvme_io": false, 00:12:34.784 "nvme_io_md": false, 00:12:34.784 "write_zeroes": true, 00:12:34.784 "zcopy": true, 00:12:34.784 "get_zone_info": false, 00:12:34.784 "zone_management": false, 00:12:34.784 "zone_append": false, 00:12:34.784 "compare": false, 00:12:34.784 "compare_and_write": false, 00:12:34.784 "abort": true, 00:12:34.784 "seek_hole": false, 00:12:34.784 "seek_data": false, 00:12:34.784 "copy": true, 00:12:34.784 "nvme_iov_md": false 00:12:34.784 }, 00:12:34.784 "memory_domains": [ 00:12:34.784 { 00:12:34.784 "dma_device_id": "system", 00:12:34.784 "dma_device_type": 1 00:12:34.784 }, 00:12:34.784 { 00:12:34.784 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:34.784 "dma_device_type": 2 00:12:34.784 } 00:12:34.784 ], 00:12:34.784 "driver_specific": {} 00:12:34.784 }' 00:12:34.784 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:34.784 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:35.044 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:35.044 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:35.044 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:35.044 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:35.044 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:35.044 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:35.044 17:24:46 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:35.044 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:35.044 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:35.305 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:35.305 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:35.305 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:35.305 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:35.305 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:35.305 "name": "BaseBdev2", 00:12:35.305 "aliases": [ 00:12:35.305 "e60e78a4-55d3-4aa7-8695-e52dc0449b61" 00:12:35.305 ], 00:12:35.305 "product_name": "Malloc disk", 00:12:35.305 "block_size": 512, 00:12:35.305 "num_blocks": 65536, 00:12:35.305 "uuid": "e60e78a4-55d3-4aa7-8695-e52dc0449b61", 00:12:35.305 "assigned_rate_limits": { 00:12:35.305 "rw_ios_per_sec": 0, 00:12:35.305 "rw_mbytes_per_sec": 0, 00:12:35.305 "r_mbytes_per_sec": 0, 00:12:35.305 "w_mbytes_per_sec": 0 00:12:35.305 }, 00:12:35.305 "claimed": true, 00:12:35.305 "claim_type": "exclusive_write", 00:12:35.305 "zoned": false, 00:12:35.305 "supported_io_types": { 00:12:35.305 "read": true, 00:12:35.305 "write": true, 00:12:35.305 "unmap": true, 00:12:35.305 "flush": true, 00:12:35.305 "reset": true, 00:12:35.305 "nvme_admin": false, 00:12:35.305 "nvme_io": false, 00:12:35.305 "nvme_io_md": false, 00:12:35.305 "write_zeroes": true, 00:12:35.305 "zcopy": true, 00:12:35.305 "get_zone_info": false, 00:12:35.305 "zone_management": false, 00:12:35.305 "zone_append": false, 00:12:35.305 "compare": false, 00:12:35.305 "compare_and_write": false, 00:12:35.305 "abort": true, 00:12:35.305 "seek_hole": false, 00:12:35.305 "seek_data": false, 00:12:35.305 "copy": true, 00:12:35.305 "nvme_iov_md": false 00:12:35.305 }, 00:12:35.305 "memory_domains": [ 00:12:35.305 { 00:12:35.305 "dma_device_id": "system", 00:12:35.305 "dma_device_type": 1 00:12:35.305 }, 00:12:35.305 { 00:12:35.305 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:35.305 "dma_device_type": 2 00:12:35.305 } 00:12:35.305 ], 00:12:35.305 "driver_specific": {} 00:12:35.305 }' 00:12:35.305 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:35.565 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:35.565 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:35.565 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:35.565 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:35.565 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:35.565 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:35.565 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:35.565 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:35.565 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:12:35.565 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:35.825 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:35.825 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:35.825 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:35.825 17:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:35.825 17:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:35.825 "name": "BaseBdev3", 00:12:35.825 "aliases": [ 00:12:35.825 "f786c743-01ac-45d5-9875-0eefb9222438" 00:12:35.825 ], 00:12:35.825 "product_name": "Malloc disk", 00:12:35.825 "block_size": 512, 00:12:35.825 "num_blocks": 65536, 00:12:35.825 "uuid": "f786c743-01ac-45d5-9875-0eefb9222438", 00:12:35.825 "assigned_rate_limits": { 00:12:35.825 "rw_ios_per_sec": 0, 00:12:35.825 "rw_mbytes_per_sec": 0, 00:12:35.825 "r_mbytes_per_sec": 0, 00:12:35.825 "w_mbytes_per_sec": 0 00:12:35.825 }, 00:12:35.825 "claimed": true, 00:12:35.825 "claim_type": "exclusive_write", 00:12:35.825 "zoned": false, 00:12:35.825 "supported_io_types": { 00:12:35.825 "read": true, 00:12:35.825 "write": true, 00:12:35.825 "unmap": true, 00:12:35.825 "flush": true, 00:12:35.825 "reset": true, 00:12:35.825 "nvme_admin": false, 00:12:35.825 "nvme_io": false, 00:12:35.825 "nvme_io_md": false, 00:12:35.825 "write_zeroes": true, 00:12:35.825 "zcopy": true, 00:12:35.825 "get_zone_info": false, 00:12:35.825 "zone_management": false, 00:12:35.825 "zone_append": false, 00:12:35.825 "compare": false, 00:12:35.825 "compare_and_write": false, 00:12:35.825 "abort": true, 00:12:35.825 "seek_hole": false, 00:12:35.825 "seek_data": false, 00:12:35.825 "copy": true, 00:12:35.825 "nvme_iov_md": false 00:12:35.825 }, 00:12:35.825 "memory_domains": [ 00:12:35.825 { 00:12:35.825 "dma_device_id": "system", 00:12:35.825 "dma_device_type": 1 00:12:35.825 }, 00:12:35.825 { 00:12:35.825 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:35.825 "dma_device_type": 2 00:12:35.825 } 00:12:35.825 ], 00:12:35.825 "driver_specific": {} 00:12:35.825 }' 00:12:35.825 17:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:36.086 17:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:36.086 17:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:36.086 17:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:36.086 17:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:36.086 17:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:36.086 17:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:36.086 17:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:36.086 17:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:36.086 17:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:36.346 17:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:36.346 17:24:47 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:36.346 17:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:36.346 [2024-07-15 17:24:47.607320] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:36.346 [2024-07-15 17:24:47.607336] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:36.346 [2024-07-15 17:24:47.607368] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:36.346 [2024-07-15 17:24:47.607404] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:36.346 [2024-07-15 17:24:47.607410] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb3c780 name Existed_Raid, state offline 00:12:36.346 17:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2763077 00:12:36.346 17:24:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2763077 ']' 00:12:36.346 17:24:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2763077 00:12:36.346 17:24:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:12:36.346 17:24:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:36.346 17:24:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2763077 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2763077' 00:12:36.606 killing process with pid 2763077 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2763077 00:12:36.606 [2024-07-15 17:24:47.674176] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2763077 00:12:36.606 [2024-07-15 17:24:47.688879] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:36.606 00:12:36.606 real 0m23.571s 00:12:36.606 user 0m44.218s 00:12:36.606 sys 0m3.477s 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:36.606 ************************************ 00:12:36.606 END TEST raid_state_function_test 00:12:36.606 ************************************ 00:12:36.606 17:24:47 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:36.606 17:24:47 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:12:36.606 17:24:47 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:36.606 17:24:47 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:36.606 17:24:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:36.606 ************************************ 00:12:36.606 START TEST 
raid_state_function_test_sb 00:12:36.606 ************************************ 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 true 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2767733 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2767733' 00:12:36.606 Process raid pid: 2767733 00:12:36.606 17:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2767733 /var/tmp/spdk-raid.sock 00:12:36.606 17:24:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:36.607 17:24:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2767733 ']' 00:12:36.607 17:24:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:36.607 17:24:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:36.607 17:24:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:36.607 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:36.607 17:24:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:36.607 17:24:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:36.867 [2024-07-15 17:24:47.987724] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:12:36.867 [2024-07-15 17:24:47.987854] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:36.867 [2024-07-15 17:24:48.131258] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:37.127 [2024-07-15 17:24:48.209937] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:37.127 [2024-07-15 17:24:48.252919] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:37.127 [2024-07-15 17:24:48.252942] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:37.697 17:24:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:37.697 17:24:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:12:37.697 17:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:37.697 [2024-07-15 17:24:48.956469] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:37.697 [2024-07-15 17:24:48.956499] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:37.697 [2024-07-15 17:24:48.956506] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:37.697 [2024-07-15 17:24:48.956511] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:37.697 [2024-07-15 17:24:48.956516] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:37.697 [2024-07-15 17:24:48.956521] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:37.697 17:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:37.697 17:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:37.697 17:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # 
local expected_state=configuring 00:12:37.697 17:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:37.697 17:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:37.697 17:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:37.697 17:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:37.697 17:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:37.697 17:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:37.697 17:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:37.697 17:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:37.697 17:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:37.957 17:24:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:37.957 "name": "Existed_Raid", 00:12:37.957 "uuid": "2277c09b-616e-4ae2-8edc-9c3fb113221a", 00:12:37.957 "strip_size_kb": 64, 00:12:37.957 "state": "configuring", 00:12:37.957 "raid_level": "raid0", 00:12:37.957 "superblock": true, 00:12:37.957 "num_base_bdevs": 3, 00:12:37.957 "num_base_bdevs_discovered": 0, 00:12:37.957 "num_base_bdevs_operational": 3, 00:12:37.957 "base_bdevs_list": [ 00:12:37.957 { 00:12:37.957 "name": "BaseBdev1", 00:12:37.957 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:37.957 "is_configured": false, 00:12:37.957 "data_offset": 0, 00:12:37.957 "data_size": 0 00:12:37.957 }, 00:12:37.957 { 00:12:37.957 "name": "BaseBdev2", 00:12:37.957 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:37.957 "is_configured": false, 00:12:37.957 "data_offset": 0, 00:12:37.957 "data_size": 0 00:12:37.957 }, 00:12:37.957 { 00:12:37.957 "name": "BaseBdev3", 00:12:37.957 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:37.957 "is_configured": false, 00:12:37.957 "data_offset": 0, 00:12:37.957 "data_size": 0 00:12:37.957 } 00:12:37.957 ] 00:12:37.957 }' 00:12:37.957 17:24:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:37.957 17:24:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:38.527 17:24:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:38.787 [2024-07-15 17:24:49.850593] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:38.787 [2024-07-15 17:24:49.850607] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19176d0 name Existed_Raid, state configuring 00:12:38.787 17:24:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:38.787 [2024-07-15 17:24:50.027081] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:38.787 [2024-07-15 17:24:50.027105] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev 
BaseBdev1 doesn't exist now 00:12:38.787 [2024-07-15 17:24:50.027110] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:38.787 [2024-07-15 17:24:50.027116] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:38.787 [2024-07-15 17:24:50.027120] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:38.787 [2024-07-15 17:24:50.027125] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:38.787 17:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:39.047 [2024-07-15 17:24:50.222096] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:39.047 BaseBdev1 00:12:39.047 17:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:39.047 17:24:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:39.047 17:24:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:39.047 17:24:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:39.047 17:24:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:39.047 17:24:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:39.047 17:24:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:39.306 17:24:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:39.306 [ 00:12:39.306 { 00:12:39.306 "name": "BaseBdev1", 00:12:39.306 "aliases": [ 00:12:39.306 "0d8bb55c-5964-4bc5-a5eb-6dfc1a4fd3d4" 00:12:39.306 ], 00:12:39.306 "product_name": "Malloc disk", 00:12:39.306 "block_size": 512, 00:12:39.306 "num_blocks": 65536, 00:12:39.306 "uuid": "0d8bb55c-5964-4bc5-a5eb-6dfc1a4fd3d4", 00:12:39.306 "assigned_rate_limits": { 00:12:39.306 "rw_ios_per_sec": 0, 00:12:39.306 "rw_mbytes_per_sec": 0, 00:12:39.306 "r_mbytes_per_sec": 0, 00:12:39.306 "w_mbytes_per_sec": 0 00:12:39.306 }, 00:12:39.306 "claimed": true, 00:12:39.306 "claim_type": "exclusive_write", 00:12:39.306 "zoned": false, 00:12:39.306 "supported_io_types": { 00:12:39.306 "read": true, 00:12:39.306 "write": true, 00:12:39.306 "unmap": true, 00:12:39.306 "flush": true, 00:12:39.306 "reset": true, 00:12:39.306 "nvme_admin": false, 00:12:39.306 "nvme_io": false, 00:12:39.306 "nvme_io_md": false, 00:12:39.306 "write_zeroes": true, 00:12:39.306 "zcopy": true, 00:12:39.306 "get_zone_info": false, 00:12:39.306 "zone_management": false, 00:12:39.306 "zone_append": false, 00:12:39.306 "compare": false, 00:12:39.306 "compare_and_write": false, 00:12:39.306 "abort": true, 00:12:39.306 "seek_hole": false, 00:12:39.306 "seek_data": false, 00:12:39.306 "copy": true, 00:12:39.306 "nvme_iov_md": false 00:12:39.306 }, 00:12:39.306 "memory_domains": [ 00:12:39.306 { 00:12:39.306 "dma_device_id": "system", 00:12:39.306 "dma_device_type": 1 00:12:39.306 }, 00:12:39.306 { 00:12:39.306 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:12:39.306 "dma_device_type": 2 00:12:39.306 } 00:12:39.306 ], 00:12:39.306 "driver_specific": {} 00:12:39.306 } 00:12:39.306 ] 00:12:39.588 17:24:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:39.588 17:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:39.588 17:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:39.588 17:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:39.588 17:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:39.588 17:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:39.588 17:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:39.588 17:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:39.588 17:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:39.588 17:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:39.588 17:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:39.588 17:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:39.588 17:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:39.588 17:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:39.588 "name": "Existed_Raid", 00:12:39.588 "uuid": "7d3b66c3-a8da-4d90-8150-bff69ee1f9a5", 00:12:39.588 "strip_size_kb": 64, 00:12:39.588 "state": "configuring", 00:12:39.588 "raid_level": "raid0", 00:12:39.588 "superblock": true, 00:12:39.588 "num_base_bdevs": 3, 00:12:39.588 "num_base_bdevs_discovered": 1, 00:12:39.588 "num_base_bdevs_operational": 3, 00:12:39.588 "base_bdevs_list": [ 00:12:39.588 { 00:12:39.588 "name": "BaseBdev1", 00:12:39.588 "uuid": "0d8bb55c-5964-4bc5-a5eb-6dfc1a4fd3d4", 00:12:39.588 "is_configured": true, 00:12:39.588 "data_offset": 2048, 00:12:39.588 "data_size": 63488 00:12:39.588 }, 00:12:39.588 { 00:12:39.588 "name": "BaseBdev2", 00:12:39.588 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:39.588 "is_configured": false, 00:12:39.588 "data_offset": 0, 00:12:39.588 "data_size": 0 00:12:39.588 }, 00:12:39.588 { 00:12:39.588 "name": "BaseBdev3", 00:12:39.588 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:39.588 "is_configured": false, 00:12:39.588 "data_offset": 0, 00:12:39.588 "data_size": 0 00:12:39.588 } 00:12:39.588 ] 00:12:39.588 }' 00:12:39.588 17:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:39.588 17:24:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:40.157 17:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:40.416 [2024-07-15 17:24:51.505347] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:40.416 
[2024-07-15 17:24:51.505372] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1916fa0 name Existed_Raid, state configuring 00:12:40.416 17:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:40.416 [2024-07-15 17:24:51.713903] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:40.675 [2024-07-15 17:24:51.715023] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:40.675 [2024-07-15 17:24:51.715046] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:40.675 [2024-07-15 17:24:51.715051] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:40.675 [2024-07-15 17:24:51.715057] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:40.675 17:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:40.675 17:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:40.675 17:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:40.675 17:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:40.675 17:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:40.675 17:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:40.675 17:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:40.675 17:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:40.675 17:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:40.675 17:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:40.675 17:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:40.675 17:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:40.675 17:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:40.675 17:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:40.675 17:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:40.675 "name": "Existed_Raid", 00:12:40.675 "uuid": "c981fb2a-5438-4371-95bf-a1d25a2dd942", 00:12:40.675 "strip_size_kb": 64, 00:12:40.675 "state": "configuring", 00:12:40.675 "raid_level": "raid0", 00:12:40.675 "superblock": true, 00:12:40.675 "num_base_bdevs": 3, 00:12:40.675 "num_base_bdevs_discovered": 1, 00:12:40.675 "num_base_bdevs_operational": 3, 00:12:40.675 "base_bdevs_list": [ 00:12:40.675 { 00:12:40.675 "name": "BaseBdev1", 00:12:40.675 "uuid": "0d8bb55c-5964-4bc5-a5eb-6dfc1a4fd3d4", 00:12:40.675 "is_configured": true, 00:12:40.675 "data_offset": 2048, 00:12:40.675 "data_size": 63488 00:12:40.675 }, 00:12:40.675 { 
00:12:40.675 "name": "BaseBdev2", 00:12:40.675 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:40.675 "is_configured": false, 00:12:40.675 "data_offset": 0, 00:12:40.675 "data_size": 0 00:12:40.675 }, 00:12:40.675 { 00:12:40.675 "name": "BaseBdev3", 00:12:40.675 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:40.675 "is_configured": false, 00:12:40.675 "data_offset": 0, 00:12:40.675 "data_size": 0 00:12:40.675 } 00:12:40.675 ] 00:12:40.675 }' 00:12:40.675 17:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:40.675 17:24:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:41.243 17:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:41.503 [2024-07-15 17:24:52.653001] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:41.503 BaseBdev2 00:12:41.503 17:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:41.503 17:24:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:41.503 17:24:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:41.503 17:24:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:41.503 17:24:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:41.503 17:24:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:41.503 17:24:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:41.763 17:24:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:41.763 [ 00:12:41.763 { 00:12:41.763 "name": "BaseBdev2", 00:12:41.763 "aliases": [ 00:12:41.763 "6940c44a-356e-4ab1-80ec-3f458ef051ce" 00:12:41.763 ], 00:12:41.763 "product_name": "Malloc disk", 00:12:41.763 "block_size": 512, 00:12:41.763 "num_blocks": 65536, 00:12:41.763 "uuid": "6940c44a-356e-4ab1-80ec-3f458ef051ce", 00:12:41.763 "assigned_rate_limits": { 00:12:41.763 "rw_ios_per_sec": 0, 00:12:41.763 "rw_mbytes_per_sec": 0, 00:12:41.763 "r_mbytes_per_sec": 0, 00:12:41.763 "w_mbytes_per_sec": 0 00:12:41.763 }, 00:12:41.763 "claimed": true, 00:12:41.763 "claim_type": "exclusive_write", 00:12:41.763 "zoned": false, 00:12:41.763 "supported_io_types": { 00:12:41.763 "read": true, 00:12:41.763 "write": true, 00:12:41.763 "unmap": true, 00:12:41.763 "flush": true, 00:12:41.763 "reset": true, 00:12:41.763 "nvme_admin": false, 00:12:41.763 "nvme_io": false, 00:12:41.763 "nvme_io_md": false, 00:12:41.763 "write_zeroes": true, 00:12:41.763 "zcopy": true, 00:12:41.763 "get_zone_info": false, 00:12:41.763 "zone_management": false, 00:12:41.763 "zone_append": false, 00:12:41.763 "compare": false, 00:12:41.763 "compare_and_write": false, 00:12:41.763 "abort": true, 00:12:41.763 "seek_hole": false, 00:12:41.763 "seek_data": false, 00:12:41.763 "copy": true, 00:12:41.763 "nvme_iov_md": false 00:12:41.763 }, 00:12:41.763 "memory_domains": [ 00:12:41.763 { 00:12:41.763 
"dma_device_id": "system", 00:12:41.763 "dma_device_type": 1 00:12:41.763 }, 00:12:41.763 { 00:12:41.763 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:41.763 "dma_device_type": 2 00:12:41.763 } 00:12:41.763 ], 00:12:41.763 "driver_specific": {} 00:12:41.763 } 00:12:41.763 ] 00:12:41.763 17:24:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:41.763 17:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:41.763 17:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:41.763 17:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:41.763 17:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:41.763 17:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:41.763 17:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:41.763 17:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:41.763 17:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:41.763 17:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:41.763 17:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:41.763 17:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:41.763 17:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:41.763 17:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:41.763 17:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:42.023 17:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:42.023 "name": "Existed_Raid", 00:12:42.023 "uuid": "c981fb2a-5438-4371-95bf-a1d25a2dd942", 00:12:42.023 "strip_size_kb": 64, 00:12:42.023 "state": "configuring", 00:12:42.023 "raid_level": "raid0", 00:12:42.023 "superblock": true, 00:12:42.023 "num_base_bdevs": 3, 00:12:42.023 "num_base_bdevs_discovered": 2, 00:12:42.023 "num_base_bdevs_operational": 3, 00:12:42.023 "base_bdevs_list": [ 00:12:42.023 { 00:12:42.023 "name": "BaseBdev1", 00:12:42.023 "uuid": "0d8bb55c-5964-4bc5-a5eb-6dfc1a4fd3d4", 00:12:42.023 "is_configured": true, 00:12:42.023 "data_offset": 2048, 00:12:42.023 "data_size": 63488 00:12:42.023 }, 00:12:42.023 { 00:12:42.023 "name": "BaseBdev2", 00:12:42.023 "uuid": "6940c44a-356e-4ab1-80ec-3f458ef051ce", 00:12:42.023 "is_configured": true, 00:12:42.023 "data_offset": 2048, 00:12:42.023 "data_size": 63488 00:12:42.023 }, 00:12:42.023 { 00:12:42.023 "name": "BaseBdev3", 00:12:42.023 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:42.023 "is_configured": false, 00:12:42.023 "data_offset": 0, 00:12:42.023 "data_size": 0 00:12:42.023 } 00:12:42.023 ] 00:12:42.023 }' 00:12:42.023 17:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:42.023 17:24:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 
00:12:42.596 17:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:42.856 [2024-07-15 17:24:53.909188] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:42.856 [2024-07-15 17:24:53.909300] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1917e90 00:12:42.856 [2024-07-15 17:24:53.909308] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:42.856 [2024-07-15 17:24:53.909445] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1917b60 00:12:42.856 [2024-07-15 17:24:53.909536] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1917e90 00:12:42.856 [2024-07-15 17:24:53.909542] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1917e90 00:12:42.856 [2024-07-15 17:24:53.909609] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:42.856 BaseBdev3 00:12:42.856 17:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:12:42.856 17:24:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:12:42.856 17:24:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:42.856 17:24:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:42.856 17:24:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:42.856 17:24:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:42.856 17:24:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:42.856 17:24:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:43.116 [ 00:12:43.116 { 00:12:43.116 "name": "BaseBdev3", 00:12:43.116 "aliases": [ 00:12:43.116 "62156ab7-f8dd-43dc-a537-622a47775abc" 00:12:43.116 ], 00:12:43.116 "product_name": "Malloc disk", 00:12:43.116 "block_size": 512, 00:12:43.116 "num_blocks": 65536, 00:12:43.116 "uuid": "62156ab7-f8dd-43dc-a537-622a47775abc", 00:12:43.116 "assigned_rate_limits": { 00:12:43.116 "rw_ios_per_sec": 0, 00:12:43.116 "rw_mbytes_per_sec": 0, 00:12:43.116 "r_mbytes_per_sec": 0, 00:12:43.116 "w_mbytes_per_sec": 0 00:12:43.116 }, 00:12:43.116 "claimed": true, 00:12:43.116 "claim_type": "exclusive_write", 00:12:43.116 "zoned": false, 00:12:43.116 "supported_io_types": { 00:12:43.116 "read": true, 00:12:43.116 "write": true, 00:12:43.116 "unmap": true, 00:12:43.116 "flush": true, 00:12:43.116 "reset": true, 00:12:43.116 "nvme_admin": false, 00:12:43.116 "nvme_io": false, 00:12:43.116 "nvme_io_md": false, 00:12:43.116 "write_zeroes": true, 00:12:43.116 "zcopy": true, 00:12:43.116 "get_zone_info": false, 00:12:43.116 "zone_management": false, 00:12:43.116 "zone_append": false, 00:12:43.116 "compare": false, 00:12:43.116 "compare_and_write": false, 00:12:43.116 "abort": true, 00:12:43.116 "seek_hole": false, 00:12:43.116 "seek_data": false, 00:12:43.116 "copy": true, 00:12:43.116 "nvme_iov_md": false 
00:12:43.116 }, 00:12:43.116 "memory_domains": [ 00:12:43.116 { 00:12:43.116 "dma_device_id": "system", 00:12:43.116 "dma_device_type": 1 00:12:43.116 }, 00:12:43.116 { 00:12:43.116 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:43.116 "dma_device_type": 2 00:12:43.116 } 00:12:43.116 ], 00:12:43.116 "driver_specific": {} 00:12:43.116 } 00:12:43.116 ] 00:12:43.116 17:24:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:43.116 17:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:43.116 17:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:43.116 17:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:12:43.116 17:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:43.116 17:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:43.116 17:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:43.116 17:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:43.116 17:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:43.116 17:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:43.116 17:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:43.116 17:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:43.116 17:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:43.116 17:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:43.116 17:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:43.116 17:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:43.116 "name": "Existed_Raid", 00:12:43.116 "uuid": "c981fb2a-5438-4371-95bf-a1d25a2dd942", 00:12:43.116 "strip_size_kb": 64, 00:12:43.116 "state": "online", 00:12:43.116 "raid_level": "raid0", 00:12:43.116 "superblock": true, 00:12:43.116 "num_base_bdevs": 3, 00:12:43.116 "num_base_bdevs_discovered": 3, 00:12:43.116 "num_base_bdevs_operational": 3, 00:12:43.116 "base_bdevs_list": [ 00:12:43.116 { 00:12:43.116 "name": "BaseBdev1", 00:12:43.116 "uuid": "0d8bb55c-5964-4bc5-a5eb-6dfc1a4fd3d4", 00:12:43.116 "is_configured": true, 00:12:43.116 "data_offset": 2048, 00:12:43.116 "data_size": 63488 00:12:43.116 }, 00:12:43.116 { 00:12:43.116 "name": "BaseBdev2", 00:12:43.116 "uuid": "6940c44a-356e-4ab1-80ec-3f458ef051ce", 00:12:43.116 "is_configured": true, 00:12:43.116 "data_offset": 2048, 00:12:43.116 "data_size": 63488 00:12:43.116 }, 00:12:43.116 { 00:12:43.116 "name": "BaseBdev3", 00:12:43.116 "uuid": "62156ab7-f8dd-43dc-a537-622a47775abc", 00:12:43.116 "is_configured": true, 00:12:43.116 "data_offset": 2048, 00:12:43.116 "data_size": 63488 00:12:43.116 } 00:12:43.116 ] 00:12:43.116 }' 00:12:43.116 17:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:43.116 17:24:54 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:43.686 17:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:43.686 17:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:43.686 17:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:43.686 17:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:43.686 17:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:43.686 17:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:43.686 17:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:43.686 17:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:43.948 [2024-07-15 17:24:55.092430] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:43.948 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:43.948 "name": "Existed_Raid", 00:12:43.948 "aliases": [ 00:12:43.948 "c981fb2a-5438-4371-95bf-a1d25a2dd942" 00:12:43.948 ], 00:12:43.948 "product_name": "Raid Volume", 00:12:43.948 "block_size": 512, 00:12:43.948 "num_blocks": 190464, 00:12:43.948 "uuid": "c981fb2a-5438-4371-95bf-a1d25a2dd942", 00:12:43.948 "assigned_rate_limits": { 00:12:43.948 "rw_ios_per_sec": 0, 00:12:43.948 "rw_mbytes_per_sec": 0, 00:12:43.948 "r_mbytes_per_sec": 0, 00:12:43.948 "w_mbytes_per_sec": 0 00:12:43.948 }, 00:12:43.948 "claimed": false, 00:12:43.948 "zoned": false, 00:12:43.948 "supported_io_types": { 00:12:43.948 "read": true, 00:12:43.948 "write": true, 00:12:43.948 "unmap": true, 00:12:43.948 "flush": true, 00:12:43.948 "reset": true, 00:12:43.948 "nvme_admin": false, 00:12:43.948 "nvme_io": false, 00:12:43.948 "nvme_io_md": false, 00:12:43.948 "write_zeroes": true, 00:12:43.948 "zcopy": false, 00:12:43.948 "get_zone_info": false, 00:12:43.948 "zone_management": false, 00:12:43.948 "zone_append": false, 00:12:43.948 "compare": false, 00:12:43.948 "compare_and_write": false, 00:12:43.948 "abort": false, 00:12:43.948 "seek_hole": false, 00:12:43.948 "seek_data": false, 00:12:43.948 "copy": false, 00:12:43.948 "nvme_iov_md": false 00:12:43.948 }, 00:12:43.948 "memory_domains": [ 00:12:43.948 { 00:12:43.948 "dma_device_id": "system", 00:12:43.948 "dma_device_type": 1 00:12:43.948 }, 00:12:43.948 { 00:12:43.948 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:43.948 "dma_device_type": 2 00:12:43.948 }, 00:12:43.948 { 00:12:43.948 "dma_device_id": "system", 00:12:43.948 "dma_device_type": 1 00:12:43.948 }, 00:12:43.948 { 00:12:43.948 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:43.948 "dma_device_type": 2 00:12:43.948 }, 00:12:43.948 { 00:12:43.948 "dma_device_id": "system", 00:12:43.948 "dma_device_type": 1 00:12:43.948 }, 00:12:43.948 { 00:12:43.948 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:43.948 "dma_device_type": 2 00:12:43.948 } 00:12:43.948 ], 00:12:43.948 "driver_specific": { 00:12:43.948 "raid": { 00:12:43.948 "uuid": "c981fb2a-5438-4371-95bf-a1d25a2dd942", 00:12:43.948 "strip_size_kb": 64, 00:12:43.948 "state": "online", 00:12:43.948 "raid_level": "raid0", 00:12:43.948 "superblock": true, 00:12:43.948 
"num_base_bdevs": 3, 00:12:43.948 "num_base_bdevs_discovered": 3, 00:12:43.948 "num_base_bdevs_operational": 3, 00:12:43.948 "base_bdevs_list": [ 00:12:43.948 { 00:12:43.948 "name": "BaseBdev1", 00:12:43.948 "uuid": "0d8bb55c-5964-4bc5-a5eb-6dfc1a4fd3d4", 00:12:43.948 "is_configured": true, 00:12:43.948 "data_offset": 2048, 00:12:43.948 "data_size": 63488 00:12:43.948 }, 00:12:43.948 { 00:12:43.948 "name": "BaseBdev2", 00:12:43.948 "uuid": "6940c44a-356e-4ab1-80ec-3f458ef051ce", 00:12:43.948 "is_configured": true, 00:12:43.948 "data_offset": 2048, 00:12:43.948 "data_size": 63488 00:12:43.948 }, 00:12:43.948 { 00:12:43.948 "name": "BaseBdev3", 00:12:43.948 "uuid": "62156ab7-f8dd-43dc-a537-622a47775abc", 00:12:43.948 "is_configured": true, 00:12:43.948 "data_offset": 2048, 00:12:43.948 "data_size": 63488 00:12:43.948 } 00:12:43.948 ] 00:12:43.948 } 00:12:43.948 } 00:12:43.948 }' 00:12:43.948 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:43.948 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:43.948 BaseBdev2 00:12:43.948 BaseBdev3' 00:12:43.948 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:43.948 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:43.948 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:44.209 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:44.209 "name": "BaseBdev1", 00:12:44.209 "aliases": [ 00:12:44.209 "0d8bb55c-5964-4bc5-a5eb-6dfc1a4fd3d4" 00:12:44.209 ], 00:12:44.209 "product_name": "Malloc disk", 00:12:44.209 "block_size": 512, 00:12:44.209 "num_blocks": 65536, 00:12:44.209 "uuid": "0d8bb55c-5964-4bc5-a5eb-6dfc1a4fd3d4", 00:12:44.209 "assigned_rate_limits": { 00:12:44.209 "rw_ios_per_sec": 0, 00:12:44.209 "rw_mbytes_per_sec": 0, 00:12:44.209 "r_mbytes_per_sec": 0, 00:12:44.209 "w_mbytes_per_sec": 0 00:12:44.209 }, 00:12:44.209 "claimed": true, 00:12:44.209 "claim_type": "exclusive_write", 00:12:44.209 "zoned": false, 00:12:44.209 "supported_io_types": { 00:12:44.209 "read": true, 00:12:44.209 "write": true, 00:12:44.209 "unmap": true, 00:12:44.209 "flush": true, 00:12:44.209 "reset": true, 00:12:44.209 "nvme_admin": false, 00:12:44.209 "nvme_io": false, 00:12:44.209 "nvme_io_md": false, 00:12:44.209 "write_zeroes": true, 00:12:44.209 "zcopy": true, 00:12:44.209 "get_zone_info": false, 00:12:44.209 "zone_management": false, 00:12:44.209 "zone_append": false, 00:12:44.209 "compare": false, 00:12:44.209 "compare_and_write": false, 00:12:44.209 "abort": true, 00:12:44.209 "seek_hole": false, 00:12:44.209 "seek_data": false, 00:12:44.209 "copy": true, 00:12:44.209 "nvme_iov_md": false 00:12:44.209 }, 00:12:44.209 "memory_domains": [ 00:12:44.209 { 00:12:44.209 "dma_device_id": "system", 00:12:44.209 "dma_device_type": 1 00:12:44.209 }, 00:12:44.209 { 00:12:44.209 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:44.209 "dma_device_type": 2 00:12:44.209 } 00:12:44.209 ], 00:12:44.209 "driver_specific": {} 00:12:44.209 }' 00:12:44.209 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:44.209 17:24:55 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:44.209 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:44.209 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:44.209 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:44.469 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:44.469 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:44.469 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:44.469 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:44.469 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:44.469 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:44.469 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:44.469 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:44.469 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:44.469 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:44.729 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:44.729 "name": "BaseBdev2", 00:12:44.729 "aliases": [ 00:12:44.729 "6940c44a-356e-4ab1-80ec-3f458ef051ce" 00:12:44.729 ], 00:12:44.729 "product_name": "Malloc disk", 00:12:44.729 "block_size": 512, 00:12:44.729 "num_blocks": 65536, 00:12:44.729 "uuid": "6940c44a-356e-4ab1-80ec-3f458ef051ce", 00:12:44.729 "assigned_rate_limits": { 00:12:44.729 "rw_ios_per_sec": 0, 00:12:44.729 "rw_mbytes_per_sec": 0, 00:12:44.729 "r_mbytes_per_sec": 0, 00:12:44.729 "w_mbytes_per_sec": 0 00:12:44.729 }, 00:12:44.729 "claimed": true, 00:12:44.729 "claim_type": "exclusive_write", 00:12:44.729 "zoned": false, 00:12:44.729 "supported_io_types": { 00:12:44.729 "read": true, 00:12:44.729 "write": true, 00:12:44.729 "unmap": true, 00:12:44.729 "flush": true, 00:12:44.729 "reset": true, 00:12:44.729 "nvme_admin": false, 00:12:44.729 "nvme_io": false, 00:12:44.729 "nvme_io_md": false, 00:12:44.729 "write_zeroes": true, 00:12:44.729 "zcopy": true, 00:12:44.729 "get_zone_info": false, 00:12:44.729 "zone_management": false, 00:12:44.729 "zone_append": false, 00:12:44.729 "compare": false, 00:12:44.729 "compare_and_write": false, 00:12:44.729 "abort": true, 00:12:44.729 "seek_hole": false, 00:12:44.729 "seek_data": false, 00:12:44.729 "copy": true, 00:12:44.729 "nvme_iov_md": false 00:12:44.729 }, 00:12:44.729 "memory_domains": [ 00:12:44.729 { 00:12:44.729 "dma_device_id": "system", 00:12:44.729 "dma_device_type": 1 00:12:44.729 }, 00:12:44.729 { 00:12:44.729 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:44.729 "dma_device_type": 2 00:12:44.729 } 00:12:44.729 ], 00:12:44.729 "driver_specific": {} 00:12:44.729 }' 00:12:44.729 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:44.729 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:44.729 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # 
[[ 512 == 512 ]] 00:12:44.729 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:44.729 17:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:44.729 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:44.729 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:44.989 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:44.989 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:44.989 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:44.989 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:44.989 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:44.989 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:44.989 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:44.989 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:45.249 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:45.249 "name": "BaseBdev3", 00:12:45.249 "aliases": [ 00:12:45.249 "62156ab7-f8dd-43dc-a537-622a47775abc" 00:12:45.249 ], 00:12:45.249 "product_name": "Malloc disk", 00:12:45.249 "block_size": 512, 00:12:45.249 "num_blocks": 65536, 00:12:45.249 "uuid": "62156ab7-f8dd-43dc-a537-622a47775abc", 00:12:45.249 "assigned_rate_limits": { 00:12:45.249 "rw_ios_per_sec": 0, 00:12:45.249 "rw_mbytes_per_sec": 0, 00:12:45.249 "r_mbytes_per_sec": 0, 00:12:45.249 "w_mbytes_per_sec": 0 00:12:45.249 }, 00:12:45.249 "claimed": true, 00:12:45.249 "claim_type": "exclusive_write", 00:12:45.249 "zoned": false, 00:12:45.249 "supported_io_types": { 00:12:45.249 "read": true, 00:12:45.249 "write": true, 00:12:45.249 "unmap": true, 00:12:45.249 "flush": true, 00:12:45.250 "reset": true, 00:12:45.250 "nvme_admin": false, 00:12:45.250 "nvme_io": false, 00:12:45.250 "nvme_io_md": false, 00:12:45.250 "write_zeroes": true, 00:12:45.250 "zcopy": true, 00:12:45.250 "get_zone_info": false, 00:12:45.250 "zone_management": false, 00:12:45.250 "zone_append": false, 00:12:45.250 "compare": false, 00:12:45.250 "compare_and_write": false, 00:12:45.250 "abort": true, 00:12:45.250 "seek_hole": false, 00:12:45.250 "seek_data": false, 00:12:45.250 "copy": true, 00:12:45.250 "nvme_iov_md": false 00:12:45.250 }, 00:12:45.250 "memory_domains": [ 00:12:45.250 { 00:12:45.250 "dma_device_id": "system", 00:12:45.250 "dma_device_type": 1 00:12:45.250 }, 00:12:45.250 { 00:12:45.250 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:45.250 "dma_device_type": 2 00:12:45.250 } 00:12:45.250 ], 00:12:45.250 "driver_specific": {} 00:12:45.250 }' 00:12:45.250 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:45.250 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:45.250 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:45.250 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:45.250 
17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:45.250 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:45.250 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:45.509 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:45.509 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:45.509 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:45.509 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:45.509 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:45.510 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:45.770 [2024-07-15 17:24:56.892780] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:45.770 [2024-07-15 17:24:56.892796] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:45.770 [2024-07-15 17:24:56.892824] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:45.770 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:45.770 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:12:45.770 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:45.770 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:12:45.770 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:45.770 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:12:45.770 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:45.770 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:45.770 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:45.770 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:45.770 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:45.770 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:45.770 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:45.770 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:45.770 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:45.770 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:45.770 17:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:46.030 17:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:12:46.030 "name": "Existed_Raid", 00:12:46.030 "uuid": "c981fb2a-5438-4371-95bf-a1d25a2dd942", 00:12:46.030 "strip_size_kb": 64, 00:12:46.030 "state": "offline", 00:12:46.030 "raid_level": "raid0", 00:12:46.030 "superblock": true, 00:12:46.030 "num_base_bdevs": 3, 00:12:46.030 "num_base_bdevs_discovered": 2, 00:12:46.030 "num_base_bdevs_operational": 2, 00:12:46.030 "base_bdevs_list": [ 00:12:46.030 { 00:12:46.030 "name": null, 00:12:46.030 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:46.030 "is_configured": false, 00:12:46.030 "data_offset": 2048, 00:12:46.030 "data_size": 63488 00:12:46.030 }, 00:12:46.030 { 00:12:46.030 "name": "BaseBdev2", 00:12:46.030 "uuid": "6940c44a-356e-4ab1-80ec-3f458ef051ce", 00:12:46.030 "is_configured": true, 00:12:46.030 "data_offset": 2048, 00:12:46.030 "data_size": 63488 00:12:46.030 }, 00:12:46.030 { 00:12:46.030 "name": "BaseBdev3", 00:12:46.030 "uuid": "62156ab7-f8dd-43dc-a537-622a47775abc", 00:12:46.030 "is_configured": true, 00:12:46.030 "data_offset": 2048, 00:12:46.030 "data_size": 63488 00:12:46.030 } 00:12:46.030 ] 00:12:46.030 }' 00:12:46.030 17:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:46.030 17:24:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:46.600 17:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:46.600 17:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:46.600 17:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:46.600 17:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:46.600 17:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:46.600 17:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:46.600 17:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:46.859 [2024-07-15 17:24:58.007611] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:46.859 17:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:46.859 17:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:46.859 17:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:46.859 17:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:47.119 17:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:47.119 17:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:47.119 17:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:12:47.119 [2024-07-15 17:24:58.394368] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:47.119 [2024-07-15 17:24:58.394394] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1917e90 name Existed_Raid, state offline 00:12:47.119 17:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:47.119 17:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:47.119 17:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:47.119 17:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:47.380 17:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:47.380 17:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:47.380 17:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:12:47.380 17:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:12:47.380 17:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:47.380 17:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:47.640 BaseBdev2 00:12:47.640 17:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:12:47.640 17:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:47.640 17:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:47.640 17:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:47.640 17:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:47.640 17:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:47.640 17:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:47.901 17:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:47.901 [ 00:12:47.901 { 00:12:47.901 "name": "BaseBdev2", 00:12:47.901 "aliases": [ 00:12:47.901 "aefc7fee-dd8a-4745-ab51-2d4024998bbd" 00:12:47.901 ], 00:12:47.901 "product_name": "Malloc disk", 00:12:47.901 "block_size": 512, 00:12:47.901 "num_blocks": 65536, 00:12:47.901 "uuid": "aefc7fee-dd8a-4745-ab51-2d4024998bbd", 00:12:47.901 "assigned_rate_limits": { 00:12:47.901 "rw_ios_per_sec": 0, 00:12:47.901 "rw_mbytes_per_sec": 0, 00:12:47.901 "r_mbytes_per_sec": 0, 00:12:47.901 "w_mbytes_per_sec": 0 00:12:47.901 }, 00:12:47.901 "claimed": false, 00:12:47.901 "zoned": false, 00:12:47.901 "supported_io_types": { 00:12:47.901 "read": true, 00:12:47.901 "write": true, 00:12:47.901 "unmap": true, 00:12:47.901 "flush": true, 00:12:47.901 "reset": true, 00:12:47.901 "nvme_admin": false, 00:12:47.901 "nvme_io": false, 00:12:47.901 "nvme_io_md": false, 00:12:47.901 "write_zeroes": true, 00:12:47.901 "zcopy": true, 00:12:47.901 "get_zone_info": false, 00:12:47.901 "zone_management": false, 00:12:47.901 
"zone_append": false, 00:12:47.901 "compare": false, 00:12:47.901 "compare_and_write": false, 00:12:47.901 "abort": true, 00:12:47.901 "seek_hole": false, 00:12:47.901 "seek_data": false, 00:12:47.901 "copy": true, 00:12:47.901 "nvme_iov_md": false 00:12:47.901 }, 00:12:47.901 "memory_domains": [ 00:12:47.901 { 00:12:47.901 "dma_device_id": "system", 00:12:47.901 "dma_device_type": 1 00:12:47.901 }, 00:12:47.901 { 00:12:47.901 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:47.901 "dma_device_type": 2 00:12:47.901 } 00:12:47.901 ], 00:12:47.901 "driver_specific": {} 00:12:47.901 } 00:12:47.901 ] 00:12:47.901 17:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:47.901 17:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:47.901 17:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:47.901 17:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:48.162 BaseBdev3 00:12:48.162 17:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:12:48.162 17:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:12:48.162 17:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:48.162 17:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:48.162 17:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:48.162 17:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:48.162 17:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:48.423 17:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:48.423 [ 00:12:48.423 { 00:12:48.423 "name": "BaseBdev3", 00:12:48.423 "aliases": [ 00:12:48.423 "0d1be636-f209-4093-a99c-964758b3e07e" 00:12:48.423 ], 00:12:48.423 "product_name": "Malloc disk", 00:12:48.423 "block_size": 512, 00:12:48.423 "num_blocks": 65536, 00:12:48.423 "uuid": "0d1be636-f209-4093-a99c-964758b3e07e", 00:12:48.423 "assigned_rate_limits": { 00:12:48.423 "rw_ios_per_sec": 0, 00:12:48.423 "rw_mbytes_per_sec": 0, 00:12:48.423 "r_mbytes_per_sec": 0, 00:12:48.423 "w_mbytes_per_sec": 0 00:12:48.423 }, 00:12:48.423 "claimed": false, 00:12:48.423 "zoned": false, 00:12:48.423 "supported_io_types": { 00:12:48.423 "read": true, 00:12:48.423 "write": true, 00:12:48.423 "unmap": true, 00:12:48.423 "flush": true, 00:12:48.423 "reset": true, 00:12:48.423 "nvme_admin": false, 00:12:48.423 "nvme_io": false, 00:12:48.423 "nvme_io_md": false, 00:12:48.423 "write_zeroes": true, 00:12:48.423 "zcopy": true, 00:12:48.423 "get_zone_info": false, 00:12:48.423 "zone_management": false, 00:12:48.423 "zone_append": false, 00:12:48.423 "compare": false, 00:12:48.423 "compare_and_write": false, 00:12:48.423 "abort": true, 00:12:48.423 "seek_hole": false, 00:12:48.423 "seek_data": false, 00:12:48.423 "copy": true, 00:12:48.423 "nvme_iov_md": false 
00:12:48.423 }, 00:12:48.423 "memory_domains": [ 00:12:48.423 { 00:12:48.423 "dma_device_id": "system", 00:12:48.423 "dma_device_type": 1 00:12:48.423 }, 00:12:48.423 { 00:12:48.423 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:48.423 "dma_device_type": 2 00:12:48.423 } 00:12:48.423 ], 00:12:48.423 "driver_specific": {} 00:12:48.423 } 00:12:48.423 ] 00:12:48.423 17:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:48.683 17:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:48.683 17:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:48.683 17:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:48.683 [2024-07-15 17:24:59.893951] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:48.683 [2024-07-15 17:24:59.893979] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:48.683 [2024-07-15 17:24:59.893990] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:48.683 [2024-07-15 17:24:59.895012] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:48.683 17:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:48.683 17:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:48.683 17:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:48.683 17:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:48.683 17:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:48.683 17:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:48.683 17:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:48.683 17:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:48.683 17:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:48.683 17:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:48.683 17:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:48.683 17:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:48.943 17:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:48.943 "name": "Existed_Raid", 00:12:48.943 "uuid": "9cf3485a-c093-48de-86f2-8fb2e4923720", 00:12:48.943 "strip_size_kb": 64, 00:12:48.943 "state": "configuring", 00:12:48.943 "raid_level": "raid0", 00:12:48.943 "superblock": true, 00:12:48.943 "num_base_bdevs": 3, 00:12:48.943 "num_base_bdevs_discovered": 2, 00:12:48.943 "num_base_bdevs_operational": 3, 00:12:48.943 "base_bdevs_list": [ 00:12:48.943 { 00:12:48.943 "name": "BaseBdev1", 00:12:48.943 
"uuid": "00000000-0000-0000-0000-000000000000", 00:12:48.943 "is_configured": false, 00:12:48.943 "data_offset": 0, 00:12:48.943 "data_size": 0 00:12:48.943 }, 00:12:48.943 { 00:12:48.943 "name": "BaseBdev2", 00:12:48.943 "uuid": "aefc7fee-dd8a-4745-ab51-2d4024998bbd", 00:12:48.943 "is_configured": true, 00:12:48.943 "data_offset": 2048, 00:12:48.943 "data_size": 63488 00:12:48.944 }, 00:12:48.944 { 00:12:48.944 "name": "BaseBdev3", 00:12:48.944 "uuid": "0d1be636-f209-4093-a99c-964758b3e07e", 00:12:48.944 "is_configured": true, 00:12:48.944 "data_offset": 2048, 00:12:48.944 "data_size": 63488 00:12:48.944 } 00:12:48.944 ] 00:12:48.944 }' 00:12:48.944 17:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:48.944 17:25:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:49.514 17:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:12:49.776 [2024-07-15 17:25:00.828553] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:49.776 17:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:49.776 17:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:49.776 17:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:49.776 17:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:49.776 17:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:49.776 17:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:49.776 17:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:49.776 17:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:49.776 17:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:49.776 17:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:49.776 17:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:49.776 17:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:49.776 17:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:49.776 "name": "Existed_Raid", 00:12:49.776 "uuid": "9cf3485a-c093-48de-86f2-8fb2e4923720", 00:12:49.776 "strip_size_kb": 64, 00:12:49.776 "state": "configuring", 00:12:49.776 "raid_level": "raid0", 00:12:49.776 "superblock": true, 00:12:49.776 "num_base_bdevs": 3, 00:12:49.776 "num_base_bdevs_discovered": 1, 00:12:49.776 "num_base_bdevs_operational": 3, 00:12:49.776 "base_bdevs_list": [ 00:12:49.776 { 00:12:49.776 "name": "BaseBdev1", 00:12:49.776 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:49.776 "is_configured": false, 00:12:49.776 "data_offset": 0, 00:12:49.776 "data_size": 0 00:12:49.776 }, 00:12:49.776 { 00:12:49.776 "name": null, 00:12:49.776 "uuid": "aefc7fee-dd8a-4745-ab51-2d4024998bbd", 00:12:49.776 
"is_configured": false, 00:12:49.776 "data_offset": 2048, 00:12:49.776 "data_size": 63488 00:12:49.776 }, 00:12:49.776 { 00:12:49.776 "name": "BaseBdev3", 00:12:49.776 "uuid": "0d1be636-f209-4093-a99c-964758b3e07e", 00:12:49.776 "is_configured": true, 00:12:49.776 "data_offset": 2048, 00:12:49.776 "data_size": 63488 00:12:49.776 } 00:12:49.776 ] 00:12:49.776 }' 00:12:49.776 17:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:49.776 17:25:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:50.346 17:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:50.346 17:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:50.606 17:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:12:50.606 17:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:50.866 [2024-07-15 17:25:01.984331] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:50.866 BaseBdev1 00:12:50.866 17:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:12:50.866 17:25:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:50.866 17:25:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:50.866 17:25:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:50.866 17:25:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:50.866 17:25:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:50.866 17:25:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:51.126 17:25:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:51.126 [ 00:12:51.126 { 00:12:51.126 "name": "BaseBdev1", 00:12:51.126 "aliases": [ 00:12:51.126 "9355c893-4631-4ad8-b353-5a908114b04f" 00:12:51.126 ], 00:12:51.126 "product_name": "Malloc disk", 00:12:51.126 "block_size": 512, 00:12:51.126 "num_blocks": 65536, 00:12:51.126 "uuid": "9355c893-4631-4ad8-b353-5a908114b04f", 00:12:51.126 "assigned_rate_limits": { 00:12:51.126 "rw_ios_per_sec": 0, 00:12:51.126 "rw_mbytes_per_sec": 0, 00:12:51.126 "r_mbytes_per_sec": 0, 00:12:51.126 "w_mbytes_per_sec": 0 00:12:51.126 }, 00:12:51.126 "claimed": true, 00:12:51.126 "claim_type": "exclusive_write", 00:12:51.126 "zoned": false, 00:12:51.126 "supported_io_types": { 00:12:51.126 "read": true, 00:12:51.126 "write": true, 00:12:51.126 "unmap": true, 00:12:51.126 "flush": true, 00:12:51.126 "reset": true, 00:12:51.126 "nvme_admin": false, 00:12:51.126 "nvme_io": false, 00:12:51.126 "nvme_io_md": false, 00:12:51.126 "write_zeroes": true, 00:12:51.126 "zcopy": true, 00:12:51.126 "get_zone_info": false, 00:12:51.126 "zone_management": 
false, 00:12:51.126 "zone_append": false, 00:12:51.126 "compare": false, 00:12:51.126 "compare_and_write": false, 00:12:51.126 "abort": true, 00:12:51.126 "seek_hole": false, 00:12:51.126 "seek_data": false, 00:12:51.126 "copy": true, 00:12:51.126 "nvme_iov_md": false 00:12:51.126 }, 00:12:51.126 "memory_domains": [ 00:12:51.126 { 00:12:51.126 "dma_device_id": "system", 00:12:51.126 "dma_device_type": 1 00:12:51.126 }, 00:12:51.126 { 00:12:51.126 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:51.126 "dma_device_type": 2 00:12:51.126 } 00:12:51.126 ], 00:12:51.126 "driver_specific": {} 00:12:51.126 } 00:12:51.126 ] 00:12:51.126 17:25:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:51.126 17:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:51.126 17:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:51.126 17:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:51.126 17:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:51.126 17:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:51.126 17:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:51.126 17:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:51.126 17:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:51.126 17:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:51.126 17:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:51.126 17:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:51.126 17:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:51.386 17:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:51.386 "name": "Existed_Raid", 00:12:51.386 "uuid": "9cf3485a-c093-48de-86f2-8fb2e4923720", 00:12:51.386 "strip_size_kb": 64, 00:12:51.386 "state": "configuring", 00:12:51.386 "raid_level": "raid0", 00:12:51.386 "superblock": true, 00:12:51.386 "num_base_bdevs": 3, 00:12:51.386 "num_base_bdevs_discovered": 2, 00:12:51.386 "num_base_bdevs_operational": 3, 00:12:51.386 "base_bdevs_list": [ 00:12:51.386 { 00:12:51.386 "name": "BaseBdev1", 00:12:51.386 "uuid": "9355c893-4631-4ad8-b353-5a908114b04f", 00:12:51.387 "is_configured": true, 00:12:51.387 "data_offset": 2048, 00:12:51.387 "data_size": 63488 00:12:51.387 }, 00:12:51.387 { 00:12:51.387 "name": null, 00:12:51.387 "uuid": "aefc7fee-dd8a-4745-ab51-2d4024998bbd", 00:12:51.387 "is_configured": false, 00:12:51.387 "data_offset": 2048, 00:12:51.387 "data_size": 63488 00:12:51.387 }, 00:12:51.387 { 00:12:51.387 "name": "BaseBdev3", 00:12:51.387 "uuid": "0d1be636-f209-4093-a99c-964758b3e07e", 00:12:51.387 "is_configured": true, 00:12:51.387 "data_offset": 2048, 00:12:51.387 "data_size": 63488 00:12:51.387 } 00:12:51.387 ] 00:12:51.387 }' 00:12:51.387 17:25:02 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:51.387 17:25:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:51.958 17:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:51.958 17:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:52.217 17:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:12:52.217 17:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:12:52.476 [2024-07-15 17:25:03.520250] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:52.476 17:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:52.476 17:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:52.476 17:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:52.476 17:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:52.476 17:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:52.476 17:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:52.476 17:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:52.476 17:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:52.476 17:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:52.476 17:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:52.476 17:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:52.476 17:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:52.476 17:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:52.476 "name": "Existed_Raid", 00:12:52.476 "uuid": "9cf3485a-c093-48de-86f2-8fb2e4923720", 00:12:52.476 "strip_size_kb": 64, 00:12:52.476 "state": "configuring", 00:12:52.476 "raid_level": "raid0", 00:12:52.476 "superblock": true, 00:12:52.476 "num_base_bdevs": 3, 00:12:52.476 "num_base_bdevs_discovered": 1, 00:12:52.476 "num_base_bdevs_operational": 3, 00:12:52.476 "base_bdevs_list": [ 00:12:52.476 { 00:12:52.476 "name": "BaseBdev1", 00:12:52.477 "uuid": "9355c893-4631-4ad8-b353-5a908114b04f", 00:12:52.477 "is_configured": true, 00:12:52.477 "data_offset": 2048, 00:12:52.477 "data_size": 63488 00:12:52.477 }, 00:12:52.477 { 00:12:52.477 "name": null, 00:12:52.477 "uuid": "aefc7fee-dd8a-4745-ab51-2d4024998bbd", 00:12:52.477 "is_configured": false, 00:12:52.477 "data_offset": 2048, 00:12:52.477 "data_size": 63488 00:12:52.477 }, 00:12:52.477 { 00:12:52.477 "name": null, 00:12:52.477 "uuid": "0d1be636-f209-4093-a99c-964758b3e07e", 00:12:52.477 "is_configured": false, 
00:12:52.477 "data_offset": 2048, 00:12:52.477 "data_size": 63488 00:12:52.477 } 00:12:52.477 ] 00:12:52.477 }' 00:12:52.477 17:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:52.477 17:25:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:53.047 17:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:53.047 17:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:53.307 17:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:12:53.307 17:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:12:53.567 [2024-07-15 17:25:04.631073] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:53.567 17:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:53.567 17:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:53.567 17:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:53.568 17:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:53.568 17:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:53.568 17:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:53.568 17:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:53.568 17:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:53.568 17:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:53.568 17:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:53.568 17:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:53.568 17:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:53.568 17:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:53.568 "name": "Existed_Raid", 00:12:53.568 "uuid": "9cf3485a-c093-48de-86f2-8fb2e4923720", 00:12:53.568 "strip_size_kb": 64, 00:12:53.568 "state": "configuring", 00:12:53.568 "raid_level": "raid0", 00:12:53.568 "superblock": true, 00:12:53.568 "num_base_bdevs": 3, 00:12:53.568 "num_base_bdevs_discovered": 2, 00:12:53.568 "num_base_bdevs_operational": 3, 00:12:53.568 "base_bdevs_list": [ 00:12:53.568 { 00:12:53.568 "name": "BaseBdev1", 00:12:53.568 "uuid": "9355c893-4631-4ad8-b353-5a908114b04f", 00:12:53.568 "is_configured": true, 00:12:53.568 "data_offset": 2048, 00:12:53.568 "data_size": 63488 00:12:53.568 }, 00:12:53.568 { 00:12:53.568 "name": null, 00:12:53.568 "uuid": "aefc7fee-dd8a-4745-ab51-2d4024998bbd", 00:12:53.568 "is_configured": false, 00:12:53.568 "data_offset": 
2048, 00:12:53.568 "data_size": 63488 00:12:53.568 }, 00:12:53.568 { 00:12:53.568 "name": "BaseBdev3", 00:12:53.568 "uuid": "0d1be636-f209-4093-a99c-964758b3e07e", 00:12:53.568 "is_configured": true, 00:12:53.568 "data_offset": 2048, 00:12:53.568 "data_size": 63488 00:12:53.568 } 00:12:53.568 ] 00:12:53.568 }' 00:12:53.568 17:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:53.568 17:25:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:54.177 17:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:54.177 17:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:54.437 17:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:12:54.437 17:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:54.697 [2024-07-15 17:25:05.773983] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:54.697 17:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:54.697 17:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:54.697 17:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:54.697 17:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:54.697 17:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:54.697 17:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:54.697 17:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:54.697 17:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:54.697 17:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:54.697 17:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:54.697 17:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:54.697 17:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:54.958 17:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:54.958 "name": "Existed_Raid", 00:12:54.958 "uuid": "9cf3485a-c093-48de-86f2-8fb2e4923720", 00:12:54.958 "strip_size_kb": 64, 00:12:54.958 "state": "configuring", 00:12:54.958 "raid_level": "raid0", 00:12:54.958 "superblock": true, 00:12:54.958 "num_base_bdevs": 3, 00:12:54.958 "num_base_bdevs_discovered": 1, 00:12:54.958 "num_base_bdevs_operational": 3, 00:12:54.958 "base_bdevs_list": [ 00:12:54.958 { 00:12:54.958 "name": null, 00:12:54.958 "uuid": "9355c893-4631-4ad8-b353-5a908114b04f", 00:12:54.958 "is_configured": false, 00:12:54.958 "data_offset": 2048, 00:12:54.958 "data_size": 63488 00:12:54.958 }, 00:12:54.958 
{ 00:12:54.958 "name": null, 00:12:54.958 "uuid": "aefc7fee-dd8a-4745-ab51-2d4024998bbd", 00:12:54.958 "is_configured": false, 00:12:54.958 "data_offset": 2048, 00:12:54.958 "data_size": 63488 00:12:54.958 }, 00:12:54.958 { 00:12:54.958 "name": "BaseBdev3", 00:12:54.958 "uuid": "0d1be636-f209-4093-a99c-964758b3e07e", 00:12:54.958 "is_configured": true, 00:12:54.958 "data_offset": 2048, 00:12:54.958 "data_size": 63488 00:12:54.958 } 00:12:54.958 ] 00:12:54.958 }' 00:12:54.958 17:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:54.958 17:25:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:55.896 17:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:55.896 17:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:55.896 17:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:12:55.896 17:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:12:56.156 [2024-07-15 17:25:07.279390] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:56.156 17:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:56.156 17:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:56.156 17:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:56.156 17:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:56.156 17:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:56.156 17:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:56.156 17:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:56.156 17:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:56.156 17:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:56.156 17:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:56.156 17:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:56.156 17:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:56.726 17:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:56.726 "name": "Existed_Raid", 00:12:56.726 "uuid": "9cf3485a-c093-48de-86f2-8fb2e4923720", 00:12:56.726 "strip_size_kb": 64, 00:12:56.726 "state": "configuring", 00:12:56.726 "raid_level": "raid0", 00:12:56.726 "superblock": true, 00:12:56.726 "num_base_bdevs": 3, 00:12:56.726 "num_base_bdevs_discovered": 2, 00:12:56.726 "num_base_bdevs_operational": 3, 00:12:56.726 "base_bdevs_list": [ 00:12:56.726 { 00:12:56.726 "name": 
null, 00:12:56.726 "uuid": "9355c893-4631-4ad8-b353-5a908114b04f", 00:12:56.726 "is_configured": false, 00:12:56.726 "data_offset": 2048, 00:12:56.726 "data_size": 63488 00:12:56.726 }, 00:12:56.726 { 00:12:56.726 "name": "BaseBdev2", 00:12:56.726 "uuid": "aefc7fee-dd8a-4745-ab51-2d4024998bbd", 00:12:56.726 "is_configured": true, 00:12:56.726 "data_offset": 2048, 00:12:56.726 "data_size": 63488 00:12:56.726 }, 00:12:56.726 { 00:12:56.726 "name": "BaseBdev3", 00:12:56.726 "uuid": "0d1be636-f209-4093-a99c-964758b3e07e", 00:12:56.726 "is_configured": true, 00:12:56.726 "data_offset": 2048, 00:12:56.726 "data_size": 63488 00:12:56.726 } 00:12:56.726 ] 00:12:56.726 }' 00:12:56.726 17:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:56.726 17:25:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:57.667 17:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:57.667 17:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:57.667 17:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:12:57.667 17:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:57.667 17:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:12:57.926 17:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 9355c893-4631-4ad8-b353-5a908114b04f 00:12:58.211 [2024-07-15 17:25:09.321472] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:12:58.211 [2024-07-15 17:25:09.321580] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x19163d0 00:12:58.211 [2024-07-15 17:25:09.321587] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:58.211 [2024-07-15 17:25:09.321733] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ac4c50 00:12:58.211 [2024-07-15 17:25:09.321819] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19163d0 00:12:58.211 [2024-07-15 17:25:09.321825] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x19163d0 00:12:58.211 [2024-07-15 17:25:09.321894] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:58.211 NewBaseBdev 00:12:58.211 17:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:12:58.211 17:25:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:12:58.211 17:25:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:58.211 17:25:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:58.211 17:25:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:58.211 17:25:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:58.211 17:25:09 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:58.471 17:25:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:12:59.043 [ 00:12:59.043 { 00:12:59.043 "name": "NewBaseBdev", 00:12:59.043 "aliases": [ 00:12:59.043 "9355c893-4631-4ad8-b353-5a908114b04f" 00:12:59.043 ], 00:12:59.043 "product_name": "Malloc disk", 00:12:59.043 "block_size": 512, 00:12:59.043 "num_blocks": 65536, 00:12:59.043 "uuid": "9355c893-4631-4ad8-b353-5a908114b04f", 00:12:59.043 "assigned_rate_limits": { 00:12:59.043 "rw_ios_per_sec": 0, 00:12:59.043 "rw_mbytes_per_sec": 0, 00:12:59.043 "r_mbytes_per_sec": 0, 00:12:59.043 "w_mbytes_per_sec": 0 00:12:59.043 }, 00:12:59.043 "claimed": true, 00:12:59.043 "claim_type": "exclusive_write", 00:12:59.043 "zoned": false, 00:12:59.043 "supported_io_types": { 00:12:59.043 "read": true, 00:12:59.043 "write": true, 00:12:59.043 "unmap": true, 00:12:59.043 "flush": true, 00:12:59.043 "reset": true, 00:12:59.043 "nvme_admin": false, 00:12:59.043 "nvme_io": false, 00:12:59.043 "nvme_io_md": false, 00:12:59.043 "write_zeroes": true, 00:12:59.043 "zcopy": true, 00:12:59.043 "get_zone_info": false, 00:12:59.043 "zone_management": false, 00:12:59.043 "zone_append": false, 00:12:59.043 "compare": false, 00:12:59.043 "compare_and_write": false, 00:12:59.043 "abort": true, 00:12:59.043 "seek_hole": false, 00:12:59.043 "seek_data": false, 00:12:59.043 "copy": true, 00:12:59.043 "nvme_iov_md": false 00:12:59.043 }, 00:12:59.043 "memory_domains": [ 00:12:59.043 { 00:12:59.043 "dma_device_id": "system", 00:12:59.043 "dma_device_type": 1 00:12:59.043 }, 00:12:59.043 { 00:12:59.043 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:59.043 "dma_device_type": 2 00:12:59.043 } 00:12:59.043 ], 00:12:59.043 "driver_specific": {} 00:12:59.043 } 00:12:59.043 ] 00:12:59.043 17:25:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:59.043 17:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:12:59.043 17:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:59.043 17:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:59.043 17:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:59.043 17:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:59.043 17:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:59.043 17:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:59.043 17:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:59.043 17:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:59.043 17:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:59.043 17:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:12:59.043 17:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:59.043 17:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:59.043 "name": "Existed_Raid", 00:12:59.043 "uuid": "9cf3485a-c093-48de-86f2-8fb2e4923720", 00:12:59.043 "strip_size_kb": 64, 00:12:59.043 "state": "online", 00:12:59.043 "raid_level": "raid0", 00:12:59.043 "superblock": true, 00:12:59.043 "num_base_bdevs": 3, 00:12:59.043 "num_base_bdevs_discovered": 3, 00:12:59.043 "num_base_bdevs_operational": 3, 00:12:59.043 "base_bdevs_list": [ 00:12:59.043 { 00:12:59.043 "name": "NewBaseBdev", 00:12:59.043 "uuid": "9355c893-4631-4ad8-b353-5a908114b04f", 00:12:59.043 "is_configured": true, 00:12:59.043 "data_offset": 2048, 00:12:59.043 "data_size": 63488 00:12:59.043 }, 00:12:59.043 { 00:12:59.043 "name": "BaseBdev2", 00:12:59.043 "uuid": "aefc7fee-dd8a-4745-ab51-2d4024998bbd", 00:12:59.043 "is_configured": true, 00:12:59.043 "data_offset": 2048, 00:12:59.043 "data_size": 63488 00:12:59.043 }, 00:12:59.043 { 00:12:59.043 "name": "BaseBdev3", 00:12:59.043 "uuid": "0d1be636-f209-4093-a99c-964758b3e07e", 00:12:59.043 "is_configured": true, 00:12:59.043 "data_offset": 2048, 00:12:59.043 "data_size": 63488 00:12:59.043 } 00:12:59.043 ] 00:12:59.043 }' 00:12:59.043 17:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:59.043 17:25:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:59.613 17:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:12:59.613 17:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:59.613 17:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:59.613 17:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:59.613 17:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:59.613 17:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:59.613 17:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:59.613 17:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:59.874 [2024-07-15 17:25:10.973916] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:59.874 17:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:59.874 "name": "Existed_Raid", 00:12:59.874 "aliases": [ 00:12:59.874 "9cf3485a-c093-48de-86f2-8fb2e4923720" 00:12:59.874 ], 00:12:59.874 "product_name": "Raid Volume", 00:12:59.874 "block_size": 512, 00:12:59.874 "num_blocks": 190464, 00:12:59.874 "uuid": "9cf3485a-c093-48de-86f2-8fb2e4923720", 00:12:59.874 "assigned_rate_limits": { 00:12:59.874 "rw_ios_per_sec": 0, 00:12:59.874 "rw_mbytes_per_sec": 0, 00:12:59.874 "r_mbytes_per_sec": 0, 00:12:59.874 "w_mbytes_per_sec": 0 00:12:59.874 }, 00:12:59.874 "claimed": false, 00:12:59.874 "zoned": false, 00:12:59.874 "supported_io_types": { 00:12:59.874 "read": true, 00:12:59.874 "write": true, 00:12:59.874 "unmap": true, 00:12:59.874 "flush": true, 00:12:59.874 "reset": true, 
00:12:59.874 "nvme_admin": false, 00:12:59.874 "nvme_io": false, 00:12:59.874 "nvme_io_md": false, 00:12:59.874 "write_zeroes": true, 00:12:59.874 "zcopy": false, 00:12:59.874 "get_zone_info": false, 00:12:59.874 "zone_management": false, 00:12:59.874 "zone_append": false, 00:12:59.874 "compare": false, 00:12:59.874 "compare_and_write": false, 00:12:59.874 "abort": false, 00:12:59.874 "seek_hole": false, 00:12:59.874 "seek_data": false, 00:12:59.874 "copy": false, 00:12:59.874 "nvme_iov_md": false 00:12:59.874 }, 00:12:59.874 "memory_domains": [ 00:12:59.874 { 00:12:59.874 "dma_device_id": "system", 00:12:59.874 "dma_device_type": 1 00:12:59.874 }, 00:12:59.874 { 00:12:59.874 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:59.874 "dma_device_type": 2 00:12:59.874 }, 00:12:59.874 { 00:12:59.874 "dma_device_id": "system", 00:12:59.874 "dma_device_type": 1 00:12:59.874 }, 00:12:59.874 { 00:12:59.874 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:59.874 "dma_device_type": 2 00:12:59.874 }, 00:12:59.874 { 00:12:59.874 "dma_device_id": "system", 00:12:59.874 "dma_device_type": 1 00:12:59.874 }, 00:12:59.874 { 00:12:59.874 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:59.874 "dma_device_type": 2 00:12:59.874 } 00:12:59.874 ], 00:12:59.874 "driver_specific": { 00:12:59.874 "raid": { 00:12:59.874 "uuid": "9cf3485a-c093-48de-86f2-8fb2e4923720", 00:12:59.874 "strip_size_kb": 64, 00:12:59.874 "state": "online", 00:12:59.874 "raid_level": "raid0", 00:12:59.874 "superblock": true, 00:12:59.874 "num_base_bdevs": 3, 00:12:59.874 "num_base_bdevs_discovered": 3, 00:12:59.874 "num_base_bdevs_operational": 3, 00:12:59.874 "base_bdevs_list": [ 00:12:59.874 { 00:12:59.874 "name": "NewBaseBdev", 00:12:59.874 "uuid": "9355c893-4631-4ad8-b353-5a908114b04f", 00:12:59.874 "is_configured": true, 00:12:59.874 "data_offset": 2048, 00:12:59.874 "data_size": 63488 00:12:59.874 }, 00:12:59.874 { 00:12:59.874 "name": "BaseBdev2", 00:12:59.874 "uuid": "aefc7fee-dd8a-4745-ab51-2d4024998bbd", 00:12:59.874 "is_configured": true, 00:12:59.874 "data_offset": 2048, 00:12:59.874 "data_size": 63488 00:12:59.874 }, 00:12:59.874 { 00:12:59.874 "name": "BaseBdev3", 00:12:59.874 "uuid": "0d1be636-f209-4093-a99c-964758b3e07e", 00:12:59.874 "is_configured": true, 00:12:59.874 "data_offset": 2048, 00:12:59.874 "data_size": 63488 00:12:59.874 } 00:12:59.874 ] 00:12:59.874 } 00:12:59.874 } 00:12:59.874 }' 00:12:59.874 17:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:59.874 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:12:59.874 BaseBdev2 00:12:59.874 BaseBdev3' 00:12:59.874 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:59.874 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:12:59.874 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:00.135 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:00.135 "name": "NewBaseBdev", 00:13:00.135 "aliases": [ 00:13:00.135 "9355c893-4631-4ad8-b353-5a908114b04f" 00:13:00.135 ], 00:13:00.135 "product_name": "Malloc disk", 00:13:00.135 "block_size": 512, 00:13:00.135 "num_blocks": 65536, 00:13:00.135 "uuid": 
"9355c893-4631-4ad8-b353-5a908114b04f", 00:13:00.135 "assigned_rate_limits": { 00:13:00.135 "rw_ios_per_sec": 0, 00:13:00.135 "rw_mbytes_per_sec": 0, 00:13:00.135 "r_mbytes_per_sec": 0, 00:13:00.135 "w_mbytes_per_sec": 0 00:13:00.135 }, 00:13:00.135 "claimed": true, 00:13:00.135 "claim_type": "exclusive_write", 00:13:00.135 "zoned": false, 00:13:00.135 "supported_io_types": { 00:13:00.135 "read": true, 00:13:00.136 "write": true, 00:13:00.136 "unmap": true, 00:13:00.136 "flush": true, 00:13:00.136 "reset": true, 00:13:00.136 "nvme_admin": false, 00:13:00.136 "nvme_io": false, 00:13:00.136 "nvme_io_md": false, 00:13:00.136 "write_zeroes": true, 00:13:00.136 "zcopy": true, 00:13:00.136 "get_zone_info": false, 00:13:00.136 "zone_management": false, 00:13:00.136 "zone_append": false, 00:13:00.136 "compare": false, 00:13:00.136 "compare_and_write": false, 00:13:00.136 "abort": true, 00:13:00.136 "seek_hole": false, 00:13:00.136 "seek_data": false, 00:13:00.136 "copy": true, 00:13:00.136 "nvme_iov_md": false 00:13:00.136 }, 00:13:00.136 "memory_domains": [ 00:13:00.136 { 00:13:00.136 "dma_device_id": "system", 00:13:00.136 "dma_device_type": 1 00:13:00.136 }, 00:13:00.136 { 00:13:00.136 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:00.136 "dma_device_type": 2 00:13:00.136 } 00:13:00.136 ], 00:13:00.136 "driver_specific": {} 00:13:00.136 }' 00:13:00.136 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:00.136 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:00.136 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:00.136 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:00.136 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:00.136 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:00.136 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:00.136 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:00.396 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:00.396 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:00.396 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:00.396 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:00.396 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:00.396 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:00.396 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:00.656 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:00.656 "name": "BaseBdev2", 00:13:00.656 "aliases": [ 00:13:00.656 "aefc7fee-dd8a-4745-ab51-2d4024998bbd" 00:13:00.656 ], 00:13:00.656 "product_name": "Malloc disk", 00:13:00.656 "block_size": 512, 00:13:00.656 "num_blocks": 65536, 00:13:00.656 "uuid": "aefc7fee-dd8a-4745-ab51-2d4024998bbd", 00:13:00.656 "assigned_rate_limits": { 00:13:00.656 "rw_ios_per_sec": 0, 00:13:00.656 
"rw_mbytes_per_sec": 0, 00:13:00.656 "r_mbytes_per_sec": 0, 00:13:00.656 "w_mbytes_per_sec": 0 00:13:00.656 }, 00:13:00.656 "claimed": true, 00:13:00.656 "claim_type": "exclusive_write", 00:13:00.656 "zoned": false, 00:13:00.656 "supported_io_types": { 00:13:00.656 "read": true, 00:13:00.656 "write": true, 00:13:00.656 "unmap": true, 00:13:00.656 "flush": true, 00:13:00.656 "reset": true, 00:13:00.656 "nvme_admin": false, 00:13:00.656 "nvme_io": false, 00:13:00.656 "nvme_io_md": false, 00:13:00.656 "write_zeroes": true, 00:13:00.656 "zcopy": true, 00:13:00.656 "get_zone_info": false, 00:13:00.656 "zone_management": false, 00:13:00.656 "zone_append": false, 00:13:00.656 "compare": false, 00:13:00.656 "compare_and_write": false, 00:13:00.656 "abort": true, 00:13:00.656 "seek_hole": false, 00:13:00.656 "seek_data": false, 00:13:00.656 "copy": true, 00:13:00.656 "nvme_iov_md": false 00:13:00.656 }, 00:13:00.656 "memory_domains": [ 00:13:00.656 { 00:13:00.656 "dma_device_id": "system", 00:13:00.656 "dma_device_type": 1 00:13:00.656 }, 00:13:00.656 { 00:13:00.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:00.656 "dma_device_type": 2 00:13:00.656 } 00:13:00.656 ], 00:13:00.656 "driver_specific": {} 00:13:00.656 }' 00:13:00.656 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:00.656 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:00.656 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:00.656 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:00.656 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:00.656 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:00.656 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:00.656 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:00.916 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:00.916 17:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:00.916 17:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:00.916 17:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:00.916 17:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:00.916 17:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:00.916 17:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:01.175 17:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:01.175 "name": "BaseBdev3", 00:13:01.175 "aliases": [ 00:13:01.175 "0d1be636-f209-4093-a99c-964758b3e07e" 00:13:01.175 ], 00:13:01.175 "product_name": "Malloc disk", 00:13:01.175 "block_size": 512, 00:13:01.175 "num_blocks": 65536, 00:13:01.175 "uuid": "0d1be636-f209-4093-a99c-964758b3e07e", 00:13:01.175 "assigned_rate_limits": { 00:13:01.175 "rw_ios_per_sec": 0, 00:13:01.175 "rw_mbytes_per_sec": 0, 00:13:01.175 "r_mbytes_per_sec": 0, 00:13:01.175 "w_mbytes_per_sec": 0 00:13:01.175 }, 00:13:01.175 
"claimed": true, 00:13:01.175 "claim_type": "exclusive_write", 00:13:01.175 "zoned": false, 00:13:01.175 "supported_io_types": { 00:13:01.175 "read": true, 00:13:01.175 "write": true, 00:13:01.175 "unmap": true, 00:13:01.175 "flush": true, 00:13:01.175 "reset": true, 00:13:01.175 "nvme_admin": false, 00:13:01.175 "nvme_io": false, 00:13:01.175 "nvme_io_md": false, 00:13:01.175 "write_zeroes": true, 00:13:01.175 "zcopy": true, 00:13:01.175 "get_zone_info": false, 00:13:01.175 "zone_management": false, 00:13:01.175 "zone_append": false, 00:13:01.175 "compare": false, 00:13:01.175 "compare_and_write": false, 00:13:01.175 "abort": true, 00:13:01.175 "seek_hole": false, 00:13:01.175 "seek_data": false, 00:13:01.175 "copy": true, 00:13:01.175 "nvme_iov_md": false 00:13:01.175 }, 00:13:01.175 "memory_domains": [ 00:13:01.175 { 00:13:01.175 "dma_device_id": "system", 00:13:01.175 "dma_device_type": 1 00:13:01.175 }, 00:13:01.175 { 00:13:01.175 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:01.175 "dma_device_type": 2 00:13:01.175 } 00:13:01.175 ], 00:13:01.175 "driver_specific": {} 00:13:01.175 }' 00:13:01.175 17:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:01.175 17:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:01.175 17:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:01.175 17:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:01.175 17:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:01.175 17:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:01.175 17:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:01.435 17:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:01.435 17:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:01.435 17:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:01.435 17:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:01.435 17:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:01.435 17:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:01.694 [2024-07-15 17:25:12.806354] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:01.694 [2024-07-15 17:25:12.806373] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:01.694 [2024-07-15 17:25:12.806409] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:01.694 [2024-07-15 17:25:12.806447] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:01.694 [2024-07-15 17:25:12.806453] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19163d0 name Existed_Raid, state offline 00:13:01.694 17:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2767733 00:13:01.694 17:25:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2767733 ']' 00:13:01.694 17:25:12 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@952 -- # kill -0 2767733 00:13:01.694 17:25:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:13:01.694 17:25:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:01.694 17:25:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2767733 00:13:01.694 17:25:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:01.694 17:25:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:01.694 17:25:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2767733' 00:13:01.694 killing process with pid 2767733 00:13:01.694 17:25:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2767733 00:13:01.694 [2024-07-15 17:25:12.873550] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:01.694 17:25:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2767733 00:13:01.694 [2024-07-15 17:25:12.888234] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:01.954 17:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:13:01.954 00:13:01.954 real 0m25.126s 00:13:01.954 user 0m47.206s 00:13:01.954 sys 0m3.652s 00:13:01.954 17:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:01.954 17:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:01.954 ************************************ 00:13:01.954 END TEST raid_state_function_test_sb 00:13:01.954 ************************************ 00:13:01.954 17:25:13 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:01.954 17:25:13 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:13:01.954 17:25:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:13:01.954 17:25:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:01.954 17:25:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:01.954 ************************************ 00:13:01.954 START TEST raid_superblock_test 00:13:01.954 ************************************ 00:13:01.954 17:25:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 3 00:13:01.954 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:13:01.954 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:13:01.954 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:13:01.954 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:13:01.954 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:13:01.954 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:13:01.954 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:13:01.954 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:13:01.954 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:13:01.954 17:25:13 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@398 -- # local strip_size 00:13:01.954 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:13:01.954 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:13:01.954 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:13:01.954 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:13:01.954 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:13:01.954 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:13:01.954 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2772528 00:13:01.954 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2772528 /var/tmp/spdk-raid.sock 00:13:01.954 17:25:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2772528 ']' 00:13:01.954 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:13:01.954 17:25:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:01.954 17:25:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:01.954 17:25:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:01.954 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:01.954 17:25:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:01.954 17:25:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:01.954 [2024-07-15 17:25:13.134299] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:13:01.954 [2024-07-15 17:25:13.134347] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2772528 ] 00:13:01.954 [2024-07-15 17:25:13.221108] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:02.214 [2024-07-15 17:25:13.286211] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:02.214 [2024-07-15 17:25:13.331714] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:02.214 [2024-07-15 17:25:13.331736] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:02.782 17:25:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:02.782 17:25:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:13:02.782 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:13:02.782 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:02.782 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:13:02.782 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:13:02.782 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:13:02.782 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:02.782 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:02.782 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:02.782 17:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:13:03.041 malloc1 00:13:03.041 17:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:03.041 [2024-07-15 17:25:14.338142] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:03.041 [2024-07-15 17:25:14.338174] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:03.041 [2024-07-15 17:25:14.338185] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20dda20 00:13:03.041 [2024-07-15 17:25:14.338192] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:03.301 [2024-07-15 17:25:14.339495] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:03.301 [2024-07-15 17:25:14.339515] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:03.301 pt1 00:13:03.301 17:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:03.301 17:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:03.301 17:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:13:03.301 17:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:13:03.301 17:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:13:03.301 17:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:03.301 17:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:03.301 17:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:03.301 17:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:13:03.301 malloc2 00:13:03.301 17:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:03.561 [2024-07-15 17:25:14.717119] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:03.561 [2024-07-15 17:25:14.717148] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:03.561 [2024-07-15 17:25:14.717160] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20de040 00:13:03.561 [2024-07-15 17:25:14.717167] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:03.561 [2024-07-15 17:25:14.718351] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:03.561 [2024-07-15 17:25:14.718370] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:03.561 pt2 00:13:03.561 17:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:03.561 17:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:03.561 17:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:13:03.561 17:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:13:03.561 17:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:13:03.561 17:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:03.561 17:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:03.561 17:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:03.561 17:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:13:03.820 malloc3 00:13:03.820 17:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:13:03.820 [2024-07-15 17:25:15.096165] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:03.820 [2024-07-15 17:25:15.096192] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:03.820 [2024-07-15 17:25:15.096205] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20de540 00:13:03.820 [2024-07-15 17:25:15.096212] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:03.820 [2024-07-15 17:25:15.097396] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:03.820 [2024-07-15 17:25:15.097415] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:03.820 pt3 00:13:03.820 17:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:03.820 17:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:03.820 17:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:13:04.079 [2024-07-15 17:25:15.272625] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:04.079 [2024-07-15 17:25:15.273620] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:04.079 [2024-07-15 17:25:15.273660] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:04.079 [2024-07-15 17:25:15.273785] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x228aa90 00:13:04.079 [2024-07-15 17:25:15.273792] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:04.079 [2024-07-15 17:25:15.273939] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2286c50 00:13:04.079 [2024-07-15 17:25:15.274046] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x228aa90 00:13:04.079 [2024-07-15 17:25:15.274051] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x228aa90 00:13:04.079 [2024-07-15 17:25:15.274119] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:04.080 17:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:04.080 17:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:04.080 17:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:04.080 17:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:04.080 17:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:04.080 17:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:04.080 17:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:04.080 17:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:04.080 17:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:04.080 17:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:04.080 17:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:04.080 17:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:04.339 17:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:04.339 "name": "raid_bdev1", 00:13:04.339 "uuid": "2f88ac5e-7064-40c5-976c-fee9cd641f5e", 00:13:04.339 "strip_size_kb": 64, 00:13:04.339 "state": "online", 00:13:04.339 "raid_level": "raid0", 00:13:04.339 "superblock": true, 00:13:04.339 "num_base_bdevs": 3, 
00:13:04.339 "num_base_bdevs_discovered": 3, 00:13:04.339 "num_base_bdevs_operational": 3, 00:13:04.339 "base_bdevs_list": [ 00:13:04.339 { 00:13:04.339 "name": "pt1", 00:13:04.339 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:04.339 "is_configured": true, 00:13:04.339 "data_offset": 2048, 00:13:04.339 "data_size": 63488 00:13:04.339 }, 00:13:04.339 { 00:13:04.339 "name": "pt2", 00:13:04.339 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:04.339 "is_configured": true, 00:13:04.339 "data_offset": 2048, 00:13:04.339 "data_size": 63488 00:13:04.339 }, 00:13:04.339 { 00:13:04.339 "name": "pt3", 00:13:04.339 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:04.339 "is_configured": true, 00:13:04.339 "data_offset": 2048, 00:13:04.339 "data_size": 63488 00:13:04.339 } 00:13:04.339 ] 00:13:04.339 }' 00:13:04.339 17:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:04.339 17:25:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:04.908 17:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:13:04.908 17:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:04.908 17:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:04.908 17:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:04.908 17:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:04.908 17:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:04.908 17:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:04.908 17:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:04.908 [2024-07-15 17:25:16.171101] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:04.908 17:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:04.908 "name": "raid_bdev1", 00:13:04.908 "aliases": [ 00:13:04.908 "2f88ac5e-7064-40c5-976c-fee9cd641f5e" 00:13:04.908 ], 00:13:04.908 "product_name": "Raid Volume", 00:13:04.908 "block_size": 512, 00:13:04.908 "num_blocks": 190464, 00:13:04.908 "uuid": "2f88ac5e-7064-40c5-976c-fee9cd641f5e", 00:13:04.908 "assigned_rate_limits": { 00:13:04.908 "rw_ios_per_sec": 0, 00:13:04.908 "rw_mbytes_per_sec": 0, 00:13:04.908 "r_mbytes_per_sec": 0, 00:13:04.908 "w_mbytes_per_sec": 0 00:13:04.908 }, 00:13:04.908 "claimed": false, 00:13:04.908 "zoned": false, 00:13:04.908 "supported_io_types": { 00:13:04.908 "read": true, 00:13:04.908 "write": true, 00:13:04.908 "unmap": true, 00:13:04.908 "flush": true, 00:13:04.908 "reset": true, 00:13:04.908 "nvme_admin": false, 00:13:04.908 "nvme_io": false, 00:13:04.908 "nvme_io_md": false, 00:13:04.908 "write_zeroes": true, 00:13:04.908 "zcopy": false, 00:13:04.908 "get_zone_info": false, 00:13:04.908 "zone_management": false, 00:13:04.908 "zone_append": false, 00:13:04.908 "compare": false, 00:13:04.908 "compare_and_write": false, 00:13:04.908 "abort": false, 00:13:04.908 "seek_hole": false, 00:13:04.908 "seek_data": false, 00:13:04.908 "copy": false, 00:13:04.908 "nvme_iov_md": false 00:13:04.908 }, 00:13:04.908 "memory_domains": [ 00:13:04.908 { 00:13:04.908 "dma_device_id": "system", 00:13:04.908 "dma_device_type": 1 
00:13:04.908 }, 00:13:04.908 { 00:13:04.908 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:04.908 "dma_device_type": 2 00:13:04.908 }, 00:13:04.908 { 00:13:04.908 "dma_device_id": "system", 00:13:04.908 "dma_device_type": 1 00:13:04.908 }, 00:13:04.908 { 00:13:04.908 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:04.908 "dma_device_type": 2 00:13:04.908 }, 00:13:04.908 { 00:13:04.908 "dma_device_id": "system", 00:13:04.908 "dma_device_type": 1 00:13:04.908 }, 00:13:04.908 { 00:13:04.908 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:04.908 "dma_device_type": 2 00:13:04.908 } 00:13:04.908 ], 00:13:04.908 "driver_specific": { 00:13:04.908 "raid": { 00:13:04.908 "uuid": "2f88ac5e-7064-40c5-976c-fee9cd641f5e", 00:13:04.908 "strip_size_kb": 64, 00:13:04.908 "state": "online", 00:13:04.908 "raid_level": "raid0", 00:13:04.908 "superblock": true, 00:13:04.908 "num_base_bdevs": 3, 00:13:04.908 "num_base_bdevs_discovered": 3, 00:13:04.908 "num_base_bdevs_operational": 3, 00:13:04.908 "base_bdevs_list": [ 00:13:04.908 { 00:13:04.908 "name": "pt1", 00:13:04.908 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:04.908 "is_configured": true, 00:13:04.908 "data_offset": 2048, 00:13:04.908 "data_size": 63488 00:13:04.908 }, 00:13:04.908 { 00:13:04.908 "name": "pt2", 00:13:04.908 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:04.908 "is_configured": true, 00:13:04.908 "data_offset": 2048, 00:13:04.908 "data_size": 63488 00:13:04.908 }, 00:13:04.908 { 00:13:04.908 "name": "pt3", 00:13:04.908 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:04.908 "is_configured": true, 00:13:04.908 "data_offset": 2048, 00:13:04.908 "data_size": 63488 00:13:04.908 } 00:13:04.908 ] 00:13:04.908 } 00:13:04.908 } 00:13:04.908 }' 00:13:04.908 17:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:05.168 17:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:05.168 pt2 00:13:05.168 pt3' 00:13:05.168 17:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:05.168 17:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:05.168 17:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:05.168 17:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:05.168 "name": "pt1", 00:13:05.168 "aliases": [ 00:13:05.168 "00000000-0000-0000-0000-000000000001" 00:13:05.168 ], 00:13:05.168 "product_name": "passthru", 00:13:05.168 "block_size": 512, 00:13:05.168 "num_blocks": 65536, 00:13:05.168 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:05.168 "assigned_rate_limits": { 00:13:05.168 "rw_ios_per_sec": 0, 00:13:05.168 "rw_mbytes_per_sec": 0, 00:13:05.168 "r_mbytes_per_sec": 0, 00:13:05.168 "w_mbytes_per_sec": 0 00:13:05.168 }, 00:13:05.168 "claimed": true, 00:13:05.168 "claim_type": "exclusive_write", 00:13:05.168 "zoned": false, 00:13:05.168 "supported_io_types": { 00:13:05.168 "read": true, 00:13:05.168 "write": true, 00:13:05.168 "unmap": true, 00:13:05.168 "flush": true, 00:13:05.168 "reset": true, 00:13:05.168 "nvme_admin": false, 00:13:05.168 "nvme_io": false, 00:13:05.168 "nvme_io_md": false, 00:13:05.168 "write_zeroes": true, 00:13:05.168 "zcopy": true, 00:13:05.168 "get_zone_info": false, 00:13:05.168 "zone_management": false, 
00:13:05.168 "zone_append": false, 00:13:05.168 "compare": false, 00:13:05.168 "compare_and_write": false, 00:13:05.168 "abort": true, 00:13:05.168 "seek_hole": false, 00:13:05.168 "seek_data": false, 00:13:05.168 "copy": true, 00:13:05.168 "nvme_iov_md": false 00:13:05.168 }, 00:13:05.168 "memory_domains": [ 00:13:05.168 { 00:13:05.168 "dma_device_id": "system", 00:13:05.168 "dma_device_type": 1 00:13:05.168 }, 00:13:05.168 { 00:13:05.168 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:05.168 "dma_device_type": 2 00:13:05.168 } 00:13:05.168 ], 00:13:05.168 "driver_specific": { 00:13:05.168 "passthru": { 00:13:05.168 "name": "pt1", 00:13:05.168 "base_bdev_name": "malloc1" 00:13:05.168 } 00:13:05.168 } 00:13:05.168 }' 00:13:05.168 17:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:05.427 17:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:05.427 17:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:05.427 17:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:05.427 17:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:05.427 17:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:05.427 17:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:05.427 17:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:05.427 17:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:05.427 17:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:05.686 17:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:05.686 17:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:05.686 17:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:05.686 17:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:05.686 17:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:05.686 17:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:05.686 "name": "pt2", 00:13:05.686 "aliases": [ 00:13:05.686 "00000000-0000-0000-0000-000000000002" 00:13:05.686 ], 00:13:05.686 "product_name": "passthru", 00:13:05.686 "block_size": 512, 00:13:05.686 "num_blocks": 65536, 00:13:05.686 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:05.686 "assigned_rate_limits": { 00:13:05.686 "rw_ios_per_sec": 0, 00:13:05.686 "rw_mbytes_per_sec": 0, 00:13:05.686 "r_mbytes_per_sec": 0, 00:13:05.686 "w_mbytes_per_sec": 0 00:13:05.686 }, 00:13:05.686 "claimed": true, 00:13:05.686 "claim_type": "exclusive_write", 00:13:05.686 "zoned": false, 00:13:05.686 "supported_io_types": { 00:13:05.686 "read": true, 00:13:05.686 "write": true, 00:13:05.686 "unmap": true, 00:13:05.686 "flush": true, 00:13:05.686 "reset": true, 00:13:05.686 "nvme_admin": false, 00:13:05.686 "nvme_io": false, 00:13:05.686 "nvme_io_md": false, 00:13:05.686 "write_zeroes": true, 00:13:05.686 "zcopy": true, 00:13:05.686 "get_zone_info": false, 00:13:05.686 "zone_management": false, 00:13:05.686 "zone_append": false, 00:13:05.686 "compare": false, 00:13:05.686 "compare_and_write": false, 00:13:05.686 "abort": true, 
00:13:05.686 "seek_hole": false, 00:13:05.686 "seek_data": false, 00:13:05.686 "copy": true, 00:13:05.686 "nvme_iov_md": false 00:13:05.686 }, 00:13:05.686 "memory_domains": [ 00:13:05.686 { 00:13:05.686 "dma_device_id": "system", 00:13:05.686 "dma_device_type": 1 00:13:05.686 }, 00:13:05.686 { 00:13:05.686 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:05.686 "dma_device_type": 2 00:13:05.686 } 00:13:05.686 ], 00:13:05.686 "driver_specific": { 00:13:05.686 "passthru": { 00:13:05.686 "name": "pt2", 00:13:05.686 "base_bdev_name": "malloc2" 00:13:05.686 } 00:13:05.686 } 00:13:05.686 }' 00:13:05.686 17:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:05.946 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:05.946 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:05.946 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:05.946 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:05.946 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:05.946 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:05.946 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:05.946 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:05.946 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:06.205 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:06.205 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:06.205 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:06.205 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:13:06.205 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:06.465 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:06.465 "name": "pt3", 00:13:06.465 "aliases": [ 00:13:06.465 "00000000-0000-0000-0000-000000000003" 00:13:06.465 ], 00:13:06.465 "product_name": "passthru", 00:13:06.465 "block_size": 512, 00:13:06.465 "num_blocks": 65536, 00:13:06.465 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:06.465 "assigned_rate_limits": { 00:13:06.465 "rw_ios_per_sec": 0, 00:13:06.465 "rw_mbytes_per_sec": 0, 00:13:06.465 "r_mbytes_per_sec": 0, 00:13:06.465 "w_mbytes_per_sec": 0 00:13:06.465 }, 00:13:06.465 "claimed": true, 00:13:06.465 "claim_type": "exclusive_write", 00:13:06.465 "zoned": false, 00:13:06.465 "supported_io_types": { 00:13:06.465 "read": true, 00:13:06.465 "write": true, 00:13:06.465 "unmap": true, 00:13:06.466 "flush": true, 00:13:06.466 "reset": true, 00:13:06.466 "nvme_admin": false, 00:13:06.466 "nvme_io": false, 00:13:06.466 "nvme_io_md": false, 00:13:06.466 "write_zeroes": true, 00:13:06.466 "zcopy": true, 00:13:06.466 "get_zone_info": false, 00:13:06.466 "zone_management": false, 00:13:06.466 "zone_append": false, 00:13:06.466 "compare": false, 00:13:06.466 "compare_and_write": false, 00:13:06.466 "abort": true, 00:13:06.466 "seek_hole": false, 00:13:06.466 "seek_data": false, 00:13:06.466 "copy": true, 00:13:06.466 "nvme_iov_md": false 
00:13:06.466 }, 00:13:06.466 "memory_domains": [ 00:13:06.466 { 00:13:06.466 "dma_device_id": "system", 00:13:06.466 "dma_device_type": 1 00:13:06.466 }, 00:13:06.466 { 00:13:06.466 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:06.466 "dma_device_type": 2 00:13:06.466 } 00:13:06.466 ], 00:13:06.466 "driver_specific": { 00:13:06.466 "passthru": { 00:13:06.466 "name": "pt3", 00:13:06.466 "base_bdev_name": "malloc3" 00:13:06.466 } 00:13:06.466 } 00:13:06.466 }' 00:13:06.466 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:06.466 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:06.466 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:06.466 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:06.466 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:06.466 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:06.466 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:06.466 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:06.466 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:06.466 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:06.726 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:06.726 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:06.726 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:06.726 17:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:13:06.726 [2024-07-15 17:25:18.011767] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:06.985 17:25:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=2f88ac5e-7064-40c5-976c-fee9cd641f5e 00:13:06.985 17:25:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 2f88ac5e-7064-40c5-976c-fee9cd641f5e ']' 00:13:06.985 17:25:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:06.986 [2024-07-15 17:25:18.188012] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:06.986 [2024-07-15 17:25:18.188027] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:06.986 [2024-07-15 17:25:18.188061] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:06.986 [2024-07-15 17:25:18.188099] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:06.986 [2024-07-15 17:25:18.188105] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x228aa90 name raid_bdev1, state offline 00:13:06.986 17:25:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:06.986 17:25:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:13:07.246 17:25:18 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:13:07.246 17:25:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:13:07.246 17:25:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:07.246 17:25:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:07.506 17:25:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:07.506 17:25:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:07.506 17:25:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:07.506 17:25:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:13:07.765 17:25:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:13:07.765 17:25:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:13:08.025 17:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:13:08.026 17:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:08.026 17:25:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:13:08.026 17:25:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:08.026 17:25:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:08.026 17:25:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:08.026 17:25:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:08.026 17:25:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:08.026 17:25:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:08.026 17:25:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:08.026 17:25:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:08.026 17:25:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:13:08.026 17:25:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 
malloc2 malloc3' -n raid_bdev1 00:13:08.330 [2024-07-15 17:25:19.326854] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:13:08.330 [2024-07-15 17:25:19.327923] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:13:08.330 [2024-07-15 17:25:19.327956] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:13:08.330 [2024-07-15 17:25:19.327990] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:13:08.330 [2024-07-15 17:25:19.328017] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:13:08.330 [2024-07-15 17:25:19.328031] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:13:08.330 [2024-07-15 17:25:19.328041] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:08.330 [2024-07-15 17:25:19.328046] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2286bf0 name raid_bdev1, state configuring 00:13:08.330 request: 00:13:08.330 { 00:13:08.330 "name": "raid_bdev1", 00:13:08.330 "raid_level": "raid0", 00:13:08.330 "base_bdevs": [ 00:13:08.330 "malloc1", 00:13:08.330 "malloc2", 00:13:08.330 "malloc3" 00:13:08.330 ], 00:13:08.330 "strip_size_kb": 64, 00:13:08.330 "superblock": false, 00:13:08.330 "method": "bdev_raid_create", 00:13:08.330 "req_id": 1 00:13:08.330 } 00:13:08.330 Got JSON-RPC error response 00:13:08.330 response: 00:13:08.330 { 00:13:08.330 "code": -17, 00:13:08.330 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:13:08.330 } 00:13:08.330 17:25:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:13:08.330 17:25:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:13:08.330 17:25:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:13:08.330 17:25:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:13:08.330 17:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:08.330 17:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:13:08.330 17:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:13:08.330 17:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:13:08.330 17:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:08.591 [2024-07-15 17:25:19.711774] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:08.591 [2024-07-15 17:25:19.711794] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:08.591 [2024-07-15 17:25:19.711804] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20dee00 00:13:08.591 [2024-07-15 17:25:19.711810] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:08.591 [2024-07-15 17:25:19.713054] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:08.591 [2024-07-15 17:25:19.713073] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:08.591 [2024-07-15 17:25:19.713113] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:08.591 [2024-07-15 17:25:19.713131] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:08.591 pt1 00:13:08.591 17:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:13:08.591 17:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:08.591 17:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:08.591 17:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:08.591 17:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:08.591 17:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:08.591 17:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:08.591 17:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:08.591 17:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:08.591 17:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:08.591 17:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:08.591 17:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:08.851 17:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:08.851 "name": "raid_bdev1", 00:13:08.851 "uuid": "2f88ac5e-7064-40c5-976c-fee9cd641f5e", 00:13:08.851 "strip_size_kb": 64, 00:13:08.851 "state": "configuring", 00:13:08.851 "raid_level": "raid0", 00:13:08.851 "superblock": true, 00:13:08.851 "num_base_bdevs": 3, 00:13:08.851 "num_base_bdevs_discovered": 1, 00:13:08.851 "num_base_bdevs_operational": 3, 00:13:08.851 "base_bdevs_list": [ 00:13:08.851 { 00:13:08.851 "name": "pt1", 00:13:08.851 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:08.851 "is_configured": true, 00:13:08.851 "data_offset": 2048, 00:13:08.851 "data_size": 63488 00:13:08.851 }, 00:13:08.851 { 00:13:08.851 "name": null, 00:13:08.851 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:08.851 "is_configured": false, 00:13:08.851 "data_offset": 2048, 00:13:08.851 "data_size": 63488 00:13:08.851 }, 00:13:08.851 { 00:13:08.851 "name": null, 00:13:08.851 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:08.851 "is_configured": false, 00:13:08.851 "data_offset": 2048, 00:13:08.851 "data_size": 63488 00:13:08.851 } 00:13:08.851 ] 00:13:08.851 }' 00:13:08.851 17:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:08.851 17:25:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:09.111 17:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:13:09.111 17:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:09.372 [2024-07-15 17:25:20.577969] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:09.372 [2024-07-15 17:25:20.578002] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:09.372 [2024-07-15 17:25:20.578014] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20ddc50 00:13:09.372 [2024-07-15 17:25:20.578020] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:09.372 [2024-07-15 17:25:20.578285] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:09.372 [2024-07-15 17:25:20.578296] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:09.372 [2024-07-15 17:25:20.578337] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:09.372 [2024-07-15 17:25:20.578350] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:09.372 pt2 00:13:09.372 17:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:09.632 [2024-07-15 17:25:20.754413] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:13:09.632 17:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:13:09.632 17:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:09.632 17:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:09.632 17:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:09.632 17:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:09.632 17:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:09.632 17:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:09.632 17:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:09.632 17:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:09.632 17:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:09.632 17:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:09.632 17:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:09.891 17:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:09.891 "name": "raid_bdev1", 00:13:09.891 "uuid": "2f88ac5e-7064-40c5-976c-fee9cd641f5e", 00:13:09.891 "strip_size_kb": 64, 00:13:09.891 "state": "configuring", 00:13:09.891 "raid_level": "raid0", 00:13:09.891 "superblock": true, 00:13:09.891 "num_base_bdevs": 3, 00:13:09.891 "num_base_bdevs_discovered": 1, 00:13:09.891 "num_base_bdevs_operational": 3, 00:13:09.891 "base_bdevs_list": [ 00:13:09.891 { 00:13:09.891 "name": "pt1", 00:13:09.891 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:09.891 "is_configured": true, 00:13:09.891 "data_offset": 2048, 00:13:09.891 "data_size": 63488 00:13:09.891 }, 00:13:09.891 { 00:13:09.891 "name": null, 00:13:09.891 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:09.891 "is_configured": false, 00:13:09.891 
"data_offset": 2048, 00:13:09.891 "data_size": 63488 00:13:09.891 }, 00:13:09.891 { 00:13:09.891 "name": null, 00:13:09.891 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:09.891 "is_configured": false, 00:13:09.891 "data_offset": 2048, 00:13:09.891 "data_size": 63488 00:13:09.891 } 00:13:09.891 ] 00:13:09.891 }' 00:13:09.891 17:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:09.891 17:25:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:10.461 17:25:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:13:10.461 17:25:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:10.461 17:25:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:10.461 [2024-07-15 17:25:21.668729] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:10.461 [2024-07-15 17:25:21.668756] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:10.461 [2024-07-15 17:25:21.668766] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22893f0 00:13:10.461 [2024-07-15 17:25:21.668772] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:10.461 [2024-07-15 17:25:21.669026] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:10.461 [2024-07-15 17:25:21.669036] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:10.461 [2024-07-15 17:25:21.669074] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:10.461 [2024-07-15 17:25:21.669085] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:10.461 pt2 00:13:10.461 17:25:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:10.461 17:25:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:10.461 17:25:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:13:10.721 [2024-07-15 17:25:21.865219] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:10.721 [2024-07-15 17:25:21.865236] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:10.721 [2024-07-15 17:25:21.865244] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2287aa0 00:13:10.721 [2024-07-15 17:25:21.865250] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:10.721 [2024-07-15 17:25:21.865462] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:10.721 [2024-07-15 17:25:21.865472] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:10.721 [2024-07-15 17:25:21.865508] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:13:10.721 [2024-07-15 17:25:21.865519] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:10.721 [2024-07-15 17:25:21.865597] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2288100 00:13:10.721 [2024-07-15 17:25:21.865603] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:10.721 [2024-07-15 17:25:21.865740] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2159280 00:13:10.721 [2024-07-15 17:25:21.865837] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2288100 00:13:10.721 [2024-07-15 17:25:21.865842] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2288100 00:13:10.721 [2024-07-15 17:25:21.865912] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:10.721 pt3 00:13:10.721 17:25:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:10.721 17:25:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:10.721 17:25:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:10.721 17:25:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:10.721 17:25:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:10.721 17:25:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:10.721 17:25:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:10.721 17:25:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:10.721 17:25:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:10.721 17:25:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:10.721 17:25:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:10.721 17:25:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:10.721 17:25:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:10.721 17:25:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:10.982 17:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:10.982 "name": "raid_bdev1", 00:13:10.982 "uuid": "2f88ac5e-7064-40c5-976c-fee9cd641f5e", 00:13:10.982 "strip_size_kb": 64, 00:13:10.982 "state": "online", 00:13:10.982 "raid_level": "raid0", 00:13:10.982 "superblock": true, 00:13:10.982 "num_base_bdevs": 3, 00:13:10.982 "num_base_bdevs_discovered": 3, 00:13:10.982 "num_base_bdevs_operational": 3, 00:13:10.982 "base_bdevs_list": [ 00:13:10.982 { 00:13:10.982 "name": "pt1", 00:13:10.982 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:10.982 "is_configured": true, 00:13:10.982 "data_offset": 2048, 00:13:10.982 "data_size": 63488 00:13:10.982 }, 00:13:10.982 { 00:13:10.982 "name": "pt2", 00:13:10.982 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:10.982 "is_configured": true, 00:13:10.982 "data_offset": 2048, 00:13:10.982 "data_size": 63488 00:13:10.982 }, 00:13:10.982 { 00:13:10.982 "name": "pt3", 00:13:10.982 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:10.982 "is_configured": true, 00:13:10.982 "data_offset": 2048, 00:13:10.982 "data_size": 63488 00:13:10.982 } 00:13:10.982 ] 00:13:10.982 }' 00:13:10.982 17:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:10.982 17:25:22 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:11.553 17:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:13:11.553 17:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:11.553 17:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:11.553 17:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:11.553 17:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:11.553 17:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:11.553 17:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:11.553 17:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:11.553 [2024-07-15 17:25:22.807819] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:11.553 17:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:11.553 "name": "raid_bdev1", 00:13:11.553 "aliases": [ 00:13:11.553 "2f88ac5e-7064-40c5-976c-fee9cd641f5e" 00:13:11.553 ], 00:13:11.553 "product_name": "Raid Volume", 00:13:11.553 "block_size": 512, 00:13:11.553 "num_blocks": 190464, 00:13:11.553 "uuid": "2f88ac5e-7064-40c5-976c-fee9cd641f5e", 00:13:11.553 "assigned_rate_limits": { 00:13:11.553 "rw_ios_per_sec": 0, 00:13:11.553 "rw_mbytes_per_sec": 0, 00:13:11.553 "r_mbytes_per_sec": 0, 00:13:11.553 "w_mbytes_per_sec": 0 00:13:11.553 }, 00:13:11.553 "claimed": false, 00:13:11.553 "zoned": false, 00:13:11.553 "supported_io_types": { 00:13:11.553 "read": true, 00:13:11.553 "write": true, 00:13:11.553 "unmap": true, 00:13:11.553 "flush": true, 00:13:11.553 "reset": true, 00:13:11.553 "nvme_admin": false, 00:13:11.553 "nvme_io": false, 00:13:11.553 "nvme_io_md": false, 00:13:11.553 "write_zeroes": true, 00:13:11.553 "zcopy": false, 00:13:11.553 "get_zone_info": false, 00:13:11.553 "zone_management": false, 00:13:11.553 "zone_append": false, 00:13:11.553 "compare": false, 00:13:11.553 "compare_and_write": false, 00:13:11.553 "abort": false, 00:13:11.553 "seek_hole": false, 00:13:11.553 "seek_data": false, 00:13:11.553 "copy": false, 00:13:11.553 "nvme_iov_md": false 00:13:11.553 }, 00:13:11.553 "memory_domains": [ 00:13:11.553 { 00:13:11.553 "dma_device_id": "system", 00:13:11.553 "dma_device_type": 1 00:13:11.553 }, 00:13:11.553 { 00:13:11.553 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:11.553 "dma_device_type": 2 00:13:11.553 }, 00:13:11.553 { 00:13:11.553 "dma_device_id": "system", 00:13:11.553 "dma_device_type": 1 00:13:11.553 }, 00:13:11.553 { 00:13:11.553 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:11.553 "dma_device_type": 2 00:13:11.553 }, 00:13:11.553 { 00:13:11.553 "dma_device_id": "system", 00:13:11.553 "dma_device_type": 1 00:13:11.553 }, 00:13:11.553 { 00:13:11.553 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:11.553 "dma_device_type": 2 00:13:11.553 } 00:13:11.553 ], 00:13:11.553 "driver_specific": { 00:13:11.553 "raid": { 00:13:11.553 "uuid": "2f88ac5e-7064-40c5-976c-fee9cd641f5e", 00:13:11.553 "strip_size_kb": 64, 00:13:11.553 "state": "online", 00:13:11.553 "raid_level": "raid0", 00:13:11.553 "superblock": true, 00:13:11.553 "num_base_bdevs": 3, 00:13:11.553 "num_base_bdevs_discovered": 3, 
00:13:11.553 "num_base_bdevs_operational": 3, 00:13:11.553 "base_bdevs_list": [ 00:13:11.553 { 00:13:11.553 "name": "pt1", 00:13:11.553 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:11.553 "is_configured": true, 00:13:11.553 "data_offset": 2048, 00:13:11.553 "data_size": 63488 00:13:11.553 }, 00:13:11.553 { 00:13:11.553 "name": "pt2", 00:13:11.553 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:11.553 "is_configured": true, 00:13:11.553 "data_offset": 2048, 00:13:11.553 "data_size": 63488 00:13:11.553 }, 00:13:11.553 { 00:13:11.553 "name": "pt3", 00:13:11.553 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:11.553 "is_configured": true, 00:13:11.553 "data_offset": 2048, 00:13:11.553 "data_size": 63488 00:13:11.553 } 00:13:11.553 ] 00:13:11.553 } 00:13:11.553 } 00:13:11.553 }' 00:13:11.553 17:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:11.813 17:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:11.813 pt2 00:13:11.813 pt3' 00:13:11.813 17:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:11.813 17:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:11.813 17:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:11.813 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:11.813 "name": "pt1", 00:13:11.813 "aliases": [ 00:13:11.813 "00000000-0000-0000-0000-000000000001" 00:13:11.813 ], 00:13:11.813 "product_name": "passthru", 00:13:11.813 "block_size": 512, 00:13:11.813 "num_blocks": 65536, 00:13:11.813 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:11.813 "assigned_rate_limits": { 00:13:11.813 "rw_ios_per_sec": 0, 00:13:11.813 "rw_mbytes_per_sec": 0, 00:13:11.813 "r_mbytes_per_sec": 0, 00:13:11.813 "w_mbytes_per_sec": 0 00:13:11.813 }, 00:13:11.813 "claimed": true, 00:13:11.813 "claim_type": "exclusive_write", 00:13:11.813 "zoned": false, 00:13:11.813 "supported_io_types": { 00:13:11.813 "read": true, 00:13:11.813 "write": true, 00:13:11.813 "unmap": true, 00:13:11.813 "flush": true, 00:13:11.813 "reset": true, 00:13:11.813 "nvme_admin": false, 00:13:11.813 "nvme_io": false, 00:13:11.813 "nvme_io_md": false, 00:13:11.813 "write_zeroes": true, 00:13:11.813 "zcopy": true, 00:13:11.813 "get_zone_info": false, 00:13:11.813 "zone_management": false, 00:13:11.813 "zone_append": false, 00:13:11.813 "compare": false, 00:13:11.813 "compare_and_write": false, 00:13:11.813 "abort": true, 00:13:11.813 "seek_hole": false, 00:13:11.813 "seek_data": false, 00:13:11.813 "copy": true, 00:13:11.813 "nvme_iov_md": false 00:13:11.813 }, 00:13:11.813 "memory_domains": [ 00:13:11.813 { 00:13:11.813 "dma_device_id": "system", 00:13:11.813 "dma_device_type": 1 00:13:11.813 }, 00:13:11.813 { 00:13:11.813 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:11.813 "dma_device_type": 2 00:13:11.813 } 00:13:11.813 ], 00:13:11.813 "driver_specific": { 00:13:11.813 "passthru": { 00:13:11.813 "name": "pt1", 00:13:11.813 "base_bdev_name": "malloc1" 00:13:11.813 } 00:13:11.813 } 00:13:11.813 }' 00:13:11.813 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:12.093 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:13:12.093 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:12.093 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:12.093 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:12.093 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:12.093 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:12.093 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:12.093 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:12.093 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:12.093 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:12.359 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:12.359 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:12.359 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:12.359 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:12.359 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:12.359 "name": "pt2", 00:13:12.359 "aliases": [ 00:13:12.359 "00000000-0000-0000-0000-000000000002" 00:13:12.359 ], 00:13:12.359 "product_name": "passthru", 00:13:12.359 "block_size": 512, 00:13:12.359 "num_blocks": 65536, 00:13:12.359 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:12.359 "assigned_rate_limits": { 00:13:12.359 "rw_ios_per_sec": 0, 00:13:12.359 "rw_mbytes_per_sec": 0, 00:13:12.359 "r_mbytes_per_sec": 0, 00:13:12.359 "w_mbytes_per_sec": 0 00:13:12.359 }, 00:13:12.359 "claimed": true, 00:13:12.359 "claim_type": "exclusive_write", 00:13:12.359 "zoned": false, 00:13:12.359 "supported_io_types": { 00:13:12.359 "read": true, 00:13:12.359 "write": true, 00:13:12.359 "unmap": true, 00:13:12.359 "flush": true, 00:13:12.359 "reset": true, 00:13:12.359 "nvme_admin": false, 00:13:12.359 "nvme_io": false, 00:13:12.359 "nvme_io_md": false, 00:13:12.359 "write_zeroes": true, 00:13:12.359 "zcopy": true, 00:13:12.359 "get_zone_info": false, 00:13:12.359 "zone_management": false, 00:13:12.359 "zone_append": false, 00:13:12.359 "compare": false, 00:13:12.359 "compare_and_write": false, 00:13:12.359 "abort": true, 00:13:12.359 "seek_hole": false, 00:13:12.359 "seek_data": false, 00:13:12.359 "copy": true, 00:13:12.359 "nvme_iov_md": false 00:13:12.359 }, 00:13:12.359 "memory_domains": [ 00:13:12.359 { 00:13:12.359 "dma_device_id": "system", 00:13:12.359 "dma_device_type": 1 00:13:12.359 }, 00:13:12.359 { 00:13:12.359 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:12.359 "dma_device_type": 2 00:13:12.359 } 00:13:12.359 ], 00:13:12.359 "driver_specific": { 00:13:12.359 "passthru": { 00:13:12.359 "name": "pt2", 00:13:12.359 "base_bdev_name": "malloc2" 00:13:12.359 } 00:13:12.359 } 00:13:12.359 }' 00:13:12.359 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:12.618 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:12.619 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:12.619 17:25:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:12.619 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:12.619 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:12.619 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:12.619 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:12.619 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:12.619 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:12.878 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:12.878 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:12.878 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:12.878 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:13:12.878 17:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:12.878 17:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:12.878 "name": "pt3", 00:13:12.878 "aliases": [ 00:13:12.878 "00000000-0000-0000-0000-000000000003" 00:13:12.878 ], 00:13:12.878 "product_name": "passthru", 00:13:12.878 "block_size": 512, 00:13:12.878 "num_blocks": 65536, 00:13:12.878 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:12.879 "assigned_rate_limits": { 00:13:12.879 "rw_ios_per_sec": 0, 00:13:12.879 "rw_mbytes_per_sec": 0, 00:13:12.879 "r_mbytes_per_sec": 0, 00:13:12.879 "w_mbytes_per_sec": 0 00:13:12.879 }, 00:13:12.879 "claimed": true, 00:13:12.879 "claim_type": "exclusive_write", 00:13:12.879 "zoned": false, 00:13:12.879 "supported_io_types": { 00:13:12.879 "read": true, 00:13:12.879 "write": true, 00:13:12.879 "unmap": true, 00:13:12.879 "flush": true, 00:13:12.879 "reset": true, 00:13:12.879 "nvme_admin": false, 00:13:12.879 "nvme_io": false, 00:13:12.879 "nvme_io_md": false, 00:13:12.879 "write_zeroes": true, 00:13:12.879 "zcopy": true, 00:13:12.879 "get_zone_info": false, 00:13:12.879 "zone_management": false, 00:13:12.879 "zone_append": false, 00:13:12.879 "compare": false, 00:13:12.879 "compare_and_write": false, 00:13:12.879 "abort": true, 00:13:12.879 "seek_hole": false, 00:13:12.879 "seek_data": false, 00:13:12.879 "copy": true, 00:13:12.879 "nvme_iov_md": false 00:13:12.879 }, 00:13:12.879 "memory_domains": [ 00:13:12.879 { 00:13:12.879 "dma_device_id": "system", 00:13:12.879 "dma_device_type": 1 00:13:12.879 }, 00:13:12.879 { 00:13:12.879 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:12.879 "dma_device_type": 2 00:13:12.879 } 00:13:12.879 ], 00:13:12.879 "driver_specific": { 00:13:12.879 "passthru": { 00:13:12.879 "name": "pt3", 00:13:12.879 "base_bdev_name": "malloc3" 00:13:12.879 } 00:13:12.879 } 00:13:12.879 }' 00:13:12.879 17:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:13.138 17:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:13.138 17:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:13.138 17:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:13.138 17:25:24 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:13.138 17:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:13.138 17:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:13.138 17:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:13.138 17:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:13.138 17:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:13.138 17:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:13.398 17:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:13.398 17:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:13.398 17:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:13:13.398 [2024-07-15 17:25:24.652475] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:13.399 17:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 2f88ac5e-7064-40c5-976c-fee9cd641f5e '!=' 2f88ac5e-7064-40c5-976c-fee9cd641f5e ']' 00:13:13.399 17:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:13:13.399 17:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:13.399 17:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:13.399 17:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2772528 00:13:13.399 17:25:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2772528 ']' 00:13:13.399 17:25:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2772528 00:13:13.399 17:25:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:13:13.399 17:25:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:13.399 17:25:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2772528 00:13:13.659 17:25:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:13.660 17:25:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:13.660 17:25:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2772528' 00:13:13.660 killing process with pid 2772528 00:13:13.660 17:25:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2772528 00:13:13.660 [2024-07-15 17:25:24.722956] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:13.660 [2024-07-15 17:25:24.722993] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:13.660 [2024-07-15 17:25:24.723029] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:13.660 [2024-07-15 17:25:24.723035] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2288100 name raid_bdev1, state offline 00:13:13.660 17:25:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2772528 00:13:13.660 [2024-07-15 17:25:24.737962] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:13.660 17:25:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:13:13.660 00:13:13.660 real 0m11.775s 00:13:13.660 user 0m21.641s 00:13:13.660 sys 0m1.781s 00:13:13.660 17:25:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:13.660 17:25:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:13.660 ************************************ 00:13:13.660 END TEST raid_superblock_test 00:13:13.660 ************************************ 00:13:13.660 17:25:24 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:13.660 17:25:24 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:13:13.660 17:25:24 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:13.660 17:25:24 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:13.660 17:25:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:13.660 ************************************ 00:13:13.660 START TEST raid_read_error_test 00:13:13.660 ************************************ 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 read 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:13:13.660 17:25:24 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.l6GpZWGStS 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2774946 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2774946 /var/tmp/spdk-raid.sock 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2774946 ']' 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:13.660 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:13.660 17:25:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:13.921 [2024-07-15 17:25:24.998537] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:13:13.921 [2024-07-15 17:25:24.998590] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2774946 ] 00:13:13.921 [2024-07-15 17:25:25.086755] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:13.921 [2024-07-15 17:25:25.154538] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:13.921 [2024-07-15 17:25:25.195626] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:13.921 [2024-07-15 17:25:25.195651] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:14.859 17:25:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:14.859 17:25:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:14.859 17:25:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:14.859 17:25:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:14.859 BaseBdev1_malloc 00:13:14.859 17:25:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:15.117 true 00:13:15.117 17:25:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:15.117 [2024-07-15 17:25:26.358253] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:15.117 [2024-07-15 17:25:26.358283] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:15.118 [2024-07-15 17:25:26.358294] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x121fb50 00:13:15.118 [2024-07-15 17:25:26.358300] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:15.118 [2024-07-15 17:25:26.359602] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:15.118 [2024-07-15 17:25:26.359626] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:15.118 BaseBdev1 00:13:15.118 17:25:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:15.118 17:25:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:15.377 BaseBdev2_malloc 00:13:15.377 17:25:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:15.636 true 00:13:15.636 17:25:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:15.636 [2024-07-15 17:25:26.917610] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:15.636 [2024-07-15 17:25:26.917638] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:15.637 [2024-07-15 17:25:26.917649] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1203ea0 00:13:15.637 [2024-07-15 17:25:26.917655] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:15.637 [2024-07-15 17:25:26.918844] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:15.637 [2024-07-15 17:25:26.918863] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:15.637 BaseBdev2 00:13:15.637 17:25:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:15.637 17:25:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:13:15.896 BaseBdev3_malloc 00:13:15.896 17:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:13:16.156 true 00:13:16.156 17:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:13:16.416 [2024-07-15 17:25:27.484933] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:13:16.416 [2024-07-15 17:25:27.484960] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:16.416 [2024-07-15 17:25:27.484971] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1207fb0 00:13:16.416 [2024-07-15 17:25:27.484977] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:16.416 [2024-07-15 17:25:27.486143] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:16.416 [2024-07-15 17:25:27.486161] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:13:16.416 BaseBdev3 00:13:16.416 17:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:13:16.416 [2024-07-15 17:25:27.689469] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:16.416 [2024-07-15 17:25:27.690498] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:16.416 [2024-07-15 17:25:27.690549] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:16.416 [2024-07-15 17:25:27.690701] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12090e0 00:13:16.416 [2024-07-15 17:25:27.690715] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:16.416 [2024-07-15 17:25:27.690857] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x106b210 00:13:16.416 [2024-07-15 17:25:27.690970] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12090e0 00:13:16.416 [2024-07-15 17:25:27.690976] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12090e0 00:13:16.416 [2024-07-15 17:25:27.691054] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:16.416 
17:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:16.416 17:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:16.416 17:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:16.416 17:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:16.416 17:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:16.416 17:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:16.416 17:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:16.416 17:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:16.416 17:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:16.416 17:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:16.416 17:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:16.416 17:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:16.677 17:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:16.677 "name": "raid_bdev1", 00:13:16.677 "uuid": "ae1e788f-efd4-46f5-8c77-86e7f073b17f", 00:13:16.677 "strip_size_kb": 64, 00:13:16.677 "state": "online", 00:13:16.677 "raid_level": "raid0", 00:13:16.677 "superblock": true, 00:13:16.677 "num_base_bdevs": 3, 00:13:16.677 "num_base_bdevs_discovered": 3, 00:13:16.677 "num_base_bdevs_operational": 3, 00:13:16.677 "base_bdevs_list": [ 00:13:16.677 { 00:13:16.677 "name": "BaseBdev1", 00:13:16.677 "uuid": "7fe1aab2-87e2-55fe-94cd-d8bac9535435", 00:13:16.677 "is_configured": true, 00:13:16.677 "data_offset": 2048, 00:13:16.677 "data_size": 63488 00:13:16.677 }, 00:13:16.677 { 00:13:16.677 "name": "BaseBdev2", 00:13:16.677 "uuid": "c8372017-6248-5b9a-9a3e-413538e9279c", 00:13:16.677 "is_configured": true, 00:13:16.677 "data_offset": 2048, 00:13:16.677 "data_size": 63488 00:13:16.677 }, 00:13:16.677 { 00:13:16.677 "name": "BaseBdev3", 00:13:16.677 "uuid": "bfe51a4c-5c95-593b-a589-0aaed411e640", 00:13:16.677 "is_configured": true, 00:13:16.677 "data_offset": 2048, 00:13:16.677 "data_size": 63488 00:13:16.677 } 00:13:16.677 ] 00:13:16.677 }' 00:13:16.677 17:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:16.677 17:25:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:17.247 17:25:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:17.247 17:25:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:17.247 [2024-07-15 17:25:28.499734] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1208da0 00:13:18.189 17:25:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:13:18.449 17:25:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local 
expected_num_base_bdevs 00:13:18.449 17:25:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:13:18.449 17:25:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:13:18.449 17:25:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:18.449 17:25:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:18.449 17:25:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:18.449 17:25:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:18.449 17:25:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:18.449 17:25:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:18.449 17:25:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:18.449 17:25:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:18.449 17:25:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:18.449 17:25:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:18.449 17:25:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:18.449 17:25:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:18.710 17:25:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:18.710 "name": "raid_bdev1", 00:13:18.710 "uuid": "ae1e788f-efd4-46f5-8c77-86e7f073b17f", 00:13:18.710 "strip_size_kb": 64, 00:13:18.710 "state": "online", 00:13:18.710 "raid_level": "raid0", 00:13:18.710 "superblock": true, 00:13:18.710 "num_base_bdevs": 3, 00:13:18.710 "num_base_bdevs_discovered": 3, 00:13:18.710 "num_base_bdevs_operational": 3, 00:13:18.710 "base_bdevs_list": [ 00:13:18.710 { 00:13:18.710 "name": "BaseBdev1", 00:13:18.710 "uuid": "7fe1aab2-87e2-55fe-94cd-d8bac9535435", 00:13:18.710 "is_configured": true, 00:13:18.710 "data_offset": 2048, 00:13:18.710 "data_size": 63488 00:13:18.710 }, 00:13:18.710 { 00:13:18.710 "name": "BaseBdev2", 00:13:18.710 "uuid": "c8372017-6248-5b9a-9a3e-413538e9279c", 00:13:18.710 "is_configured": true, 00:13:18.710 "data_offset": 2048, 00:13:18.710 "data_size": 63488 00:13:18.710 }, 00:13:18.710 { 00:13:18.710 "name": "BaseBdev3", 00:13:18.710 "uuid": "bfe51a4c-5c95-593b-a589-0aaed411e640", 00:13:18.710 "is_configured": true, 00:13:18.710 "data_offset": 2048, 00:13:18.710 "data_size": 63488 00:13:18.710 } 00:13:18.710 ] 00:13:18.710 }' 00:13:18.710 17:25:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:18.710 17:25:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:19.300 17:25:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:19.300 [2024-07-15 17:25:30.534678] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:19.300 [2024-07-15 17:25:30.534708] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:19.300 [2024-07-15 
17:25:30.537293] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:19.300 [2024-07-15 17:25:30.537319] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:19.300 [2024-07-15 17:25:30.537342] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:19.300 [2024-07-15 17:25:30.537348] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12090e0 name raid_bdev1, state offline 00:13:19.300 0 00:13:19.300 17:25:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2774946 00:13:19.300 17:25:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2774946 ']' 00:13:19.300 17:25:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2774946 00:13:19.300 17:25:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:13:19.300 17:25:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:19.300 17:25:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2774946 00:13:19.560 17:25:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:19.560 17:25:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:19.560 17:25:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2774946' 00:13:19.560 killing process with pid 2774946 00:13:19.560 17:25:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2774946 00:13:19.560 [2024-07-15 17:25:30.604118] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:19.560 17:25:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2774946 00:13:19.560 [2024-07-15 17:25:30.615290] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:19.560 17:25:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.l6GpZWGStS 00:13:19.560 17:25:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:19.560 17:25:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:19.560 17:25:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:13:19.560 17:25:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:13:19.560 17:25:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:19.560 17:25:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:19.560 17:25:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:13:19.560 00:13:19.560 real 0m5.818s 00:13:19.560 user 0m9.235s 00:13:19.560 sys 0m0.847s 00:13:19.560 17:25:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:19.560 17:25:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:19.560 ************************************ 00:13:19.560 END TEST raid_read_error_test 00:13:19.560 ************************************ 00:13:19.560 17:25:30 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:19.560 17:25:30 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:13:19.560 17:25:30 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:19.560 
17:25:30 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:19.560 17:25:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:19.560 ************************************ 00:13:19.560 START TEST raid_write_error_test 00:13:19.560 ************************************ 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 write 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.npQ01Ggwkm 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2775962 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2775962 /var/tmp/spdk-raid.sock 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r 
/var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:19.560 17:25:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2775962 ']' 00:13:19.561 17:25:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:19.561 17:25:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:19.561 17:25:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:19.561 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:19.561 17:25:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:19.561 17:25:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:19.820 [2024-07-15 17:25:30.895281] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:13:19.820 [2024-07-15 17:25:30.895345] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2775962 ] 00:13:19.820 [2024-07-15 17:25:30.988149] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:19.820 [2024-07-15 17:25:31.066440] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:19.820 [2024-07-15 17:25:31.107853] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:19.820 [2024-07-15 17:25:31.107891] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:20.758 17:25:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:20.758 17:25:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:20.758 17:25:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:20.758 17:25:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:20.758 BaseBdev1_malloc 00:13:20.758 17:25:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:21.018 true 00:13:21.018 17:25:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:21.018 [2024-07-15 17:25:32.299396] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:21.018 [2024-07-15 17:25:32.299428] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:21.018 [2024-07-15 17:25:32.299438] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x143fb50 00:13:21.018 [2024-07-15 17:25:32.299445] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:21.018 [2024-07-15 17:25:32.300730] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:21.018 [2024-07-15 17:25:32.300749] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev1 00:13:21.018 BaseBdev1 00:13:21.277 17:25:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:21.277 17:25:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:21.277 BaseBdev2_malloc 00:13:21.277 17:25:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:21.537 true 00:13:21.537 17:25:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:21.797 [2024-07-15 17:25:32.858483] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:21.797 [2024-07-15 17:25:32.858512] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:21.797 [2024-07-15 17:25:32.858523] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1423ea0 00:13:21.797 [2024-07-15 17:25:32.858529] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:21.797 [2024-07-15 17:25:32.859670] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:21.797 [2024-07-15 17:25:32.859689] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:21.797 BaseBdev2 00:13:21.797 17:25:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:21.797 17:25:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:13:21.797 BaseBdev3_malloc 00:13:21.797 17:25:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:13:22.056 true 00:13:22.056 17:25:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:13:22.317 [2024-07-15 17:25:33.413621] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:13:22.317 [2024-07-15 17:25:33.413650] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:22.317 [2024-07-15 17:25:33.413661] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1427fb0 00:13:22.317 [2024-07-15 17:25:33.413668] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:22.317 [2024-07-15 17:25:33.414873] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:22.317 [2024-07-15 17:25:33.414893] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:13:22.317 BaseBdev3 00:13:22.317 17:25:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:13:22.317 [2024-07-15 17:25:33.602123] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:22.317 [2024-07-15 17:25:33.603161] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:22.317 [2024-07-15 17:25:33.603216] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:22.317 [2024-07-15 17:25:33.603368] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14290e0 00:13:22.317 [2024-07-15 17:25:33.603376] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:22.317 [2024-07-15 17:25:33.603525] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x128b210 00:13:22.317 [2024-07-15 17:25:33.603640] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14290e0 00:13:22.317 [2024-07-15 17:25:33.603646] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14290e0 00:13:22.317 [2024-07-15 17:25:33.603728] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:22.577 17:25:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:22.577 17:25:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:22.577 17:25:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:22.577 17:25:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:22.577 17:25:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:22.577 17:25:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:22.577 17:25:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:22.577 17:25:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:22.577 17:25:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:22.577 17:25:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:22.577 17:25:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:22.577 17:25:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:22.577 17:25:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:22.577 "name": "raid_bdev1", 00:13:22.577 "uuid": "22194cdb-446b-4b51-9844-55735175d96c", 00:13:22.577 "strip_size_kb": 64, 00:13:22.577 "state": "online", 00:13:22.577 "raid_level": "raid0", 00:13:22.577 "superblock": true, 00:13:22.577 "num_base_bdevs": 3, 00:13:22.577 "num_base_bdevs_discovered": 3, 00:13:22.577 "num_base_bdevs_operational": 3, 00:13:22.577 "base_bdevs_list": [ 00:13:22.577 { 00:13:22.577 "name": "BaseBdev1", 00:13:22.577 "uuid": "68330629-bd88-58e2-aaa3-d5d3eda503e4", 00:13:22.577 "is_configured": true, 00:13:22.577 "data_offset": 2048, 00:13:22.577 "data_size": 63488 00:13:22.577 }, 00:13:22.577 { 00:13:22.577 "name": "BaseBdev2", 00:13:22.577 "uuid": "17c3791d-512e-5542-b7d8-37cb8cbca614", 00:13:22.577 "is_configured": true, 00:13:22.577 "data_offset": 2048, 00:13:22.577 "data_size": 63488 00:13:22.577 }, 00:13:22.577 { 00:13:22.577 "name": "BaseBdev3", 00:13:22.577 "uuid": 
"6bdc5291-40a9-5a97-abb4-035be0c003eb", 00:13:22.577 "is_configured": true, 00:13:22.577 "data_offset": 2048, 00:13:22.577 "data_size": 63488 00:13:22.577 } 00:13:22.577 ] 00:13:22.577 }' 00:13:22.577 17:25:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:22.577 17:25:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:23.174 17:25:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:23.174 17:25:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:23.174 [2024-07-15 17:25:34.468517] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1428da0 00:13:24.113 17:25:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:13:24.375 17:25:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:24.375 17:25:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:13:24.375 17:25:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:13:24.375 17:25:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:24.375 17:25:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:24.375 17:25:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:24.376 17:25:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:24.376 17:25:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:24.376 17:25:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:24.376 17:25:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:24.376 17:25:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:24.376 17:25:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:24.376 17:25:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:24.376 17:25:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.376 17:25:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:24.636 17:25:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:24.636 "name": "raid_bdev1", 00:13:24.636 "uuid": "22194cdb-446b-4b51-9844-55735175d96c", 00:13:24.636 "strip_size_kb": 64, 00:13:24.636 "state": "online", 00:13:24.636 "raid_level": "raid0", 00:13:24.636 "superblock": true, 00:13:24.636 "num_base_bdevs": 3, 00:13:24.636 "num_base_bdevs_discovered": 3, 00:13:24.636 "num_base_bdevs_operational": 3, 00:13:24.636 "base_bdevs_list": [ 00:13:24.636 { 00:13:24.636 "name": "BaseBdev1", 00:13:24.636 "uuid": "68330629-bd88-58e2-aaa3-d5d3eda503e4", 00:13:24.636 "is_configured": true, 00:13:24.636 "data_offset": 2048, 00:13:24.637 "data_size": 63488 00:13:24.637 }, 00:13:24.637 { 
00:13:24.637 "name": "BaseBdev2", 00:13:24.637 "uuid": "17c3791d-512e-5542-b7d8-37cb8cbca614", 00:13:24.637 "is_configured": true, 00:13:24.637 "data_offset": 2048, 00:13:24.637 "data_size": 63488 00:13:24.637 }, 00:13:24.637 { 00:13:24.637 "name": "BaseBdev3", 00:13:24.637 "uuid": "6bdc5291-40a9-5a97-abb4-035be0c003eb", 00:13:24.637 "is_configured": true, 00:13:24.637 "data_offset": 2048, 00:13:24.637 "data_size": 63488 00:13:24.637 } 00:13:24.637 ] 00:13:24.637 }' 00:13:24.637 17:25:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:24.637 17:25:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:25.206 17:25:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:25.467 [2024-07-15 17:25:36.506183] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:25.467 [2024-07-15 17:25:36.506212] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:25.467 [2024-07-15 17:25:36.508866] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:25.467 [2024-07-15 17:25:36.508893] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:25.467 [2024-07-15 17:25:36.508917] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:25.467 [2024-07-15 17:25:36.508923] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14290e0 name raid_bdev1, state offline 00:13:25.467 0 00:13:25.467 17:25:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2775962 00:13:25.467 17:25:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2775962 ']' 00:13:25.467 17:25:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2775962 00:13:25.467 17:25:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:13:25.467 17:25:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:25.467 17:25:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2775962 00:13:25.467 17:25:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:25.467 17:25:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:25.467 17:25:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2775962' 00:13:25.467 killing process with pid 2775962 00:13:25.467 17:25:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2775962 00:13:25.467 [2024-07-15 17:25:36.592352] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:25.467 17:25:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2775962 00:13:25.467 [2024-07-15 17:25:36.603515] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:25.467 17:25:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.npQ01Ggwkm 00:13:25.467 17:25:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:25.467 17:25:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:25.467 17:25:36 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:13:25.467 17:25:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:13:25.467 17:25:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:25.467 17:25:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:25.467 17:25:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:13:25.467 00:13:25.467 real 0m5.910s 00:13:25.467 user 0m9.404s 00:13:25.467 sys 0m0.863s 00:13:25.467 17:25:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:25.467 17:25:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:25.467 ************************************ 00:13:25.467 END TEST raid_write_error_test 00:13:25.467 ************************************ 00:13:25.728 17:25:36 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:25.728 17:25:36 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:13:25.728 17:25:36 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:13:25.728 17:25:36 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:25.728 17:25:36 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:25.728 17:25:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:25.728 ************************************ 00:13:25.728 START TEST raid_state_function_test 00:13:25.728 ************************************ 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 false 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 
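Both error tests traced above (raid_read_error_test and raid_write_error_test) exercise the same RPC sequence against a bdevperf process that was launched with -z, so the workload only starts once bdevperf.py sends perform_tests. The sketch below restates that sequence as plain shell, using only the rpc.py calls, paths and names that appear in the trace; the bdevperf launch itself and the grep/awk post-processing of the bdevperf log are omitted, and the backgrounding/teardown is simplified.

    # Condensed sketch of the error-injection flow shown above (read case; the write
    # test only changes the injected I/O type). Paths and names are copied from the trace.
    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    for i in 1 2 3; do
        $RPC bdev_malloc_create 32 512 -b "BaseBdev${i}_malloc"        # 32 MB malloc bdev, 512-byte blocks
        $RPC bdev_error_create "BaseBdev${i}_malloc"                   # error bdev; shows up as EE_BaseBdev${i}_malloc in the trace
        $RPC bdev_passthru_create -b "EE_BaseBdev${i}_malloc" -p "BaseBdev${i}"
    done
    $RPC bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s
    /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py \
        -s /var/tmp/spdk-raid.sock perform_tests &                     # kick off the preconfigured I/O job
    sleep 1
    $RPC bdev_error_inject_error EE_BaseBdev1_malloc read failure      # inject failures while I/O runs
    wait                                                               # wait for perform_tests to finish
    $RPC bdev_raid_delete raid_bdev1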
00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2777002 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2777002' 00:13:25.728 Process raid pid: 2777002 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2777002 /var/tmp/spdk-raid.sock 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2777002 ']' 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:25.728 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:25.728 17:25:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:25.728 [2024-07-15 17:25:36.878925] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
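The raid_state_function_test starting here drives a bare bdev_svc app over its own RPC socket rather than bdevperf. Its central check, visible in the traces that follow, is that a raid volume created before its base bdevs exist is held in the "configuring" state and only switches to "online" once every base bdev has been registered and claimed. A minimal sketch of that behaviour, compressing the script's stepwise delete/re-create cycles into one pass (the rpc.py calls and the jq filter are the ones used in the trace; the trailing .state projection is an illustrative addition):

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $RPC bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
    # None of the base bdevs exist yet, so the query reports "configuring":
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'
    # Register the base bdevs; each one is claimed as it appears, and after the third
    # the same query reports "online":
    for i in 1 2 3; do $RPC bdev_malloc_create 32 512 -b "BaseBdev${i}"; done
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'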
00:13:25.728 [2024-07-15 17:25:36.879005] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:25.728 [2024-07-15 17:25:36.972034] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:25.988 [2024-07-15 17:25:37.040857] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:25.988 [2024-07-15 17:25:37.087808] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:25.988 [2024-07-15 17:25:37.087832] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:26.559 17:25:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:26.559 17:25:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:13:26.560 17:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:26.820 [2024-07-15 17:25:37.871313] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:26.820 [2024-07-15 17:25:37.871339] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:26.820 [2024-07-15 17:25:37.871346] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:26.820 [2024-07-15 17:25:37.871352] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:26.820 [2024-07-15 17:25:37.871356] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:26.820 [2024-07-15 17:25:37.871362] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:26.820 17:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:26.820 17:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:26.820 17:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:26.820 17:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:26.820 17:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:26.820 17:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:26.820 17:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:26.820 17:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:26.820 17:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:26.820 17:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:26.820 17:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:26.820 17:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:26.820 17:25:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 
-- # raid_bdev_info='{ 00:13:26.820 "name": "Existed_Raid", 00:13:26.820 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:26.820 "strip_size_kb": 64, 00:13:26.820 "state": "configuring", 00:13:26.820 "raid_level": "concat", 00:13:26.820 "superblock": false, 00:13:26.820 "num_base_bdevs": 3, 00:13:26.820 "num_base_bdevs_discovered": 0, 00:13:26.820 "num_base_bdevs_operational": 3, 00:13:26.820 "base_bdevs_list": [ 00:13:26.820 { 00:13:26.820 "name": "BaseBdev1", 00:13:26.820 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:26.820 "is_configured": false, 00:13:26.820 "data_offset": 0, 00:13:26.820 "data_size": 0 00:13:26.820 }, 00:13:26.820 { 00:13:26.820 "name": "BaseBdev2", 00:13:26.820 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:26.820 "is_configured": false, 00:13:26.820 "data_offset": 0, 00:13:26.820 "data_size": 0 00:13:26.820 }, 00:13:26.820 { 00:13:26.820 "name": "BaseBdev3", 00:13:26.820 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:26.820 "is_configured": false, 00:13:26.820 "data_offset": 0, 00:13:26.820 "data_size": 0 00:13:26.820 } 00:13:26.820 ] 00:13:26.820 }' 00:13:26.820 17:25:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:26.820 17:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:27.389 17:25:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:27.649 [2024-07-15 17:25:38.777489] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:27.649 [2024-07-15 17:25:38.777506] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25b26d0 name Existed_Raid, state configuring 00:13:27.649 17:25:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:27.908 [2024-07-15 17:25:38.966070] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:27.908 [2024-07-15 17:25:38.966090] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:27.908 [2024-07-15 17:25:38.966096] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:27.908 [2024-07-15 17:25:38.966102] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:27.908 [2024-07-15 17:25:38.966106] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:27.908 [2024-07-15 17:25:38.966111] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:27.908 17:25:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:27.908 [2024-07-15 17:25:39.161290] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:27.908 BaseBdev1 00:13:27.908 17:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:27.908 17:25:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:27.908 17:25:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 
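The waitforbdev helper traced next blocks until the freshly created malloc bdev is actually registered, and the verify_raid_bdev_state checks then read the discovery counters back out of bdev_raid_get_bdevs. A minimal equivalent of those two probes, reusing the calls that appear in the trace (the 2000 value is the bdev_timeout used below, and the jq projection of the counters is an illustrative addition):

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $RPC bdev_wait_for_examine                              # let pending examine callbacks settle first
    $RPC bdev_get_bdevs -b BaseBdev1 -t 2000 > /dev/null    # same probe waitforbdev issues for BaseBdev1
    # Discovered vs. operational base bdevs, plus the resulting raid state:
    $RPC bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "Existed_Raid") | "\(.num_base_bdevs_discovered)/\(.num_base_bdevs_operational) \(.state)"'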
00:13:27.908 17:25:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:27.908 17:25:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:27.908 17:25:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:27.908 17:25:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:28.168 17:25:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:28.428 [ 00:13:28.428 { 00:13:28.428 "name": "BaseBdev1", 00:13:28.428 "aliases": [ 00:13:28.428 "cb9717e2-66ac-4de7-a8fc-784e9f68c2a9" 00:13:28.428 ], 00:13:28.428 "product_name": "Malloc disk", 00:13:28.428 "block_size": 512, 00:13:28.428 "num_blocks": 65536, 00:13:28.428 "uuid": "cb9717e2-66ac-4de7-a8fc-784e9f68c2a9", 00:13:28.428 "assigned_rate_limits": { 00:13:28.428 "rw_ios_per_sec": 0, 00:13:28.428 "rw_mbytes_per_sec": 0, 00:13:28.428 "r_mbytes_per_sec": 0, 00:13:28.428 "w_mbytes_per_sec": 0 00:13:28.428 }, 00:13:28.428 "claimed": true, 00:13:28.428 "claim_type": "exclusive_write", 00:13:28.428 "zoned": false, 00:13:28.428 "supported_io_types": { 00:13:28.428 "read": true, 00:13:28.428 "write": true, 00:13:28.428 "unmap": true, 00:13:28.428 "flush": true, 00:13:28.428 "reset": true, 00:13:28.428 "nvme_admin": false, 00:13:28.428 "nvme_io": false, 00:13:28.428 "nvme_io_md": false, 00:13:28.428 "write_zeroes": true, 00:13:28.428 "zcopy": true, 00:13:28.428 "get_zone_info": false, 00:13:28.428 "zone_management": false, 00:13:28.428 "zone_append": false, 00:13:28.428 "compare": false, 00:13:28.428 "compare_and_write": false, 00:13:28.428 "abort": true, 00:13:28.428 "seek_hole": false, 00:13:28.428 "seek_data": false, 00:13:28.428 "copy": true, 00:13:28.428 "nvme_iov_md": false 00:13:28.428 }, 00:13:28.428 "memory_domains": [ 00:13:28.428 { 00:13:28.428 "dma_device_id": "system", 00:13:28.428 "dma_device_type": 1 00:13:28.428 }, 00:13:28.428 { 00:13:28.428 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:28.428 "dma_device_type": 2 00:13:28.428 } 00:13:28.428 ], 00:13:28.428 "driver_specific": {} 00:13:28.428 } 00:13:28.428 ] 00:13:28.428 17:25:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:28.428 17:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:28.428 17:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:28.428 17:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:28.428 17:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:28.428 17:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:28.428 17:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:28.428 17:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:28.428 17:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:28.428 17:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:13:28.428 17:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:28.428 17:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:28.428 17:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:28.688 17:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:28.688 "name": "Existed_Raid", 00:13:28.688 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:28.688 "strip_size_kb": 64, 00:13:28.688 "state": "configuring", 00:13:28.688 "raid_level": "concat", 00:13:28.688 "superblock": false, 00:13:28.688 "num_base_bdevs": 3, 00:13:28.688 "num_base_bdevs_discovered": 1, 00:13:28.688 "num_base_bdevs_operational": 3, 00:13:28.688 "base_bdevs_list": [ 00:13:28.688 { 00:13:28.688 "name": "BaseBdev1", 00:13:28.688 "uuid": "cb9717e2-66ac-4de7-a8fc-784e9f68c2a9", 00:13:28.688 "is_configured": true, 00:13:28.688 "data_offset": 0, 00:13:28.688 "data_size": 65536 00:13:28.688 }, 00:13:28.688 { 00:13:28.688 "name": "BaseBdev2", 00:13:28.688 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:28.688 "is_configured": false, 00:13:28.688 "data_offset": 0, 00:13:28.688 "data_size": 0 00:13:28.688 }, 00:13:28.688 { 00:13:28.688 "name": "BaseBdev3", 00:13:28.688 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:28.688 "is_configured": false, 00:13:28.688 "data_offset": 0, 00:13:28.688 "data_size": 0 00:13:28.688 } 00:13:28.688 ] 00:13:28.688 }' 00:13:28.688 17:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:28.688 17:25:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:29.259 17:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:29.259 [2024-07-15 17:25:40.460589] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:29.259 [2024-07-15 17:25:40.460617] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25b1fa0 name Existed_Raid, state configuring 00:13:29.259 17:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:29.520 [2024-07-15 17:25:40.645078] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:29.520 [2024-07-15 17:25:40.646210] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:29.520 [2024-07-15 17:25:40.646234] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:29.520 [2024-07-15 17:25:40.646244] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:29.520 [2024-07-15 17:25:40.646250] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:29.520 17:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:29.520 17:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:29.520 17:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:29.520 17:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:29.520 17:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:29.520 17:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:29.520 17:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:29.520 17:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:29.520 17:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:29.520 17:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:29.520 17:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:29.520 17:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:29.520 17:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:29.520 17:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:29.781 17:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:29.781 "name": "Existed_Raid", 00:13:29.781 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:29.781 "strip_size_kb": 64, 00:13:29.781 "state": "configuring", 00:13:29.781 "raid_level": "concat", 00:13:29.781 "superblock": false, 00:13:29.781 "num_base_bdevs": 3, 00:13:29.781 "num_base_bdevs_discovered": 1, 00:13:29.781 "num_base_bdevs_operational": 3, 00:13:29.781 "base_bdevs_list": [ 00:13:29.781 { 00:13:29.781 "name": "BaseBdev1", 00:13:29.781 "uuid": "cb9717e2-66ac-4de7-a8fc-784e9f68c2a9", 00:13:29.781 "is_configured": true, 00:13:29.781 "data_offset": 0, 00:13:29.781 "data_size": 65536 00:13:29.781 }, 00:13:29.782 { 00:13:29.782 "name": "BaseBdev2", 00:13:29.782 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:29.782 "is_configured": false, 00:13:29.782 "data_offset": 0, 00:13:29.782 "data_size": 0 00:13:29.782 }, 00:13:29.782 { 00:13:29.782 "name": "BaseBdev3", 00:13:29.782 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:29.782 "is_configured": false, 00:13:29.782 "data_offset": 0, 00:13:29.805 "data_size": 0 00:13:29.805 } 00:13:29.805 ] 00:13:29.805 }' 00:13:29.805 17:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:29.805 17:25:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:30.377 17:25:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:30.377 [2024-07-15 17:25:41.592224] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:30.377 BaseBdev2 00:13:30.377 17:25:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:30.377 17:25:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:30.377 17:25:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:30.377 17:25:41 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:30.377 17:25:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:30.377 17:25:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:30.377 17:25:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:30.637 17:25:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:30.897 [ 00:13:30.897 { 00:13:30.897 "name": "BaseBdev2", 00:13:30.897 "aliases": [ 00:13:30.897 "b5918aab-7bc2-4434-b7e1-49662f0a1282" 00:13:30.897 ], 00:13:30.897 "product_name": "Malloc disk", 00:13:30.897 "block_size": 512, 00:13:30.897 "num_blocks": 65536, 00:13:30.897 "uuid": "b5918aab-7bc2-4434-b7e1-49662f0a1282", 00:13:30.897 "assigned_rate_limits": { 00:13:30.897 "rw_ios_per_sec": 0, 00:13:30.897 "rw_mbytes_per_sec": 0, 00:13:30.897 "r_mbytes_per_sec": 0, 00:13:30.897 "w_mbytes_per_sec": 0 00:13:30.897 }, 00:13:30.897 "claimed": true, 00:13:30.897 "claim_type": "exclusive_write", 00:13:30.897 "zoned": false, 00:13:30.897 "supported_io_types": { 00:13:30.897 "read": true, 00:13:30.897 "write": true, 00:13:30.897 "unmap": true, 00:13:30.897 "flush": true, 00:13:30.897 "reset": true, 00:13:30.897 "nvme_admin": false, 00:13:30.897 "nvme_io": false, 00:13:30.897 "nvme_io_md": false, 00:13:30.897 "write_zeroes": true, 00:13:30.897 "zcopy": true, 00:13:30.897 "get_zone_info": false, 00:13:30.897 "zone_management": false, 00:13:30.897 "zone_append": false, 00:13:30.897 "compare": false, 00:13:30.897 "compare_and_write": false, 00:13:30.897 "abort": true, 00:13:30.897 "seek_hole": false, 00:13:30.897 "seek_data": false, 00:13:30.897 "copy": true, 00:13:30.897 "nvme_iov_md": false 00:13:30.897 }, 00:13:30.897 "memory_domains": [ 00:13:30.897 { 00:13:30.897 "dma_device_id": "system", 00:13:30.897 "dma_device_type": 1 00:13:30.897 }, 00:13:30.897 { 00:13:30.897 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:30.897 "dma_device_type": 2 00:13:30.897 } 00:13:30.897 ], 00:13:30.897 "driver_specific": {} 00:13:30.897 } 00:13:30.897 ] 00:13:30.897 17:25:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:30.898 17:25:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:30.898 17:25:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:30.898 17:25:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:30.898 17:25:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:30.898 17:25:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:30.898 17:25:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:30.898 17:25:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:30.898 17:25:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:30.898 17:25:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:30.898 
17:25:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:30.898 17:25:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:30.898 17:25:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:30.898 17:25:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:30.898 17:25:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:30.898 17:25:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:30.898 "name": "Existed_Raid", 00:13:30.898 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:30.898 "strip_size_kb": 64, 00:13:30.898 "state": "configuring", 00:13:30.898 "raid_level": "concat", 00:13:30.898 "superblock": false, 00:13:30.898 "num_base_bdevs": 3, 00:13:30.898 "num_base_bdevs_discovered": 2, 00:13:30.898 "num_base_bdevs_operational": 3, 00:13:30.898 "base_bdevs_list": [ 00:13:30.898 { 00:13:30.898 "name": "BaseBdev1", 00:13:30.898 "uuid": "cb9717e2-66ac-4de7-a8fc-784e9f68c2a9", 00:13:30.898 "is_configured": true, 00:13:30.898 "data_offset": 0, 00:13:30.898 "data_size": 65536 00:13:30.898 }, 00:13:30.898 { 00:13:30.898 "name": "BaseBdev2", 00:13:30.898 "uuid": "b5918aab-7bc2-4434-b7e1-49662f0a1282", 00:13:30.898 "is_configured": true, 00:13:30.898 "data_offset": 0, 00:13:30.898 "data_size": 65536 00:13:30.898 }, 00:13:30.898 { 00:13:30.898 "name": "BaseBdev3", 00:13:30.898 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:30.898 "is_configured": false, 00:13:30.898 "data_offset": 0, 00:13:30.898 "data_size": 0 00:13:30.898 } 00:13:30.898 ] 00:13:30.898 }' 00:13:30.898 17:25:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:30.898 17:25:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:31.468 17:25:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:31.729 [2024-07-15 17:25:42.904332] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:31.729 [2024-07-15 17:25:42.904357] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25b2e90 00:13:31.729 [2024-07-15 17:25:42.904362] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:31.729 [2024-07-15 17:25:42.904501] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25b2b60 00:13:31.729 [2024-07-15 17:25:42.904597] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25b2e90 00:13:31.729 [2024-07-15 17:25:42.904604] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x25b2e90 00:13:31.729 [2024-07-15 17:25:42.904725] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:31.729 BaseBdev3 00:13:31.729 17:25:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:31.729 17:25:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:31.729 17:25:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:31.729 17:25:42 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:31.729 17:25:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:31.729 17:25:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:31.729 17:25:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:31.989 17:25:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:32.249 [ 00:13:32.249 { 00:13:32.249 "name": "BaseBdev3", 00:13:32.249 "aliases": [ 00:13:32.249 "845b4538-7494-4348-8911-8ff75c1a17b4" 00:13:32.249 ], 00:13:32.249 "product_name": "Malloc disk", 00:13:32.249 "block_size": 512, 00:13:32.249 "num_blocks": 65536, 00:13:32.249 "uuid": "845b4538-7494-4348-8911-8ff75c1a17b4", 00:13:32.249 "assigned_rate_limits": { 00:13:32.249 "rw_ios_per_sec": 0, 00:13:32.249 "rw_mbytes_per_sec": 0, 00:13:32.249 "r_mbytes_per_sec": 0, 00:13:32.249 "w_mbytes_per_sec": 0 00:13:32.249 }, 00:13:32.249 "claimed": true, 00:13:32.249 "claim_type": "exclusive_write", 00:13:32.249 "zoned": false, 00:13:32.249 "supported_io_types": { 00:13:32.249 "read": true, 00:13:32.249 "write": true, 00:13:32.249 "unmap": true, 00:13:32.249 "flush": true, 00:13:32.249 "reset": true, 00:13:32.249 "nvme_admin": false, 00:13:32.249 "nvme_io": false, 00:13:32.249 "nvme_io_md": false, 00:13:32.249 "write_zeroes": true, 00:13:32.249 "zcopy": true, 00:13:32.249 "get_zone_info": false, 00:13:32.249 "zone_management": false, 00:13:32.249 "zone_append": false, 00:13:32.249 "compare": false, 00:13:32.249 "compare_and_write": false, 00:13:32.249 "abort": true, 00:13:32.249 "seek_hole": false, 00:13:32.249 "seek_data": false, 00:13:32.249 "copy": true, 00:13:32.249 "nvme_iov_md": false 00:13:32.249 }, 00:13:32.249 "memory_domains": [ 00:13:32.249 { 00:13:32.249 "dma_device_id": "system", 00:13:32.249 "dma_device_type": 1 00:13:32.249 }, 00:13:32.249 { 00:13:32.249 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:32.249 "dma_device_type": 2 00:13:32.249 } 00:13:32.249 ], 00:13:32.249 "driver_specific": {} 00:13:32.249 } 00:13:32.249 ] 00:13:32.249 17:25:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:32.249 17:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:32.249 17:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:32.249 17:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:13:32.249 17:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:32.249 17:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:32.249 17:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:32.249 17:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:32.249 17:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:32.249 17:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:32.249 17:25:43 
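Creating the third base bdev (bdev_malloc_create 32 512 -b BaseBdev3 in the trace above) is what flips Existed_Raid from configuring to online: raid_bdev_configure_cont reports blockcnt 196608 with blocklen 512, which is simply the sum of the three 65536-block malloc members (3 x 65536 = 196608) for a concat volume. A hand-driven sketch of the same step, again from the SPDK checkout used here:

  # 32 MiB malloc bdev with 512-byte blocks -> 65536 blocks, the last
  # member the concat array is waiting for.
  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3
  # The raid should now report "online" with 3 of 3 base bdevs discovered.
  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
      | jq -r '.[] | select(.name == "Existed_Raid") | .state'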
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:32.249 17:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:32.249 17:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:32.249 17:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:32.249 17:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:32.249 17:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:32.249 "name": "Existed_Raid", 00:13:32.249 "uuid": "d04cb146-64c3-44cb-bbc7-3be103afe7cf", 00:13:32.249 "strip_size_kb": 64, 00:13:32.249 "state": "online", 00:13:32.249 "raid_level": "concat", 00:13:32.249 "superblock": false, 00:13:32.249 "num_base_bdevs": 3, 00:13:32.249 "num_base_bdevs_discovered": 3, 00:13:32.249 "num_base_bdevs_operational": 3, 00:13:32.249 "base_bdevs_list": [ 00:13:32.249 { 00:13:32.249 "name": "BaseBdev1", 00:13:32.249 "uuid": "cb9717e2-66ac-4de7-a8fc-784e9f68c2a9", 00:13:32.249 "is_configured": true, 00:13:32.249 "data_offset": 0, 00:13:32.249 "data_size": 65536 00:13:32.249 }, 00:13:32.249 { 00:13:32.249 "name": "BaseBdev2", 00:13:32.250 "uuid": "b5918aab-7bc2-4434-b7e1-49662f0a1282", 00:13:32.250 "is_configured": true, 00:13:32.250 "data_offset": 0, 00:13:32.250 "data_size": 65536 00:13:32.250 }, 00:13:32.250 { 00:13:32.250 "name": "BaseBdev3", 00:13:32.250 "uuid": "845b4538-7494-4348-8911-8ff75c1a17b4", 00:13:32.250 "is_configured": true, 00:13:32.250 "data_offset": 0, 00:13:32.250 "data_size": 65536 00:13:32.250 } 00:13:32.250 ] 00:13:32.250 }' 00:13:32.250 17:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:32.250 17:25:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:32.818 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:32.818 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:32.818 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:32.818 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:32.818 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:32.818 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:32.818 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:32.819 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:33.079 [2024-07-15 17:25:44.239955] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:33.079 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:33.079 "name": "Existed_Raid", 00:13:33.079 "aliases": [ 00:13:33.079 "d04cb146-64c3-44cb-bbc7-3be103afe7cf" 00:13:33.079 ], 00:13:33.079 "product_name": "Raid Volume", 00:13:33.079 "block_size": 512, 00:13:33.079 "num_blocks": 196608, 00:13:33.079 "uuid": "d04cb146-64c3-44cb-bbc7-3be103afe7cf", 
00:13:33.079 "assigned_rate_limits": { 00:13:33.079 "rw_ios_per_sec": 0, 00:13:33.079 "rw_mbytes_per_sec": 0, 00:13:33.079 "r_mbytes_per_sec": 0, 00:13:33.079 "w_mbytes_per_sec": 0 00:13:33.079 }, 00:13:33.079 "claimed": false, 00:13:33.079 "zoned": false, 00:13:33.079 "supported_io_types": { 00:13:33.079 "read": true, 00:13:33.079 "write": true, 00:13:33.079 "unmap": true, 00:13:33.079 "flush": true, 00:13:33.079 "reset": true, 00:13:33.079 "nvme_admin": false, 00:13:33.079 "nvme_io": false, 00:13:33.079 "nvme_io_md": false, 00:13:33.079 "write_zeroes": true, 00:13:33.079 "zcopy": false, 00:13:33.079 "get_zone_info": false, 00:13:33.079 "zone_management": false, 00:13:33.079 "zone_append": false, 00:13:33.079 "compare": false, 00:13:33.079 "compare_and_write": false, 00:13:33.079 "abort": false, 00:13:33.079 "seek_hole": false, 00:13:33.079 "seek_data": false, 00:13:33.079 "copy": false, 00:13:33.079 "nvme_iov_md": false 00:13:33.079 }, 00:13:33.079 "memory_domains": [ 00:13:33.079 { 00:13:33.079 "dma_device_id": "system", 00:13:33.079 "dma_device_type": 1 00:13:33.079 }, 00:13:33.079 { 00:13:33.079 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:33.079 "dma_device_type": 2 00:13:33.079 }, 00:13:33.079 { 00:13:33.079 "dma_device_id": "system", 00:13:33.079 "dma_device_type": 1 00:13:33.079 }, 00:13:33.079 { 00:13:33.079 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:33.079 "dma_device_type": 2 00:13:33.079 }, 00:13:33.079 { 00:13:33.079 "dma_device_id": "system", 00:13:33.079 "dma_device_type": 1 00:13:33.079 }, 00:13:33.079 { 00:13:33.079 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:33.079 "dma_device_type": 2 00:13:33.079 } 00:13:33.079 ], 00:13:33.079 "driver_specific": { 00:13:33.079 "raid": { 00:13:33.079 "uuid": "d04cb146-64c3-44cb-bbc7-3be103afe7cf", 00:13:33.079 "strip_size_kb": 64, 00:13:33.079 "state": "online", 00:13:33.079 "raid_level": "concat", 00:13:33.079 "superblock": false, 00:13:33.079 "num_base_bdevs": 3, 00:13:33.079 "num_base_bdevs_discovered": 3, 00:13:33.079 "num_base_bdevs_operational": 3, 00:13:33.079 "base_bdevs_list": [ 00:13:33.079 { 00:13:33.079 "name": "BaseBdev1", 00:13:33.079 "uuid": "cb9717e2-66ac-4de7-a8fc-784e9f68c2a9", 00:13:33.079 "is_configured": true, 00:13:33.079 "data_offset": 0, 00:13:33.079 "data_size": 65536 00:13:33.079 }, 00:13:33.079 { 00:13:33.079 "name": "BaseBdev2", 00:13:33.079 "uuid": "b5918aab-7bc2-4434-b7e1-49662f0a1282", 00:13:33.079 "is_configured": true, 00:13:33.079 "data_offset": 0, 00:13:33.079 "data_size": 65536 00:13:33.079 }, 00:13:33.079 { 00:13:33.079 "name": "BaseBdev3", 00:13:33.079 "uuid": "845b4538-7494-4348-8911-8ff75c1a17b4", 00:13:33.079 "is_configured": true, 00:13:33.079 "data_offset": 0, 00:13:33.079 "data_size": 65536 00:13:33.079 } 00:13:33.079 ] 00:13:33.079 } 00:13:33.079 } 00:13:33.079 }' 00:13:33.079 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:33.079 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:33.079 BaseBdev2 00:13:33.079 BaseBdev3' 00:13:33.079 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:33.079 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:33.079 17:25:44 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:33.339 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:33.339 "name": "BaseBdev1", 00:13:33.339 "aliases": [ 00:13:33.339 "cb9717e2-66ac-4de7-a8fc-784e9f68c2a9" 00:13:33.339 ], 00:13:33.339 "product_name": "Malloc disk", 00:13:33.339 "block_size": 512, 00:13:33.339 "num_blocks": 65536, 00:13:33.339 "uuid": "cb9717e2-66ac-4de7-a8fc-784e9f68c2a9", 00:13:33.339 "assigned_rate_limits": { 00:13:33.339 "rw_ios_per_sec": 0, 00:13:33.339 "rw_mbytes_per_sec": 0, 00:13:33.339 "r_mbytes_per_sec": 0, 00:13:33.339 "w_mbytes_per_sec": 0 00:13:33.339 }, 00:13:33.339 "claimed": true, 00:13:33.339 "claim_type": "exclusive_write", 00:13:33.339 "zoned": false, 00:13:33.339 "supported_io_types": { 00:13:33.339 "read": true, 00:13:33.339 "write": true, 00:13:33.339 "unmap": true, 00:13:33.339 "flush": true, 00:13:33.339 "reset": true, 00:13:33.339 "nvme_admin": false, 00:13:33.339 "nvme_io": false, 00:13:33.339 "nvme_io_md": false, 00:13:33.339 "write_zeroes": true, 00:13:33.339 "zcopy": true, 00:13:33.339 "get_zone_info": false, 00:13:33.339 "zone_management": false, 00:13:33.339 "zone_append": false, 00:13:33.339 "compare": false, 00:13:33.339 "compare_and_write": false, 00:13:33.339 "abort": true, 00:13:33.339 "seek_hole": false, 00:13:33.339 "seek_data": false, 00:13:33.339 "copy": true, 00:13:33.339 "nvme_iov_md": false 00:13:33.339 }, 00:13:33.339 "memory_domains": [ 00:13:33.339 { 00:13:33.339 "dma_device_id": "system", 00:13:33.339 "dma_device_type": 1 00:13:33.339 }, 00:13:33.339 { 00:13:33.339 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:33.339 "dma_device_type": 2 00:13:33.339 } 00:13:33.339 ], 00:13:33.339 "driver_specific": {} 00:13:33.339 }' 00:13:33.339 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:33.339 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:33.339 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:33.339 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:33.599 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:33.599 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:33.599 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:33.599 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:33.599 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:33.599 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:33.599 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:33.599 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:33.599 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:33.599 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:33.599 17:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:33.859 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:33.859 "name": "BaseBdev2", 
00:13:33.859 "aliases": [ 00:13:33.859 "b5918aab-7bc2-4434-b7e1-49662f0a1282" 00:13:33.859 ], 00:13:33.859 "product_name": "Malloc disk", 00:13:33.859 "block_size": 512, 00:13:33.859 "num_blocks": 65536, 00:13:33.859 "uuid": "b5918aab-7bc2-4434-b7e1-49662f0a1282", 00:13:33.859 "assigned_rate_limits": { 00:13:33.859 "rw_ios_per_sec": 0, 00:13:33.859 "rw_mbytes_per_sec": 0, 00:13:33.859 "r_mbytes_per_sec": 0, 00:13:33.859 "w_mbytes_per_sec": 0 00:13:33.859 }, 00:13:33.859 "claimed": true, 00:13:33.859 "claim_type": "exclusive_write", 00:13:33.859 "zoned": false, 00:13:33.859 "supported_io_types": { 00:13:33.859 "read": true, 00:13:33.859 "write": true, 00:13:33.859 "unmap": true, 00:13:33.859 "flush": true, 00:13:33.859 "reset": true, 00:13:33.859 "nvme_admin": false, 00:13:33.859 "nvme_io": false, 00:13:33.859 "nvme_io_md": false, 00:13:33.859 "write_zeroes": true, 00:13:33.859 "zcopy": true, 00:13:33.859 "get_zone_info": false, 00:13:33.859 "zone_management": false, 00:13:33.859 "zone_append": false, 00:13:33.859 "compare": false, 00:13:33.859 "compare_and_write": false, 00:13:33.859 "abort": true, 00:13:33.859 "seek_hole": false, 00:13:33.859 "seek_data": false, 00:13:33.859 "copy": true, 00:13:33.859 "nvme_iov_md": false 00:13:33.859 }, 00:13:33.859 "memory_domains": [ 00:13:33.859 { 00:13:33.859 "dma_device_id": "system", 00:13:33.859 "dma_device_type": 1 00:13:33.859 }, 00:13:33.859 { 00:13:33.859 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:33.859 "dma_device_type": 2 00:13:33.859 } 00:13:33.859 ], 00:13:33.859 "driver_specific": {} 00:13:33.859 }' 00:13:33.859 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:33.859 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:33.859 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:33.859 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:34.119 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:34.119 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:34.119 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:34.119 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:34.119 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:34.119 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:34.119 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:34.119 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:34.119 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:34.119 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:34.119 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:34.379 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:34.379 "name": "BaseBdev3", 00:13:34.379 "aliases": [ 00:13:34.379 "845b4538-7494-4348-8911-8ff75c1a17b4" 00:13:34.379 ], 00:13:34.379 "product_name": "Malloc disk", 00:13:34.379 "block_size": 512, 
00:13:34.379 "num_blocks": 65536, 00:13:34.379 "uuid": "845b4538-7494-4348-8911-8ff75c1a17b4", 00:13:34.379 "assigned_rate_limits": { 00:13:34.379 "rw_ios_per_sec": 0, 00:13:34.379 "rw_mbytes_per_sec": 0, 00:13:34.379 "r_mbytes_per_sec": 0, 00:13:34.379 "w_mbytes_per_sec": 0 00:13:34.379 }, 00:13:34.379 "claimed": true, 00:13:34.379 "claim_type": "exclusive_write", 00:13:34.379 "zoned": false, 00:13:34.379 "supported_io_types": { 00:13:34.379 "read": true, 00:13:34.379 "write": true, 00:13:34.379 "unmap": true, 00:13:34.379 "flush": true, 00:13:34.379 "reset": true, 00:13:34.379 "nvme_admin": false, 00:13:34.379 "nvme_io": false, 00:13:34.379 "nvme_io_md": false, 00:13:34.379 "write_zeroes": true, 00:13:34.379 "zcopy": true, 00:13:34.379 "get_zone_info": false, 00:13:34.379 "zone_management": false, 00:13:34.379 "zone_append": false, 00:13:34.379 "compare": false, 00:13:34.379 "compare_and_write": false, 00:13:34.379 "abort": true, 00:13:34.379 "seek_hole": false, 00:13:34.379 "seek_data": false, 00:13:34.379 "copy": true, 00:13:34.379 "nvme_iov_md": false 00:13:34.379 }, 00:13:34.379 "memory_domains": [ 00:13:34.379 { 00:13:34.379 "dma_device_id": "system", 00:13:34.379 "dma_device_type": 1 00:13:34.379 }, 00:13:34.379 { 00:13:34.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:34.379 "dma_device_type": 2 00:13:34.379 } 00:13:34.379 ], 00:13:34.379 "driver_specific": {} 00:13:34.379 }' 00:13:34.379 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:34.379 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:34.379 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:34.379 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:34.640 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:34.640 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:34.640 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:34.640 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:34.640 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:34.640 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:34.640 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:34.900 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:34.900 17:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:34.900 [2024-07-15 17:25:46.116523] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:34.900 [2024-07-15 17:25:46.116539] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:34.900 [2024-07-15 17:25:46.116567] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:34.900 17:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:34.900 17:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:13:34.900 17:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:34.900 17:25:46 
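verify_raid_bdev_properties, traced across the three BaseBdev dumps above, compares every configured member against the Raid Volume itself: the member names come from the raid's driver_specific listing, and for each one the test asserts that block_size, md_size, md_interleave and dif_type match what bdev_get_bdevs reported for Existed_Raid. A hedged sketch of the same probes, using only the jq filters visible in the trace:

  # Names of the configured members of Existed_Raid.
  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid \
      | jq -r '.[] | .driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'
  # Per member (BaseBdev1 shown), the fields compared against the raid volume;
  # md_size, md_interleave and dif_type are absent on malloc bdevs, so both
  # sides of the comparison come back null, which is what the trace shows.
  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 \
      | jq '.[] | .block_size, .md_size, .md_interleave, .dif_type'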
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:34.900 17:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:34.900 17:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:13:34.900 17:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:34.900 17:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:34.900 17:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:34.900 17:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:34.900 17:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:34.900 17:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:34.900 17:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:34.900 17:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:34.900 17:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:34.900 17:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.900 17:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:35.160 17:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:35.160 "name": "Existed_Raid", 00:13:35.160 "uuid": "d04cb146-64c3-44cb-bbc7-3be103afe7cf", 00:13:35.160 "strip_size_kb": 64, 00:13:35.160 "state": "offline", 00:13:35.160 "raid_level": "concat", 00:13:35.160 "superblock": false, 00:13:35.160 "num_base_bdevs": 3, 00:13:35.160 "num_base_bdevs_discovered": 2, 00:13:35.160 "num_base_bdevs_operational": 2, 00:13:35.160 "base_bdevs_list": [ 00:13:35.160 { 00:13:35.160 "name": null, 00:13:35.160 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:35.160 "is_configured": false, 00:13:35.160 "data_offset": 0, 00:13:35.160 "data_size": 65536 00:13:35.160 }, 00:13:35.160 { 00:13:35.160 "name": "BaseBdev2", 00:13:35.160 "uuid": "b5918aab-7bc2-4434-b7e1-49662f0a1282", 00:13:35.160 "is_configured": true, 00:13:35.160 "data_offset": 0, 00:13:35.160 "data_size": 65536 00:13:35.160 }, 00:13:35.160 { 00:13:35.160 "name": "BaseBdev3", 00:13:35.160 "uuid": "845b4538-7494-4348-8911-8ff75c1a17b4", 00:13:35.160 "is_configured": true, 00:13:35.160 "data_offset": 0, 00:13:35.160 "data_size": 65536 00:13:35.160 } 00:13:35.160 ] 00:13:35.160 }' 00:13:35.160 17:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:35.160 17:25:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:35.728 17:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:35.729 17:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:35.729 17:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:35.729 17:25:46 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:35.989 17:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:35.989 17:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:35.989 17:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:35.989 [2024-07-15 17:25:47.251401] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:35.989 17:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:35.989 17:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:36.249 17:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:36.249 17:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:36.249 17:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:36.249 17:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:36.249 17:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:36.509 [2024-07-15 17:25:47.670023] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:36.509 [2024-07-15 17:25:47.670049] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25b2e90 name Existed_Raid, state offline 00:13:36.509 17:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:36.509 17:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:36.509 17:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:36.509 17:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:36.769 17:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:36.769 17:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:36.769 17:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:36.769 17:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:36.769 17:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:36.769 17:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:37.029 BaseBdev2 00:13:37.029 17:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:37.029 17:25:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:37.029 17:25:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:37.029 17:25:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 
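Because has_redundancy returns 1 for concat (the case statement traced above), losing any single member is fatal: deleting BaseBdev1 drives Existed_Raid from online straight to offline rather than to a degraded state, and once the remaining members are deleted raid_bdev_cleanup tears the raid down completely, after which bdev_raid_get_bdevs no longer lists it. A hedged manual reproduction of that first transition:

  # Remove one member of the concat array...
  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1
  # ...and the raid is expected to report "offline" with 2 of 3 members left.
  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
      | jq -r '.[] | select(.name == "Existed_Raid") | .state'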
00:13:37.029 17:25:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:37.029 17:25:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:37.029 17:25:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:37.029 17:25:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:37.335 [ 00:13:37.335 { 00:13:37.335 "name": "BaseBdev2", 00:13:37.335 "aliases": [ 00:13:37.335 "5b43da74-c1ba-45b3-b98e-e02a4a82cf7c" 00:13:37.335 ], 00:13:37.335 "product_name": "Malloc disk", 00:13:37.335 "block_size": 512, 00:13:37.335 "num_blocks": 65536, 00:13:37.335 "uuid": "5b43da74-c1ba-45b3-b98e-e02a4a82cf7c", 00:13:37.335 "assigned_rate_limits": { 00:13:37.335 "rw_ios_per_sec": 0, 00:13:37.335 "rw_mbytes_per_sec": 0, 00:13:37.335 "r_mbytes_per_sec": 0, 00:13:37.335 "w_mbytes_per_sec": 0 00:13:37.335 }, 00:13:37.335 "claimed": false, 00:13:37.335 "zoned": false, 00:13:37.335 "supported_io_types": { 00:13:37.335 "read": true, 00:13:37.335 "write": true, 00:13:37.335 "unmap": true, 00:13:37.335 "flush": true, 00:13:37.335 "reset": true, 00:13:37.335 "nvme_admin": false, 00:13:37.335 "nvme_io": false, 00:13:37.335 "nvme_io_md": false, 00:13:37.335 "write_zeroes": true, 00:13:37.335 "zcopy": true, 00:13:37.335 "get_zone_info": false, 00:13:37.335 "zone_management": false, 00:13:37.335 "zone_append": false, 00:13:37.335 "compare": false, 00:13:37.335 "compare_and_write": false, 00:13:37.335 "abort": true, 00:13:37.335 "seek_hole": false, 00:13:37.335 "seek_data": false, 00:13:37.335 "copy": true, 00:13:37.335 "nvme_iov_md": false 00:13:37.335 }, 00:13:37.335 "memory_domains": [ 00:13:37.335 { 00:13:37.335 "dma_device_id": "system", 00:13:37.335 "dma_device_type": 1 00:13:37.335 }, 00:13:37.335 { 00:13:37.335 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:37.335 "dma_device_type": 2 00:13:37.335 } 00:13:37.335 ], 00:13:37.335 "driver_specific": {} 00:13:37.335 } 00:13:37.335 ] 00:13:37.335 17:25:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:37.335 17:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:37.335 17:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:37.335 17:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:37.628 BaseBdev3 00:13:37.628 17:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:37.628 17:25:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:37.628 17:25:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:37.628 17:25:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:37.628 17:25:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:37.628 17:25:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:37.628 17:25:48 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:37.628 17:25:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:37.887 [ 00:13:37.887 { 00:13:37.887 "name": "BaseBdev3", 00:13:37.887 "aliases": [ 00:13:37.887 "ca3422f3-ed1f-4beb-bdc4-bc7bf222dfe2" 00:13:37.887 ], 00:13:37.887 "product_name": "Malloc disk", 00:13:37.887 "block_size": 512, 00:13:37.887 "num_blocks": 65536, 00:13:37.887 "uuid": "ca3422f3-ed1f-4beb-bdc4-bc7bf222dfe2", 00:13:37.887 "assigned_rate_limits": { 00:13:37.887 "rw_ios_per_sec": 0, 00:13:37.887 "rw_mbytes_per_sec": 0, 00:13:37.887 "r_mbytes_per_sec": 0, 00:13:37.887 "w_mbytes_per_sec": 0 00:13:37.887 }, 00:13:37.887 "claimed": false, 00:13:37.887 "zoned": false, 00:13:37.887 "supported_io_types": { 00:13:37.887 "read": true, 00:13:37.888 "write": true, 00:13:37.888 "unmap": true, 00:13:37.888 "flush": true, 00:13:37.888 "reset": true, 00:13:37.888 "nvme_admin": false, 00:13:37.888 "nvme_io": false, 00:13:37.888 "nvme_io_md": false, 00:13:37.888 "write_zeroes": true, 00:13:37.888 "zcopy": true, 00:13:37.888 "get_zone_info": false, 00:13:37.888 "zone_management": false, 00:13:37.888 "zone_append": false, 00:13:37.888 "compare": false, 00:13:37.888 "compare_and_write": false, 00:13:37.888 "abort": true, 00:13:37.888 "seek_hole": false, 00:13:37.888 "seek_data": false, 00:13:37.888 "copy": true, 00:13:37.888 "nvme_iov_md": false 00:13:37.888 }, 00:13:37.888 "memory_domains": [ 00:13:37.888 { 00:13:37.888 "dma_device_id": "system", 00:13:37.888 "dma_device_type": 1 00:13:37.888 }, 00:13:37.888 { 00:13:37.888 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:37.888 "dma_device_type": 2 00:13:37.888 } 00:13:37.888 ], 00:13:37.888 "driver_specific": {} 00:13:37.888 } 00:13:37.888 ] 00:13:37.888 17:25:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:37.888 17:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:37.888 17:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:37.888 17:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:38.148 [2024-07-15 17:25:49.209354] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:38.148 [2024-07-15 17:25:49.209380] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:38.148 [2024-07-15 17:25:49.209392] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:38.148 [2024-07-15 17:25:49.210421] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:38.148 17:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:38.148 17:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:38.148 17:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:38.148 17:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:13:38.148 17:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:38.148 17:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:38.148 17:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:38.148 17:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:38.148 17:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:38.148 17:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:38.148 17:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:38.148 17:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:38.148 17:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:38.148 "name": "Existed_Raid", 00:13:38.148 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:38.148 "strip_size_kb": 64, 00:13:38.148 "state": "configuring", 00:13:38.148 "raid_level": "concat", 00:13:38.148 "superblock": false, 00:13:38.148 "num_base_bdevs": 3, 00:13:38.148 "num_base_bdevs_discovered": 2, 00:13:38.148 "num_base_bdevs_operational": 3, 00:13:38.148 "base_bdevs_list": [ 00:13:38.148 { 00:13:38.148 "name": "BaseBdev1", 00:13:38.148 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:38.148 "is_configured": false, 00:13:38.148 "data_offset": 0, 00:13:38.148 "data_size": 0 00:13:38.148 }, 00:13:38.148 { 00:13:38.148 "name": "BaseBdev2", 00:13:38.148 "uuid": "5b43da74-c1ba-45b3-b98e-e02a4a82cf7c", 00:13:38.148 "is_configured": true, 00:13:38.148 "data_offset": 0, 00:13:38.148 "data_size": 65536 00:13:38.148 }, 00:13:38.148 { 00:13:38.148 "name": "BaseBdev3", 00:13:38.148 "uuid": "ca3422f3-ed1f-4beb-bdc4-bc7bf222dfe2", 00:13:38.148 "is_configured": true, 00:13:38.148 "data_offset": 0, 00:13:38.148 "data_size": 65536 00:13:38.148 } 00:13:38.148 ] 00:13:38.148 }' 00:13:38.148 17:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:38.148 17:25:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:38.718 17:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:38.978 [2024-07-15 17:25:50.155741] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:38.978 17:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:38.978 17:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:38.978 17:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:38.978 17:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:38.978 17:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:38.979 17:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:38.979 17:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 
-- # local raid_bdev_info 00:13:38.979 17:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:38.979 17:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:38.979 17:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:38.979 17:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:38.979 17:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:39.239 17:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:39.239 "name": "Existed_Raid", 00:13:39.239 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:39.239 "strip_size_kb": 64, 00:13:39.239 "state": "configuring", 00:13:39.239 "raid_level": "concat", 00:13:39.239 "superblock": false, 00:13:39.239 "num_base_bdevs": 3, 00:13:39.239 "num_base_bdevs_discovered": 1, 00:13:39.239 "num_base_bdevs_operational": 3, 00:13:39.239 "base_bdevs_list": [ 00:13:39.239 { 00:13:39.239 "name": "BaseBdev1", 00:13:39.239 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:39.239 "is_configured": false, 00:13:39.239 "data_offset": 0, 00:13:39.239 "data_size": 0 00:13:39.239 }, 00:13:39.239 { 00:13:39.239 "name": null, 00:13:39.239 "uuid": "5b43da74-c1ba-45b3-b98e-e02a4a82cf7c", 00:13:39.239 "is_configured": false, 00:13:39.239 "data_offset": 0, 00:13:39.239 "data_size": 65536 00:13:39.239 }, 00:13:39.239 { 00:13:39.239 "name": "BaseBdev3", 00:13:39.239 "uuid": "ca3422f3-ed1f-4beb-bdc4-bc7bf222dfe2", 00:13:39.239 "is_configured": true, 00:13:39.239 "data_offset": 0, 00:13:39.239 "data_size": 65536 00:13:39.239 } 00:13:39.239 ] 00:13:39.239 }' 00:13:39.239 17:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:39.239 17:25:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:39.809 17:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:39.809 17:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:39.809 17:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:13:40.067 17:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:40.067 [2024-07-15 17:25:51.287507] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:40.067 BaseBdev1 00:13:40.067 17:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:40.067 17:25:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:40.067 17:25:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:40.067 17:25:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:40.067 17:25:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:40.067 17:25:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:13:40.067 17:25:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:40.326 17:25:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:40.586 [ 00:13:40.586 { 00:13:40.586 "name": "BaseBdev1", 00:13:40.586 "aliases": [ 00:13:40.586 "f1d5c000-f1ff-4576-ae34-a4c8e5de1f21" 00:13:40.586 ], 00:13:40.586 "product_name": "Malloc disk", 00:13:40.586 "block_size": 512, 00:13:40.586 "num_blocks": 65536, 00:13:40.586 "uuid": "f1d5c000-f1ff-4576-ae34-a4c8e5de1f21", 00:13:40.586 "assigned_rate_limits": { 00:13:40.586 "rw_ios_per_sec": 0, 00:13:40.586 "rw_mbytes_per_sec": 0, 00:13:40.586 "r_mbytes_per_sec": 0, 00:13:40.586 "w_mbytes_per_sec": 0 00:13:40.586 }, 00:13:40.586 "claimed": true, 00:13:40.586 "claim_type": "exclusive_write", 00:13:40.586 "zoned": false, 00:13:40.586 "supported_io_types": { 00:13:40.586 "read": true, 00:13:40.586 "write": true, 00:13:40.586 "unmap": true, 00:13:40.586 "flush": true, 00:13:40.586 "reset": true, 00:13:40.586 "nvme_admin": false, 00:13:40.586 "nvme_io": false, 00:13:40.586 "nvme_io_md": false, 00:13:40.586 "write_zeroes": true, 00:13:40.586 "zcopy": true, 00:13:40.586 "get_zone_info": false, 00:13:40.586 "zone_management": false, 00:13:40.586 "zone_append": false, 00:13:40.586 "compare": false, 00:13:40.586 "compare_and_write": false, 00:13:40.586 "abort": true, 00:13:40.586 "seek_hole": false, 00:13:40.586 "seek_data": false, 00:13:40.586 "copy": true, 00:13:40.586 "nvme_iov_md": false 00:13:40.586 }, 00:13:40.586 "memory_domains": [ 00:13:40.586 { 00:13:40.586 "dma_device_id": "system", 00:13:40.586 "dma_device_type": 1 00:13:40.586 }, 00:13:40.586 { 00:13:40.586 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:40.586 "dma_device_type": 2 00:13:40.586 } 00:13:40.586 ], 00:13:40.586 "driver_specific": {} 00:13:40.586 } 00:13:40.586 ] 00:13:40.586 17:25:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:40.586 17:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:40.586 17:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:40.586 17:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:40.586 17:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:40.586 17:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:40.586 17:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:40.586 17:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:40.586 17:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:40.586 17:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:40.586 17:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:40.586 17:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:13:40.586 17:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:40.586 17:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:40.586 "name": "Existed_Raid", 00:13:40.586 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:40.586 "strip_size_kb": 64, 00:13:40.586 "state": "configuring", 00:13:40.586 "raid_level": "concat", 00:13:40.586 "superblock": false, 00:13:40.586 "num_base_bdevs": 3, 00:13:40.586 "num_base_bdevs_discovered": 2, 00:13:40.586 "num_base_bdevs_operational": 3, 00:13:40.586 "base_bdevs_list": [ 00:13:40.586 { 00:13:40.586 "name": "BaseBdev1", 00:13:40.586 "uuid": "f1d5c000-f1ff-4576-ae34-a4c8e5de1f21", 00:13:40.586 "is_configured": true, 00:13:40.586 "data_offset": 0, 00:13:40.586 "data_size": 65536 00:13:40.586 }, 00:13:40.586 { 00:13:40.586 "name": null, 00:13:40.586 "uuid": "5b43da74-c1ba-45b3-b98e-e02a4a82cf7c", 00:13:40.586 "is_configured": false, 00:13:40.586 "data_offset": 0, 00:13:40.586 "data_size": 65536 00:13:40.586 }, 00:13:40.586 { 00:13:40.586 "name": "BaseBdev3", 00:13:40.586 "uuid": "ca3422f3-ed1f-4beb-bdc4-bc7bf222dfe2", 00:13:40.586 "is_configured": true, 00:13:40.586 "data_offset": 0, 00:13:40.586 "data_size": 65536 00:13:40.586 } 00:13:40.586 ] 00:13:40.586 }' 00:13:40.586 17:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:40.586 17:25:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:41.154 17:25:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:41.154 17:25:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:41.413 17:25:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:41.413 17:25:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:41.672 [2024-07-15 17:25:52.779315] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:41.672 17:25:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:41.672 17:25:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:41.672 17:25:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:41.672 17:25:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:41.672 17:25:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:41.672 17:25:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:41.672 17:25:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:41.672 17:25:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:41.672 17:25:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:41.672 17:25:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:41.672 17:25:52 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:41.672 17:25:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:41.931 17:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:41.931 "name": "Existed_Raid", 00:13:41.931 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:41.931 "strip_size_kb": 64, 00:13:41.931 "state": "configuring", 00:13:41.931 "raid_level": "concat", 00:13:41.931 "superblock": false, 00:13:41.931 "num_base_bdevs": 3, 00:13:41.931 "num_base_bdevs_discovered": 1, 00:13:41.931 "num_base_bdevs_operational": 3, 00:13:41.931 "base_bdevs_list": [ 00:13:41.931 { 00:13:41.931 "name": "BaseBdev1", 00:13:41.931 "uuid": "f1d5c000-f1ff-4576-ae34-a4c8e5de1f21", 00:13:41.931 "is_configured": true, 00:13:41.931 "data_offset": 0, 00:13:41.931 "data_size": 65536 00:13:41.931 }, 00:13:41.931 { 00:13:41.931 "name": null, 00:13:41.931 "uuid": "5b43da74-c1ba-45b3-b98e-e02a4a82cf7c", 00:13:41.931 "is_configured": false, 00:13:41.931 "data_offset": 0, 00:13:41.931 "data_size": 65536 00:13:41.931 }, 00:13:41.931 { 00:13:41.931 "name": null, 00:13:41.931 "uuid": "ca3422f3-ed1f-4beb-bdc4-bc7bf222dfe2", 00:13:41.931 "is_configured": false, 00:13:41.931 "data_offset": 0, 00:13:41.931 "data_size": 65536 00:13:41.931 } 00:13:41.931 ] 00:13:41.931 }' 00:13:41.931 17:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:41.931 17:25:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:42.500 17:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:42.500 17:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:42.500 17:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:42.500 17:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:42.760 [2024-07-15 17:25:53.882177] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:42.760 17:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:42.760 17:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:42.760 17:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:42.760 17:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:42.760 17:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:42.760 17:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:42.760 17:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:42.760 17:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:42.760 17:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:42.760 17:25:53 
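The @317/@321 pair above exercises removing and re-adding a member while the raid is still configuring: bdev_raid_remove_base_bdev clears the BaseBdev3 slot (its entry drops back to a null name with is_configured false), and bdev_raid_add_base_bdev hands the same bdev back to Existed_Raid, which claims it again. The equivalent RPCs, as they appear in the trace:

  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3
  # Slot 2 should read false until the bdev is re-added.
  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
      | jq '.[0].base_bdevs_list[2].is_configured'
  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3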
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:42.760 17:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:42.760 17:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:43.021 17:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:43.021 "name": "Existed_Raid", 00:13:43.021 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:43.021 "strip_size_kb": 64, 00:13:43.021 "state": "configuring", 00:13:43.021 "raid_level": "concat", 00:13:43.021 "superblock": false, 00:13:43.021 "num_base_bdevs": 3, 00:13:43.021 "num_base_bdevs_discovered": 2, 00:13:43.021 "num_base_bdevs_operational": 3, 00:13:43.021 "base_bdevs_list": [ 00:13:43.021 { 00:13:43.021 "name": "BaseBdev1", 00:13:43.021 "uuid": "f1d5c000-f1ff-4576-ae34-a4c8e5de1f21", 00:13:43.021 "is_configured": true, 00:13:43.021 "data_offset": 0, 00:13:43.021 "data_size": 65536 00:13:43.021 }, 00:13:43.021 { 00:13:43.021 "name": null, 00:13:43.021 "uuid": "5b43da74-c1ba-45b3-b98e-e02a4a82cf7c", 00:13:43.021 "is_configured": false, 00:13:43.021 "data_offset": 0, 00:13:43.021 "data_size": 65536 00:13:43.021 }, 00:13:43.021 { 00:13:43.021 "name": "BaseBdev3", 00:13:43.021 "uuid": "ca3422f3-ed1f-4beb-bdc4-bc7bf222dfe2", 00:13:43.021 "is_configured": true, 00:13:43.021 "data_offset": 0, 00:13:43.021 "data_size": 65536 00:13:43.021 } 00:13:43.021 ] 00:13:43.021 }' 00:13:43.021 17:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:43.021 17:25:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:43.592 17:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:43.592 17:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:43.592 17:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:13:43.592 17:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:43.852 [2024-07-15 17:25:54.997000] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:43.852 17:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:43.852 17:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:43.852 17:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:43.852 17:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:43.852 17:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:43.852 17:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:43.852 17:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:43.852 17:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:43.852 17:25:55 
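While the raid is still configuring, deleting a member does not take the array down the way it did when it was online: the @325 delete of BaseBdev1 above simply empties slot 0 again (name null, is_configured false) and the expected state stays "configuring". Here the test watches individual slots with indexed jq probes rather than the whole-array select used elsewhere, for example:

  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1
  # Slot 0 of the configuring raid should now be unconfigured.
  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
      | jq '.[0].base_bdevs_list[0].is_configured'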
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:43.852 17:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:43.852 17:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:43.852 17:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:44.111 17:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:44.111 "name": "Existed_Raid", 00:13:44.111 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:44.111 "strip_size_kb": 64, 00:13:44.111 "state": "configuring", 00:13:44.111 "raid_level": "concat", 00:13:44.111 "superblock": false, 00:13:44.111 "num_base_bdevs": 3, 00:13:44.111 "num_base_bdevs_discovered": 1, 00:13:44.111 "num_base_bdevs_operational": 3, 00:13:44.111 "base_bdevs_list": [ 00:13:44.111 { 00:13:44.111 "name": null, 00:13:44.111 "uuid": "f1d5c000-f1ff-4576-ae34-a4c8e5de1f21", 00:13:44.111 "is_configured": false, 00:13:44.111 "data_offset": 0, 00:13:44.111 "data_size": 65536 00:13:44.111 }, 00:13:44.111 { 00:13:44.111 "name": null, 00:13:44.111 "uuid": "5b43da74-c1ba-45b3-b98e-e02a4a82cf7c", 00:13:44.111 "is_configured": false, 00:13:44.111 "data_offset": 0, 00:13:44.111 "data_size": 65536 00:13:44.111 }, 00:13:44.111 { 00:13:44.111 "name": "BaseBdev3", 00:13:44.111 "uuid": "ca3422f3-ed1f-4beb-bdc4-bc7bf222dfe2", 00:13:44.111 "is_configured": true, 00:13:44.111 "data_offset": 0, 00:13:44.111 "data_size": 65536 00:13:44.111 } 00:13:44.111 ] 00:13:44.111 }' 00:13:44.111 17:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:44.111 17:25:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:44.679 17:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:44.679 17:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:44.679 17:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:13:44.679 17:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:44.939 [2024-07-15 17:25:56.137575] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:44.939 17:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:44.939 17:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:44.939 17:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:44.939 17:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:44.939 17:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:44.939 17:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:44.939 17:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:13:44.939 17:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:44.939 17:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:44.939 17:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:44.939 17:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:44.940 17:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:45.200 17:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:45.200 "name": "Existed_Raid", 00:13:45.200 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.200 "strip_size_kb": 64, 00:13:45.200 "state": "configuring", 00:13:45.200 "raid_level": "concat", 00:13:45.200 "superblock": false, 00:13:45.200 "num_base_bdevs": 3, 00:13:45.200 "num_base_bdevs_discovered": 2, 00:13:45.200 "num_base_bdevs_operational": 3, 00:13:45.200 "base_bdevs_list": [ 00:13:45.200 { 00:13:45.200 "name": null, 00:13:45.200 "uuid": "f1d5c000-f1ff-4576-ae34-a4c8e5de1f21", 00:13:45.200 "is_configured": false, 00:13:45.200 "data_offset": 0, 00:13:45.200 "data_size": 65536 00:13:45.200 }, 00:13:45.200 { 00:13:45.200 "name": "BaseBdev2", 00:13:45.200 "uuid": "5b43da74-c1ba-45b3-b98e-e02a4a82cf7c", 00:13:45.200 "is_configured": true, 00:13:45.200 "data_offset": 0, 00:13:45.200 "data_size": 65536 00:13:45.200 }, 00:13:45.200 { 00:13:45.200 "name": "BaseBdev3", 00:13:45.200 "uuid": "ca3422f3-ed1f-4beb-bdc4-bc7bf222dfe2", 00:13:45.200 "is_configured": true, 00:13:45.200 "data_offset": 0, 00:13:45.200 "data_size": 65536 00:13:45.200 } 00:13:45.200 ] 00:13:45.200 }' 00:13:45.200 17:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:45.200 17:25:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:45.771 17:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:45.771 17:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:46.146 17:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:13:46.146 17:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:46.146 17:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:46.146 17:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u f1d5c000-f1ff-4576-ae34-a4c8e5de1f21 00:13:46.406 [2024-07-15 17:25:57.465803] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:13:46.406 [2024-07-15 17:25:57.465827] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25b1250 00:13:46.406 [2024-07-15 17:25:57.465832] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:46.406 [2024-07-15 17:25:57.465981] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25b26a0 00:13:46.406 [2024-07-15 17:25:57.466069] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25b1250 00:13:46.406 [2024-07-15 17:25:57.466074] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x25b1250 00:13:46.406 [2024-07-15 17:25:57.466191] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:46.406 NewBaseBdev 00:13:46.406 17:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:13:46.406 17:25:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:13:46.406 17:25:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:46.406 17:25:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:46.406 17:25:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:46.406 17:25:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:46.406 17:25:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:46.406 17:25:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:13:46.666 [ 00:13:46.666 { 00:13:46.666 "name": "NewBaseBdev", 00:13:46.666 "aliases": [ 00:13:46.666 "f1d5c000-f1ff-4576-ae34-a4c8e5de1f21" 00:13:46.666 ], 00:13:46.666 "product_name": "Malloc disk", 00:13:46.666 "block_size": 512, 00:13:46.666 "num_blocks": 65536, 00:13:46.666 "uuid": "f1d5c000-f1ff-4576-ae34-a4c8e5de1f21", 00:13:46.666 "assigned_rate_limits": { 00:13:46.666 "rw_ios_per_sec": 0, 00:13:46.667 "rw_mbytes_per_sec": 0, 00:13:46.667 "r_mbytes_per_sec": 0, 00:13:46.667 "w_mbytes_per_sec": 0 00:13:46.667 }, 00:13:46.667 "claimed": true, 00:13:46.667 "claim_type": "exclusive_write", 00:13:46.667 "zoned": false, 00:13:46.667 "supported_io_types": { 00:13:46.667 "read": true, 00:13:46.667 "write": true, 00:13:46.667 "unmap": true, 00:13:46.667 "flush": true, 00:13:46.667 "reset": true, 00:13:46.667 "nvme_admin": false, 00:13:46.667 "nvme_io": false, 00:13:46.667 "nvme_io_md": false, 00:13:46.667 "write_zeroes": true, 00:13:46.667 "zcopy": true, 00:13:46.667 "get_zone_info": false, 00:13:46.667 "zone_management": false, 00:13:46.667 "zone_append": false, 00:13:46.667 "compare": false, 00:13:46.667 "compare_and_write": false, 00:13:46.667 "abort": true, 00:13:46.667 "seek_hole": false, 00:13:46.667 "seek_data": false, 00:13:46.667 "copy": true, 00:13:46.667 "nvme_iov_md": false 00:13:46.667 }, 00:13:46.667 "memory_domains": [ 00:13:46.667 { 00:13:46.667 "dma_device_id": "system", 00:13:46.667 "dma_device_type": 1 00:13:46.667 }, 00:13:46.667 { 00:13:46.667 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:46.667 "dma_device_type": 2 00:13:46.667 } 00:13:46.667 ], 00:13:46.667 "driver_specific": {} 00:13:46.667 } 00:13:46.667 ] 00:13:46.667 17:25:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:46.667 17:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:13:46.667 17:25:57 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:46.667 17:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:46.667 17:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:46.667 17:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:46.667 17:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:46.667 17:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:46.667 17:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:46.667 17:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:46.667 17:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:46.667 17:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:46.667 17:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:46.926 17:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:46.926 "name": "Existed_Raid", 00:13:46.926 "uuid": "0e48fd65-e9ad-4c5c-90c9-896ecce8efde", 00:13:46.926 "strip_size_kb": 64, 00:13:46.926 "state": "online", 00:13:46.926 "raid_level": "concat", 00:13:46.926 "superblock": false, 00:13:46.926 "num_base_bdevs": 3, 00:13:46.926 "num_base_bdevs_discovered": 3, 00:13:46.926 "num_base_bdevs_operational": 3, 00:13:46.926 "base_bdevs_list": [ 00:13:46.926 { 00:13:46.926 "name": "NewBaseBdev", 00:13:46.926 "uuid": "f1d5c000-f1ff-4576-ae34-a4c8e5de1f21", 00:13:46.926 "is_configured": true, 00:13:46.926 "data_offset": 0, 00:13:46.926 "data_size": 65536 00:13:46.926 }, 00:13:46.926 { 00:13:46.926 "name": "BaseBdev2", 00:13:46.926 "uuid": "5b43da74-c1ba-45b3-b98e-e02a4a82cf7c", 00:13:46.926 "is_configured": true, 00:13:46.926 "data_offset": 0, 00:13:46.926 "data_size": 65536 00:13:46.926 }, 00:13:46.926 { 00:13:46.926 "name": "BaseBdev3", 00:13:46.926 "uuid": "ca3422f3-ed1f-4beb-bdc4-bc7bf222dfe2", 00:13:46.926 "is_configured": true, 00:13:46.926 "data_offset": 0, 00:13:46.926 "data_size": 65536 00:13:46.926 } 00:13:46.926 ] 00:13:46.926 }' 00:13:46.926 17:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:46.926 17:25:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:47.500 17:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:13:47.500 17:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:47.500 17:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:47.500 17:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:47.500 17:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:47.500 17:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:47.500 17:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 
-b Existed_Raid 00:13:47.500 17:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:47.500 [2024-07-15 17:25:58.761318] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:47.500 17:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:47.500 "name": "Existed_Raid", 00:13:47.500 "aliases": [ 00:13:47.500 "0e48fd65-e9ad-4c5c-90c9-896ecce8efde" 00:13:47.500 ], 00:13:47.500 "product_name": "Raid Volume", 00:13:47.500 "block_size": 512, 00:13:47.500 "num_blocks": 196608, 00:13:47.500 "uuid": "0e48fd65-e9ad-4c5c-90c9-896ecce8efde", 00:13:47.500 "assigned_rate_limits": { 00:13:47.500 "rw_ios_per_sec": 0, 00:13:47.500 "rw_mbytes_per_sec": 0, 00:13:47.500 "r_mbytes_per_sec": 0, 00:13:47.500 "w_mbytes_per_sec": 0 00:13:47.500 }, 00:13:47.500 "claimed": false, 00:13:47.500 "zoned": false, 00:13:47.500 "supported_io_types": { 00:13:47.500 "read": true, 00:13:47.500 "write": true, 00:13:47.500 "unmap": true, 00:13:47.500 "flush": true, 00:13:47.500 "reset": true, 00:13:47.500 "nvme_admin": false, 00:13:47.500 "nvme_io": false, 00:13:47.500 "nvme_io_md": false, 00:13:47.500 "write_zeroes": true, 00:13:47.500 "zcopy": false, 00:13:47.500 "get_zone_info": false, 00:13:47.500 "zone_management": false, 00:13:47.500 "zone_append": false, 00:13:47.500 "compare": false, 00:13:47.500 "compare_and_write": false, 00:13:47.500 "abort": false, 00:13:47.500 "seek_hole": false, 00:13:47.500 "seek_data": false, 00:13:47.500 "copy": false, 00:13:47.500 "nvme_iov_md": false 00:13:47.500 }, 00:13:47.500 "memory_domains": [ 00:13:47.500 { 00:13:47.500 "dma_device_id": "system", 00:13:47.500 "dma_device_type": 1 00:13:47.500 }, 00:13:47.500 { 00:13:47.500 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:47.500 "dma_device_type": 2 00:13:47.500 }, 00:13:47.500 { 00:13:47.500 "dma_device_id": "system", 00:13:47.500 "dma_device_type": 1 00:13:47.500 }, 00:13:47.500 { 00:13:47.500 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:47.500 "dma_device_type": 2 00:13:47.500 }, 00:13:47.500 { 00:13:47.500 "dma_device_id": "system", 00:13:47.500 "dma_device_type": 1 00:13:47.500 }, 00:13:47.500 { 00:13:47.500 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:47.500 "dma_device_type": 2 00:13:47.500 } 00:13:47.500 ], 00:13:47.500 "driver_specific": { 00:13:47.500 "raid": { 00:13:47.500 "uuid": "0e48fd65-e9ad-4c5c-90c9-896ecce8efde", 00:13:47.500 "strip_size_kb": 64, 00:13:47.500 "state": "online", 00:13:47.500 "raid_level": "concat", 00:13:47.500 "superblock": false, 00:13:47.500 "num_base_bdevs": 3, 00:13:47.500 "num_base_bdevs_discovered": 3, 00:13:47.500 "num_base_bdevs_operational": 3, 00:13:47.500 "base_bdevs_list": [ 00:13:47.500 { 00:13:47.500 "name": "NewBaseBdev", 00:13:47.500 "uuid": "f1d5c000-f1ff-4576-ae34-a4c8e5de1f21", 00:13:47.500 "is_configured": true, 00:13:47.500 "data_offset": 0, 00:13:47.500 "data_size": 65536 00:13:47.500 }, 00:13:47.500 { 00:13:47.500 "name": "BaseBdev2", 00:13:47.500 "uuid": "5b43da74-c1ba-45b3-b98e-e02a4a82cf7c", 00:13:47.500 "is_configured": true, 00:13:47.500 "data_offset": 0, 00:13:47.500 "data_size": 65536 00:13:47.500 }, 00:13:47.500 { 00:13:47.500 "name": "BaseBdev3", 00:13:47.500 "uuid": "ca3422f3-ed1f-4beb-bdc4-bc7bf222dfe2", 00:13:47.500 "is_configured": true, 00:13:47.500 "data_offset": 0, 00:13:47.500 "data_size": 65536 00:13:47.500 } 00:13:47.500 ] 00:13:47.500 } 00:13:47.500 } 00:13:47.500 }' 00:13:47.500 17:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:47.760 17:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:13:47.760 BaseBdev2 00:13:47.760 BaseBdev3' 00:13:47.760 17:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:47.760 17:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:47.760 17:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:47.760 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:47.760 "name": "NewBaseBdev", 00:13:47.760 "aliases": [ 00:13:47.760 "f1d5c000-f1ff-4576-ae34-a4c8e5de1f21" 00:13:47.760 ], 00:13:47.760 "product_name": "Malloc disk", 00:13:47.760 "block_size": 512, 00:13:47.760 "num_blocks": 65536, 00:13:47.760 "uuid": "f1d5c000-f1ff-4576-ae34-a4c8e5de1f21", 00:13:47.760 "assigned_rate_limits": { 00:13:47.760 "rw_ios_per_sec": 0, 00:13:47.760 "rw_mbytes_per_sec": 0, 00:13:47.760 "r_mbytes_per_sec": 0, 00:13:47.760 "w_mbytes_per_sec": 0 00:13:47.760 }, 00:13:47.760 "claimed": true, 00:13:47.760 "claim_type": "exclusive_write", 00:13:47.760 "zoned": false, 00:13:47.760 "supported_io_types": { 00:13:47.760 "read": true, 00:13:47.760 "write": true, 00:13:47.760 "unmap": true, 00:13:47.760 "flush": true, 00:13:47.760 "reset": true, 00:13:47.760 "nvme_admin": false, 00:13:47.760 "nvme_io": false, 00:13:47.760 "nvme_io_md": false, 00:13:47.760 "write_zeroes": true, 00:13:47.760 "zcopy": true, 00:13:47.760 "get_zone_info": false, 00:13:47.760 "zone_management": false, 00:13:47.760 "zone_append": false, 00:13:47.760 "compare": false, 00:13:47.760 "compare_and_write": false, 00:13:47.760 "abort": true, 00:13:47.760 "seek_hole": false, 00:13:47.760 "seek_data": false, 00:13:47.760 "copy": true, 00:13:47.760 "nvme_iov_md": false 00:13:47.760 }, 00:13:47.760 "memory_domains": [ 00:13:47.760 { 00:13:47.760 "dma_device_id": "system", 00:13:47.760 "dma_device_type": 1 00:13:47.760 }, 00:13:47.760 { 00:13:47.760 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:47.760 "dma_device_type": 2 00:13:47.760 } 00:13:47.760 ], 00:13:47.760 "driver_specific": {} 00:13:47.760 }' 00:13:47.760 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:48.019 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:48.019 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:48.019 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:48.019 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:48.019 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:48.019 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:48.019 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:48.019 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:48.019 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:48.279 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:48.279 17:25:59 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:48.279 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:48.279 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:48.279 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:48.540 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:48.540 "name": "BaseBdev2", 00:13:48.540 "aliases": [ 00:13:48.540 "5b43da74-c1ba-45b3-b98e-e02a4a82cf7c" 00:13:48.540 ], 00:13:48.540 "product_name": "Malloc disk", 00:13:48.540 "block_size": 512, 00:13:48.540 "num_blocks": 65536, 00:13:48.540 "uuid": "5b43da74-c1ba-45b3-b98e-e02a4a82cf7c", 00:13:48.540 "assigned_rate_limits": { 00:13:48.540 "rw_ios_per_sec": 0, 00:13:48.540 "rw_mbytes_per_sec": 0, 00:13:48.540 "r_mbytes_per_sec": 0, 00:13:48.540 "w_mbytes_per_sec": 0 00:13:48.540 }, 00:13:48.540 "claimed": true, 00:13:48.540 "claim_type": "exclusive_write", 00:13:48.540 "zoned": false, 00:13:48.540 "supported_io_types": { 00:13:48.540 "read": true, 00:13:48.540 "write": true, 00:13:48.540 "unmap": true, 00:13:48.540 "flush": true, 00:13:48.540 "reset": true, 00:13:48.540 "nvme_admin": false, 00:13:48.540 "nvme_io": false, 00:13:48.540 "nvme_io_md": false, 00:13:48.540 "write_zeroes": true, 00:13:48.540 "zcopy": true, 00:13:48.540 "get_zone_info": false, 00:13:48.540 "zone_management": false, 00:13:48.540 "zone_append": false, 00:13:48.540 "compare": false, 00:13:48.540 "compare_and_write": false, 00:13:48.540 "abort": true, 00:13:48.540 "seek_hole": false, 00:13:48.540 "seek_data": false, 00:13:48.540 "copy": true, 00:13:48.540 "nvme_iov_md": false 00:13:48.540 }, 00:13:48.540 "memory_domains": [ 00:13:48.540 { 00:13:48.540 "dma_device_id": "system", 00:13:48.540 "dma_device_type": 1 00:13:48.540 }, 00:13:48.540 { 00:13:48.540 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:48.540 "dma_device_type": 2 00:13:48.540 } 00:13:48.540 ], 00:13:48.540 "driver_specific": {} 00:13:48.540 }' 00:13:48.540 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:48.540 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:48.540 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:48.540 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:48.540 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:48.540 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:48.540 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:48.540 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:48.800 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:48.800 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:48.800 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:48.800 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:48.800 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name 
in $base_bdev_names 00:13:48.800 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:48.800 17:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:49.059 17:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:49.059 "name": "BaseBdev3", 00:13:49.059 "aliases": [ 00:13:49.059 "ca3422f3-ed1f-4beb-bdc4-bc7bf222dfe2" 00:13:49.059 ], 00:13:49.059 "product_name": "Malloc disk", 00:13:49.059 "block_size": 512, 00:13:49.059 "num_blocks": 65536, 00:13:49.059 "uuid": "ca3422f3-ed1f-4beb-bdc4-bc7bf222dfe2", 00:13:49.059 "assigned_rate_limits": { 00:13:49.059 "rw_ios_per_sec": 0, 00:13:49.059 "rw_mbytes_per_sec": 0, 00:13:49.059 "r_mbytes_per_sec": 0, 00:13:49.059 "w_mbytes_per_sec": 0 00:13:49.059 }, 00:13:49.059 "claimed": true, 00:13:49.059 "claim_type": "exclusive_write", 00:13:49.059 "zoned": false, 00:13:49.059 "supported_io_types": { 00:13:49.059 "read": true, 00:13:49.059 "write": true, 00:13:49.059 "unmap": true, 00:13:49.059 "flush": true, 00:13:49.059 "reset": true, 00:13:49.059 "nvme_admin": false, 00:13:49.060 "nvme_io": false, 00:13:49.060 "nvme_io_md": false, 00:13:49.060 "write_zeroes": true, 00:13:49.060 "zcopy": true, 00:13:49.060 "get_zone_info": false, 00:13:49.060 "zone_management": false, 00:13:49.060 "zone_append": false, 00:13:49.060 "compare": false, 00:13:49.060 "compare_and_write": false, 00:13:49.060 "abort": true, 00:13:49.060 "seek_hole": false, 00:13:49.060 "seek_data": false, 00:13:49.060 "copy": true, 00:13:49.060 "nvme_iov_md": false 00:13:49.060 }, 00:13:49.060 "memory_domains": [ 00:13:49.060 { 00:13:49.060 "dma_device_id": "system", 00:13:49.060 "dma_device_type": 1 00:13:49.060 }, 00:13:49.060 { 00:13:49.060 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:49.060 "dma_device_type": 2 00:13:49.060 } 00:13:49.060 ], 00:13:49.060 "driver_specific": {} 00:13:49.060 }' 00:13:49.060 17:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:49.060 17:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:49.060 17:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:49.060 17:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:49.060 17:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:49.060 17:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:49.060 17:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:49.320 17:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:49.320 17:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:49.320 17:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:49.320 17:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:49.320 17:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:49.320 17:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:49.580 [2024-07-15 17:26:00.706013] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:49.580 [2024-07-15 17:26:00.706029] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:49.580 [2024-07-15 17:26:00.706063] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:49.580 [2024-07-15 17:26:00.706098] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:49.580 [2024-07-15 17:26:00.706105] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25b1250 name Existed_Raid, state offline 00:13:49.580 17:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2777002 00:13:49.580 17:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2777002 ']' 00:13:49.580 17:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2777002 00:13:49.580 17:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:13:49.580 17:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:49.580 17:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2777002 00:13:49.580 17:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:49.580 17:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:49.580 17:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2777002' 00:13:49.580 killing process with pid 2777002 00:13:49.580 17:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2777002 00:13:49.580 [2024-07-15 17:26:00.771609] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:49.580 17:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2777002 00:13:49.580 [2024-07-15 17:26:00.786257] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:13:49.841 00:13:49.841 real 0m24.094s 00:13:49.841 user 0m45.191s 00:13:49.841 sys 0m3.551s 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:49.841 ************************************ 00:13:49.841 END TEST raid_state_function_test 00:13:49.841 ************************************ 00:13:49.841 17:26:00 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:49.841 17:26:00 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:13:49.841 17:26:00 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:49.841 17:26:00 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:49.841 17:26:00 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:49.841 ************************************ 00:13:49.841 START TEST raid_state_function_test_sb 00:13:49.841 ************************************ 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 true 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # 
local raid_level=concat 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2781734 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2781734' 00:13:49.841 Process raid pid: 2781734 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2781734 /var/tmp/spdk-raid.sock 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2781734 ']' 
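For reference, every verify_raid_bdev_state step traced in this log follows the same pattern: dump all raid bdevs over the test's RPC socket, jq-select the raid under test, and compare individual fields such as state, raid_level, strip_size_kb and the base bdev counts. The lines below are a minimal stand-alone sketch of that pattern, not the suite's own helper; the rpc.py path, socket path, RPC method and jq filter are taken verbatim from the trace, while the shell variable names and the final echo are illustrative only, and a bdev_svc app is assumed to already be listening on /var/tmp/spdk-raid.sock.

#!/usr/bin/env bash
# Sketch only: query the raid bdev named Existed_Raid over the raid test socket
# and read back the fields this trace compares.
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

# Same call and filter as bdev_raid.sh@126 in the trace.
info=$($rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')

state=$(echo "$info" | jq -r .state)                 # e.g. "configuring" or "online"
level=$(echo "$info" | jq -r .raid_level)            # "concat" in this run
strip=$(echo "$info" | jq -r .strip_size_kb)         # 64 in this run
discovered=$(echo "$info" | jq -r .num_base_bdevs_discovered)
operational=$(echo "$info" | jq -r .num_base_bdevs_operational)

echo "state=$state level=$level strip=${strip}KiB base_bdevs=$discovered/$operational"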
00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:49.841 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:49.841 17:26:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:49.841 [2024-07-15 17:26:01.042690] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:13:49.841 [2024-07-15 17:26:01.042753] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:49.841 [2024-07-15 17:26:01.132456] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:50.100 [2024-07-15 17:26:01.200034] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:50.100 [2024-07-15 17:26:01.241432] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:50.100 [2024-07-15 17:26:01.241454] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:50.668 17:26:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:50.668 17:26:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:13:50.668 17:26:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:50.927 [2024-07-15 17:26:02.056266] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:50.927 [2024-07-15 17:26:02.056296] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:50.927 [2024-07-15 17:26:02.056302] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:50.927 [2024-07-15 17:26:02.056308] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:50.927 [2024-07-15 17:26:02.056313] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:50.927 [2024-07-15 17:26:02.056318] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:50.927 17:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:50.927 17:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:50.927 17:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:50.927 17:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:50.927 17:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:50.927 17:26:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:50.927 17:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:50.927 17:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:50.927 17:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:50.927 17:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:50.927 17:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:50.927 17:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:51.188 17:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:51.188 "name": "Existed_Raid", 00:13:51.188 "uuid": "77def313-e034-4dff-944f-557a4c5aa082", 00:13:51.188 "strip_size_kb": 64, 00:13:51.188 "state": "configuring", 00:13:51.188 "raid_level": "concat", 00:13:51.188 "superblock": true, 00:13:51.188 "num_base_bdevs": 3, 00:13:51.188 "num_base_bdevs_discovered": 0, 00:13:51.188 "num_base_bdevs_operational": 3, 00:13:51.188 "base_bdevs_list": [ 00:13:51.188 { 00:13:51.188 "name": "BaseBdev1", 00:13:51.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:51.188 "is_configured": false, 00:13:51.188 "data_offset": 0, 00:13:51.188 "data_size": 0 00:13:51.188 }, 00:13:51.188 { 00:13:51.188 "name": "BaseBdev2", 00:13:51.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:51.188 "is_configured": false, 00:13:51.188 "data_offset": 0, 00:13:51.188 "data_size": 0 00:13:51.188 }, 00:13:51.188 { 00:13:51.188 "name": "BaseBdev3", 00:13:51.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:51.188 "is_configured": false, 00:13:51.188 "data_offset": 0, 00:13:51.188 "data_size": 0 00:13:51.188 } 00:13:51.188 ] 00:13:51.188 }' 00:13:51.188 17:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:51.188 17:26:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:51.757 17:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:51.757 [2024-07-15 17:26:02.978475] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:51.757 [2024-07-15 17:26:02.978491] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16d06d0 name Existed_Raid, state configuring 00:13:51.757 17:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:52.017 [2024-07-15 17:26:03.170985] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:52.017 [2024-07-15 17:26:03.171003] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:52.017 [2024-07-15 17:26:03.171008] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:52.017 [2024-07-15 17:26:03.171014] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 
doesn't exist now 00:13:52.017 [2024-07-15 17:26:03.171018] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:52.017 [2024-07-15 17:26:03.171023] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:52.017 17:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:52.277 [2024-07-15 17:26:03.370137] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:52.277 BaseBdev1 00:13:52.277 17:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:52.277 17:26:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:52.277 17:26:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:52.277 17:26:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:52.277 17:26:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:52.277 17:26:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:52.277 17:26:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:52.537 17:26:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:52.537 [ 00:13:52.537 { 00:13:52.537 "name": "BaseBdev1", 00:13:52.537 "aliases": [ 00:13:52.537 "4b0961f7-2220-4cdf-bb1c-018683dc6d9d" 00:13:52.537 ], 00:13:52.537 "product_name": "Malloc disk", 00:13:52.537 "block_size": 512, 00:13:52.537 "num_blocks": 65536, 00:13:52.537 "uuid": "4b0961f7-2220-4cdf-bb1c-018683dc6d9d", 00:13:52.537 "assigned_rate_limits": { 00:13:52.537 "rw_ios_per_sec": 0, 00:13:52.537 "rw_mbytes_per_sec": 0, 00:13:52.537 "r_mbytes_per_sec": 0, 00:13:52.537 "w_mbytes_per_sec": 0 00:13:52.537 }, 00:13:52.537 "claimed": true, 00:13:52.537 "claim_type": "exclusive_write", 00:13:52.537 "zoned": false, 00:13:52.537 "supported_io_types": { 00:13:52.537 "read": true, 00:13:52.537 "write": true, 00:13:52.537 "unmap": true, 00:13:52.537 "flush": true, 00:13:52.537 "reset": true, 00:13:52.537 "nvme_admin": false, 00:13:52.537 "nvme_io": false, 00:13:52.537 "nvme_io_md": false, 00:13:52.537 "write_zeroes": true, 00:13:52.537 "zcopy": true, 00:13:52.537 "get_zone_info": false, 00:13:52.537 "zone_management": false, 00:13:52.537 "zone_append": false, 00:13:52.537 "compare": false, 00:13:52.537 "compare_and_write": false, 00:13:52.537 "abort": true, 00:13:52.537 "seek_hole": false, 00:13:52.537 "seek_data": false, 00:13:52.537 "copy": true, 00:13:52.537 "nvme_iov_md": false 00:13:52.537 }, 00:13:52.537 "memory_domains": [ 00:13:52.537 { 00:13:52.537 "dma_device_id": "system", 00:13:52.537 "dma_device_type": 1 00:13:52.537 }, 00:13:52.537 { 00:13:52.537 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:52.537 "dma_device_type": 2 00:13:52.537 } 00:13:52.537 ], 00:13:52.537 "driver_specific": {} 00:13:52.537 } 00:13:52.537 ] 00:13:52.537 17:26:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:52.537 
17:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:52.537 17:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:52.537 17:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:52.537 17:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:52.537 17:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:52.537 17:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:52.537 17:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:52.537 17:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:52.537 17:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:52.537 17:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:52.537 17:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:52.537 17:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:52.797 17:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:52.797 "name": "Existed_Raid", 00:13:52.797 "uuid": "69407f85-5529-4e06-8970-07f0ad01d129", 00:13:52.797 "strip_size_kb": 64, 00:13:52.797 "state": "configuring", 00:13:52.797 "raid_level": "concat", 00:13:52.797 "superblock": true, 00:13:52.797 "num_base_bdevs": 3, 00:13:52.797 "num_base_bdevs_discovered": 1, 00:13:52.797 "num_base_bdevs_operational": 3, 00:13:52.797 "base_bdevs_list": [ 00:13:52.797 { 00:13:52.797 "name": "BaseBdev1", 00:13:52.797 "uuid": "4b0961f7-2220-4cdf-bb1c-018683dc6d9d", 00:13:52.797 "is_configured": true, 00:13:52.797 "data_offset": 2048, 00:13:52.797 "data_size": 63488 00:13:52.797 }, 00:13:52.797 { 00:13:52.797 "name": "BaseBdev2", 00:13:52.797 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:52.797 "is_configured": false, 00:13:52.797 "data_offset": 0, 00:13:52.797 "data_size": 0 00:13:52.797 }, 00:13:52.797 { 00:13:52.797 "name": "BaseBdev3", 00:13:52.797 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:52.797 "is_configured": false, 00:13:52.797 "data_offset": 0, 00:13:52.797 "data_size": 0 00:13:52.797 } 00:13:52.797 ] 00:13:52.797 }' 00:13:52.797 17:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:52.797 17:26:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:53.367 17:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:53.628 [2024-07-15 17:26:04.705505] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:53.628 [2024-07-15 17:26:04.705533] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16cffa0 name Existed_Raid, state configuring 00:13:53.628 17:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:53.628 [2024-07-15 17:26:04.902036] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:53.628 [2024-07-15 17:26:04.903172] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:53.628 [2024-07-15 17:26:04.903197] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:53.628 [2024-07-15 17:26:04.903203] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:53.628 [2024-07-15 17:26:04.903208] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:53.628 17:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:53.628 17:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:53.628 17:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:53.628 17:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:53.628 17:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:53.628 17:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:53.628 17:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:53.628 17:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:53.628 17:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:53.628 17:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:53.628 17:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:53.628 17:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:53.889 17:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:53.889 17:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:53.889 17:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:53.889 "name": "Existed_Raid", 00:13:53.889 "uuid": "ee417914-70ce-48f2-877c-92ac2e312fd3", 00:13:53.889 "strip_size_kb": 64, 00:13:53.889 "state": "configuring", 00:13:53.889 "raid_level": "concat", 00:13:53.889 "superblock": true, 00:13:53.889 "num_base_bdevs": 3, 00:13:53.889 "num_base_bdevs_discovered": 1, 00:13:53.889 "num_base_bdevs_operational": 3, 00:13:53.889 "base_bdevs_list": [ 00:13:53.889 { 00:13:53.889 "name": "BaseBdev1", 00:13:53.889 "uuid": "4b0961f7-2220-4cdf-bb1c-018683dc6d9d", 00:13:53.889 "is_configured": true, 00:13:53.889 "data_offset": 2048, 00:13:53.889 "data_size": 63488 00:13:53.889 }, 00:13:53.889 { 00:13:53.889 "name": "BaseBdev2", 00:13:53.889 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:53.889 "is_configured": false, 00:13:53.889 "data_offset": 0, 00:13:53.889 "data_size": 0 00:13:53.889 }, 00:13:53.889 { 
00:13:53.889 "name": "BaseBdev3", 00:13:53.889 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:53.889 "is_configured": false, 00:13:53.889 "data_offset": 0, 00:13:53.889 "data_size": 0 00:13:53.889 } 00:13:53.889 ] 00:13:53.889 }' 00:13:53.889 17:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:53.889 17:26:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:54.458 17:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:54.718 [2024-07-15 17:26:05.833008] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:54.718 BaseBdev2 00:13:54.718 17:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:54.718 17:26:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:54.718 17:26:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:54.718 17:26:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:54.718 17:26:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:54.718 17:26:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:54.718 17:26:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:54.978 17:26:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:54.978 [ 00:13:54.978 { 00:13:54.978 "name": "BaseBdev2", 00:13:54.978 "aliases": [ 00:13:54.978 "97a69b8d-6303-4657-8f17-db5a5ad37e59" 00:13:54.978 ], 00:13:54.978 "product_name": "Malloc disk", 00:13:54.978 "block_size": 512, 00:13:54.978 "num_blocks": 65536, 00:13:54.978 "uuid": "97a69b8d-6303-4657-8f17-db5a5ad37e59", 00:13:54.978 "assigned_rate_limits": { 00:13:54.978 "rw_ios_per_sec": 0, 00:13:54.978 "rw_mbytes_per_sec": 0, 00:13:54.978 "r_mbytes_per_sec": 0, 00:13:54.978 "w_mbytes_per_sec": 0 00:13:54.978 }, 00:13:54.978 "claimed": true, 00:13:54.978 "claim_type": "exclusive_write", 00:13:54.978 "zoned": false, 00:13:54.978 "supported_io_types": { 00:13:54.978 "read": true, 00:13:54.978 "write": true, 00:13:54.978 "unmap": true, 00:13:54.978 "flush": true, 00:13:54.978 "reset": true, 00:13:54.978 "nvme_admin": false, 00:13:54.978 "nvme_io": false, 00:13:54.978 "nvme_io_md": false, 00:13:54.978 "write_zeroes": true, 00:13:54.978 "zcopy": true, 00:13:54.978 "get_zone_info": false, 00:13:54.978 "zone_management": false, 00:13:54.978 "zone_append": false, 00:13:54.978 "compare": false, 00:13:54.978 "compare_and_write": false, 00:13:54.978 "abort": true, 00:13:54.978 "seek_hole": false, 00:13:54.978 "seek_data": false, 00:13:54.978 "copy": true, 00:13:54.978 "nvme_iov_md": false 00:13:54.978 }, 00:13:54.978 "memory_domains": [ 00:13:54.978 { 00:13:54.978 "dma_device_id": "system", 00:13:54.978 "dma_device_type": 1 00:13:54.978 }, 00:13:54.978 { 00:13:54.978 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:54.978 "dma_device_type": 2 00:13:54.978 } 00:13:54.978 ], 00:13:54.978 
"driver_specific": {} 00:13:54.978 } 00:13:54.978 ] 00:13:54.978 17:26:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:54.978 17:26:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:54.978 17:26:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:54.978 17:26:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:54.978 17:26:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:54.978 17:26:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:54.978 17:26:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:54.978 17:26:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:54.978 17:26:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:54.978 17:26:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:54.978 17:26:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:54.978 17:26:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:54.978 17:26:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:54.978 17:26:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:54.978 17:26:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:55.238 17:26:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:55.238 "name": "Existed_Raid", 00:13:55.238 "uuid": "ee417914-70ce-48f2-877c-92ac2e312fd3", 00:13:55.238 "strip_size_kb": 64, 00:13:55.238 "state": "configuring", 00:13:55.238 "raid_level": "concat", 00:13:55.238 "superblock": true, 00:13:55.238 "num_base_bdevs": 3, 00:13:55.238 "num_base_bdevs_discovered": 2, 00:13:55.238 "num_base_bdevs_operational": 3, 00:13:55.238 "base_bdevs_list": [ 00:13:55.238 { 00:13:55.238 "name": "BaseBdev1", 00:13:55.238 "uuid": "4b0961f7-2220-4cdf-bb1c-018683dc6d9d", 00:13:55.238 "is_configured": true, 00:13:55.238 "data_offset": 2048, 00:13:55.238 "data_size": 63488 00:13:55.238 }, 00:13:55.238 { 00:13:55.238 "name": "BaseBdev2", 00:13:55.238 "uuid": "97a69b8d-6303-4657-8f17-db5a5ad37e59", 00:13:55.238 "is_configured": true, 00:13:55.238 "data_offset": 2048, 00:13:55.238 "data_size": 63488 00:13:55.238 }, 00:13:55.238 { 00:13:55.238 "name": "BaseBdev3", 00:13:55.238 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:55.238 "is_configured": false, 00:13:55.238 "data_offset": 0, 00:13:55.238 "data_size": 0 00:13:55.238 } 00:13:55.238 ] 00:13:55.238 }' 00:13:55.238 17:26:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:55.238 17:26:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:55.807 17:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 
00:13:56.067 [2024-07-15 17:26:07.189240] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:56.067 [2024-07-15 17:26:07.189356] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16d0e90 00:13:56.067 [2024-07-15 17:26:07.189365] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:56.067 [2024-07-15 17:26:07.189501] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16d0b60 00:13:56.067 [2024-07-15 17:26:07.189592] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16d0e90 00:13:56.067 [2024-07-15 17:26:07.189598] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x16d0e90 00:13:56.067 [2024-07-15 17:26:07.189664] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:56.067 BaseBdev3 00:13:56.067 17:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:56.067 17:26:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:56.067 17:26:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:56.067 17:26:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:56.067 17:26:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:56.067 17:26:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:56.067 17:26:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:56.326 17:26:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:56.326 [ 00:13:56.326 { 00:13:56.326 "name": "BaseBdev3", 00:13:56.326 "aliases": [ 00:13:56.326 "39845a1c-6c5a-452e-91fb-9cd0d7275720" 00:13:56.326 ], 00:13:56.326 "product_name": "Malloc disk", 00:13:56.326 "block_size": 512, 00:13:56.326 "num_blocks": 65536, 00:13:56.326 "uuid": "39845a1c-6c5a-452e-91fb-9cd0d7275720", 00:13:56.326 "assigned_rate_limits": { 00:13:56.326 "rw_ios_per_sec": 0, 00:13:56.326 "rw_mbytes_per_sec": 0, 00:13:56.326 "r_mbytes_per_sec": 0, 00:13:56.326 "w_mbytes_per_sec": 0 00:13:56.326 }, 00:13:56.326 "claimed": true, 00:13:56.326 "claim_type": "exclusive_write", 00:13:56.326 "zoned": false, 00:13:56.326 "supported_io_types": { 00:13:56.326 "read": true, 00:13:56.326 "write": true, 00:13:56.326 "unmap": true, 00:13:56.326 "flush": true, 00:13:56.326 "reset": true, 00:13:56.326 "nvme_admin": false, 00:13:56.326 "nvme_io": false, 00:13:56.326 "nvme_io_md": false, 00:13:56.326 "write_zeroes": true, 00:13:56.326 "zcopy": true, 00:13:56.326 "get_zone_info": false, 00:13:56.326 "zone_management": false, 00:13:56.326 "zone_append": false, 00:13:56.326 "compare": false, 00:13:56.326 "compare_and_write": false, 00:13:56.326 "abort": true, 00:13:56.326 "seek_hole": false, 00:13:56.326 "seek_data": false, 00:13:56.326 "copy": true, 00:13:56.326 "nvme_iov_md": false 00:13:56.326 }, 00:13:56.326 "memory_domains": [ 00:13:56.326 { 00:13:56.326 "dma_device_id": "system", 00:13:56.326 "dma_device_type": 1 00:13:56.326 }, 00:13:56.326 { 00:13:56.326 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:13:56.326 "dma_device_type": 2 00:13:56.326 } 00:13:56.326 ], 00:13:56.326 "driver_specific": {} 00:13:56.326 } 00:13:56.326 ] 00:13:56.326 17:26:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:56.326 17:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:56.326 17:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:56.326 17:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:13:56.326 17:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:56.326 17:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:56.326 17:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:56.326 17:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:56.326 17:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:56.326 17:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:56.326 17:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:56.326 17:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:56.327 17:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:56.327 17:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:56.327 17:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:56.586 17:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:56.586 "name": "Existed_Raid", 00:13:56.586 "uuid": "ee417914-70ce-48f2-877c-92ac2e312fd3", 00:13:56.586 "strip_size_kb": 64, 00:13:56.586 "state": "online", 00:13:56.586 "raid_level": "concat", 00:13:56.586 "superblock": true, 00:13:56.586 "num_base_bdevs": 3, 00:13:56.586 "num_base_bdevs_discovered": 3, 00:13:56.586 "num_base_bdevs_operational": 3, 00:13:56.586 "base_bdevs_list": [ 00:13:56.586 { 00:13:56.586 "name": "BaseBdev1", 00:13:56.586 "uuid": "4b0961f7-2220-4cdf-bb1c-018683dc6d9d", 00:13:56.586 "is_configured": true, 00:13:56.586 "data_offset": 2048, 00:13:56.586 "data_size": 63488 00:13:56.586 }, 00:13:56.586 { 00:13:56.586 "name": "BaseBdev2", 00:13:56.586 "uuid": "97a69b8d-6303-4657-8f17-db5a5ad37e59", 00:13:56.586 "is_configured": true, 00:13:56.586 "data_offset": 2048, 00:13:56.586 "data_size": 63488 00:13:56.586 }, 00:13:56.586 { 00:13:56.586 "name": "BaseBdev3", 00:13:56.586 "uuid": "39845a1c-6c5a-452e-91fb-9cd0d7275720", 00:13:56.586 "is_configured": true, 00:13:56.586 "data_offset": 2048, 00:13:56.586 "data_size": 63488 00:13:56.586 } 00:13:56.586 ] 00:13:56.586 }' 00:13:56.586 17:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:56.586 17:26:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:57.156 17:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 
00:13:57.156 17:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:57.156 17:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:57.156 17:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:57.156 17:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:57.156 17:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:57.156 17:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:57.156 17:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:57.416 [2024-07-15 17:26:08.500824] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:57.416 17:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:57.416 "name": "Existed_Raid", 00:13:57.416 "aliases": [ 00:13:57.416 "ee417914-70ce-48f2-877c-92ac2e312fd3" 00:13:57.416 ], 00:13:57.416 "product_name": "Raid Volume", 00:13:57.416 "block_size": 512, 00:13:57.416 "num_blocks": 190464, 00:13:57.416 "uuid": "ee417914-70ce-48f2-877c-92ac2e312fd3", 00:13:57.416 "assigned_rate_limits": { 00:13:57.416 "rw_ios_per_sec": 0, 00:13:57.416 "rw_mbytes_per_sec": 0, 00:13:57.416 "r_mbytes_per_sec": 0, 00:13:57.416 "w_mbytes_per_sec": 0 00:13:57.416 }, 00:13:57.416 "claimed": false, 00:13:57.416 "zoned": false, 00:13:57.416 "supported_io_types": { 00:13:57.416 "read": true, 00:13:57.416 "write": true, 00:13:57.416 "unmap": true, 00:13:57.416 "flush": true, 00:13:57.416 "reset": true, 00:13:57.416 "nvme_admin": false, 00:13:57.416 "nvme_io": false, 00:13:57.416 "nvme_io_md": false, 00:13:57.416 "write_zeroes": true, 00:13:57.416 "zcopy": false, 00:13:57.416 "get_zone_info": false, 00:13:57.416 "zone_management": false, 00:13:57.416 "zone_append": false, 00:13:57.416 "compare": false, 00:13:57.416 "compare_and_write": false, 00:13:57.416 "abort": false, 00:13:57.416 "seek_hole": false, 00:13:57.416 "seek_data": false, 00:13:57.416 "copy": false, 00:13:57.416 "nvme_iov_md": false 00:13:57.416 }, 00:13:57.416 "memory_domains": [ 00:13:57.416 { 00:13:57.416 "dma_device_id": "system", 00:13:57.416 "dma_device_type": 1 00:13:57.416 }, 00:13:57.416 { 00:13:57.416 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:57.416 "dma_device_type": 2 00:13:57.416 }, 00:13:57.416 { 00:13:57.416 "dma_device_id": "system", 00:13:57.416 "dma_device_type": 1 00:13:57.416 }, 00:13:57.416 { 00:13:57.416 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:57.416 "dma_device_type": 2 00:13:57.416 }, 00:13:57.416 { 00:13:57.416 "dma_device_id": "system", 00:13:57.416 "dma_device_type": 1 00:13:57.416 }, 00:13:57.416 { 00:13:57.416 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:57.416 "dma_device_type": 2 00:13:57.416 } 00:13:57.416 ], 00:13:57.416 "driver_specific": { 00:13:57.416 "raid": { 00:13:57.416 "uuid": "ee417914-70ce-48f2-877c-92ac2e312fd3", 00:13:57.416 "strip_size_kb": 64, 00:13:57.416 "state": "online", 00:13:57.416 "raid_level": "concat", 00:13:57.416 "superblock": true, 00:13:57.416 "num_base_bdevs": 3, 00:13:57.416 "num_base_bdevs_discovered": 3, 00:13:57.416 "num_base_bdevs_operational": 3, 00:13:57.416 "base_bdevs_list": [ 00:13:57.416 { 00:13:57.416 "name": "BaseBdev1", 00:13:57.416 
"uuid": "4b0961f7-2220-4cdf-bb1c-018683dc6d9d", 00:13:57.416 "is_configured": true, 00:13:57.416 "data_offset": 2048, 00:13:57.416 "data_size": 63488 00:13:57.416 }, 00:13:57.416 { 00:13:57.416 "name": "BaseBdev2", 00:13:57.416 "uuid": "97a69b8d-6303-4657-8f17-db5a5ad37e59", 00:13:57.416 "is_configured": true, 00:13:57.416 "data_offset": 2048, 00:13:57.416 "data_size": 63488 00:13:57.416 }, 00:13:57.416 { 00:13:57.416 "name": "BaseBdev3", 00:13:57.416 "uuid": "39845a1c-6c5a-452e-91fb-9cd0d7275720", 00:13:57.416 "is_configured": true, 00:13:57.416 "data_offset": 2048, 00:13:57.416 "data_size": 63488 00:13:57.416 } 00:13:57.416 ] 00:13:57.416 } 00:13:57.416 } 00:13:57.416 }' 00:13:57.416 17:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:57.416 17:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:57.416 BaseBdev2 00:13:57.416 BaseBdev3' 00:13:57.416 17:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:57.416 17:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:57.416 17:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:57.676 17:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:57.676 "name": "BaseBdev1", 00:13:57.676 "aliases": [ 00:13:57.676 "4b0961f7-2220-4cdf-bb1c-018683dc6d9d" 00:13:57.676 ], 00:13:57.676 "product_name": "Malloc disk", 00:13:57.676 "block_size": 512, 00:13:57.676 "num_blocks": 65536, 00:13:57.676 "uuid": "4b0961f7-2220-4cdf-bb1c-018683dc6d9d", 00:13:57.676 "assigned_rate_limits": { 00:13:57.676 "rw_ios_per_sec": 0, 00:13:57.676 "rw_mbytes_per_sec": 0, 00:13:57.676 "r_mbytes_per_sec": 0, 00:13:57.676 "w_mbytes_per_sec": 0 00:13:57.676 }, 00:13:57.676 "claimed": true, 00:13:57.676 "claim_type": "exclusive_write", 00:13:57.676 "zoned": false, 00:13:57.676 "supported_io_types": { 00:13:57.676 "read": true, 00:13:57.676 "write": true, 00:13:57.676 "unmap": true, 00:13:57.676 "flush": true, 00:13:57.676 "reset": true, 00:13:57.676 "nvme_admin": false, 00:13:57.676 "nvme_io": false, 00:13:57.676 "nvme_io_md": false, 00:13:57.676 "write_zeroes": true, 00:13:57.676 "zcopy": true, 00:13:57.676 "get_zone_info": false, 00:13:57.676 "zone_management": false, 00:13:57.676 "zone_append": false, 00:13:57.676 "compare": false, 00:13:57.676 "compare_and_write": false, 00:13:57.676 "abort": true, 00:13:57.676 "seek_hole": false, 00:13:57.676 "seek_data": false, 00:13:57.676 "copy": true, 00:13:57.676 "nvme_iov_md": false 00:13:57.676 }, 00:13:57.676 "memory_domains": [ 00:13:57.676 { 00:13:57.676 "dma_device_id": "system", 00:13:57.676 "dma_device_type": 1 00:13:57.676 }, 00:13:57.676 { 00:13:57.676 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:57.676 "dma_device_type": 2 00:13:57.676 } 00:13:57.676 ], 00:13:57.676 "driver_specific": {} 00:13:57.676 }' 00:13:57.676 17:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:57.676 17:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:57.676 17:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:57.676 17:26:08 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:57.676 17:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:57.676 17:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:57.676 17:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:57.676 17:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:57.936 17:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:57.936 17:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:57.936 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:57.937 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:57.937 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:57.937 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:57.937 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:58.196 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:58.196 "name": "BaseBdev2", 00:13:58.196 "aliases": [ 00:13:58.196 "97a69b8d-6303-4657-8f17-db5a5ad37e59" 00:13:58.196 ], 00:13:58.196 "product_name": "Malloc disk", 00:13:58.196 "block_size": 512, 00:13:58.196 "num_blocks": 65536, 00:13:58.196 "uuid": "97a69b8d-6303-4657-8f17-db5a5ad37e59", 00:13:58.196 "assigned_rate_limits": { 00:13:58.196 "rw_ios_per_sec": 0, 00:13:58.196 "rw_mbytes_per_sec": 0, 00:13:58.196 "r_mbytes_per_sec": 0, 00:13:58.196 "w_mbytes_per_sec": 0 00:13:58.196 }, 00:13:58.196 "claimed": true, 00:13:58.196 "claim_type": "exclusive_write", 00:13:58.196 "zoned": false, 00:13:58.196 "supported_io_types": { 00:13:58.196 "read": true, 00:13:58.196 "write": true, 00:13:58.196 "unmap": true, 00:13:58.196 "flush": true, 00:13:58.196 "reset": true, 00:13:58.196 "nvme_admin": false, 00:13:58.196 "nvme_io": false, 00:13:58.196 "nvme_io_md": false, 00:13:58.196 "write_zeroes": true, 00:13:58.196 "zcopy": true, 00:13:58.196 "get_zone_info": false, 00:13:58.196 "zone_management": false, 00:13:58.196 "zone_append": false, 00:13:58.196 "compare": false, 00:13:58.196 "compare_and_write": false, 00:13:58.196 "abort": true, 00:13:58.196 "seek_hole": false, 00:13:58.196 "seek_data": false, 00:13:58.196 "copy": true, 00:13:58.196 "nvme_iov_md": false 00:13:58.196 }, 00:13:58.196 "memory_domains": [ 00:13:58.196 { 00:13:58.196 "dma_device_id": "system", 00:13:58.196 "dma_device_type": 1 00:13:58.196 }, 00:13:58.196 { 00:13:58.196 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:58.197 "dma_device_type": 2 00:13:58.197 } 00:13:58.197 ], 00:13:58.197 "driver_specific": {} 00:13:58.197 }' 00:13:58.197 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:58.197 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:58.197 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:58.197 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:58.197 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:13:58.197 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:58.197 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:58.456 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:58.456 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:58.456 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:58.456 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:58.456 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:58.456 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:58.456 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:58.456 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:58.716 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:58.716 "name": "BaseBdev3", 00:13:58.716 "aliases": [ 00:13:58.716 "39845a1c-6c5a-452e-91fb-9cd0d7275720" 00:13:58.716 ], 00:13:58.716 "product_name": "Malloc disk", 00:13:58.716 "block_size": 512, 00:13:58.716 "num_blocks": 65536, 00:13:58.716 "uuid": "39845a1c-6c5a-452e-91fb-9cd0d7275720", 00:13:58.716 "assigned_rate_limits": { 00:13:58.716 "rw_ios_per_sec": 0, 00:13:58.716 "rw_mbytes_per_sec": 0, 00:13:58.716 "r_mbytes_per_sec": 0, 00:13:58.716 "w_mbytes_per_sec": 0 00:13:58.716 }, 00:13:58.716 "claimed": true, 00:13:58.716 "claim_type": "exclusive_write", 00:13:58.716 "zoned": false, 00:13:58.716 "supported_io_types": { 00:13:58.716 "read": true, 00:13:58.716 "write": true, 00:13:58.716 "unmap": true, 00:13:58.716 "flush": true, 00:13:58.716 "reset": true, 00:13:58.716 "nvme_admin": false, 00:13:58.716 "nvme_io": false, 00:13:58.716 "nvme_io_md": false, 00:13:58.716 "write_zeroes": true, 00:13:58.716 "zcopy": true, 00:13:58.716 "get_zone_info": false, 00:13:58.716 "zone_management": false, 00:13:58.716 "zone_append": false, 00:13:58.716 "compare": false, 00:13:58.716 "compare_and_write": false, 00:13:58.716 "abort": true, 00:13:58.716 "seek_hole": false, 00:13:58.716 "seek_data": false, 00:13:58.716 "copy": true, 00:13:58.716 "nvme_iov_md": false 00:13:58.716 }, 00:13:58.716 "memory_domains": [ 00:13:58.716 { 00:13:58.716 "dma_device_id": "system", 00:13:58.716 "dma_device_type": 1 00:13:58.716 }, 00:13:58.716 { 00:13:58.716 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:58.716 "dma_device_type": 2 00:13:58.716 } 00:13:58.716 ], 00:13:58.716 "driver_specific": {} 00:13:58.716 }' 00:13:58.716 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:58.716 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:58.716 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:58.716 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:58.716 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:58.716 17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:58.716 
17:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:58.975 17:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:58.975 17:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:58.975 17:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:58.975 17:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:58.975 17:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:58.975 17:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:59.235 [2024-07-15 17:26:10.381393] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:59.235 [2024-07-15 17:26:10.381413] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:59.235 [2024-07-15 17:26:10.381445] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:59.235 17:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:59.235 17:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:13:59.235 17:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:59.235 17:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:13:59.235 17:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:59.235 17:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:13:59.235 17:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:59.235 17:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:59.235 17:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:59.235 17:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:59.235 17:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:59.235 17:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:59.235 17:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:59.235 17:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:59.235 17:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:59.235 17:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:59.235 17:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:59.495 17:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:59.495 "name": "Existed_Raid", 00:13:59.495 "uuid": "ee417914-70ce-48f2-877c-92ac2e312fd3", 00:13:59.495 "strip_size_kb": 64, 00:13:59.495 "state": "offline", 00:13:59.495 "raid_level": 
"concat", 00:13:59.495 "superblock": true, 00:13:59.495 "num_base_bdevs": 3, 00:13:59.495 "num_base_bdevs_discovered": 2, 00:13:59.495 "num_base_bdevs_operational": 2, 00:13:59.495 "base_bdevs_list": [ 00:13:59.495 { 00:13:59.495 "name": null, 00:13:59.495 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:59.495 "is_configured": false, 00:13:59.495 "data_offset": 2048, 00:13:59.495 "data_size": 63488 00:13:59.495 }, 00:13:59.495 { 00:13:59.495 "name": "BaseBdev2", 00:13:59.495 "uuid": "97a69b8d-6303-4657-8f17-db5a5ad37e59", 00:13:59.495 "is_configured": true, 00:13:59.495 "data_offset": 2048, 00:13:59.495 "data_size": 63488 00:13:59.495 }, 00:13:59.495 { 00:13:59.495 "name": "BaseBdev3", 00:13:59.495 "uuid": "39845a1c-6c5a-452e-91fb-9cd0d7275720", 00:13:59.495 "is_configured": true, 00:13:59.495 "data_offset": 2048, 00:13:59.495 "data_size": 63488 00:13:59.495 } 00:13:59.495 ] 00:13:59.495 }' 00:13:59.495 17:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:59.495 17:26:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:00.066 17:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:00.066 17:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:00.066 17:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:00.066 17:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:00.066 17:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:00.066 17:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:00.066 17:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:00.327 [2024-07-15 17:26:11.496200] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:00.327 17:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:00.327 17:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:00.327 17:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:00.327 17:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:00.587 17:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:00.587 17:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:00.587 17:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:00.587 [2024-07-15 17:26:11.866967] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:00.587 [2024-07-15 17:26:11.866996] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16d0e90 name Existed_Raid, state offline 00:14:00.587 17:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 
00:14:00.587 17:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:00.847 17:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:00.847 17:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:00.847 17:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:00.847 17:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:00.847 17:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:00.847 17:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:00.847 17:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:00.847 17:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:01.107 BaseBdev2 00:14:01.108 17:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:01.108 17:26:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:01.108 17:26:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:01.108 17:26:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:01.108 17:26:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:01.108 17:26:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:01.108 17:26:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:01.368 17:26:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:01.368 [ 00:14:01.368 { 00:14:01.368 "name": "BaseBdev2", 00:14:01.368 "aliases": [ 00:14:01.368 "13e7abde-f48c-4467-8e27-a0f89a69990d" 00:14:01.368 ], 00:14:01.368 "product_name": "Malloc disk", 00:14:01.368 "block_size": 512, 00:14:01.368 "num_blocks": 65536, 00:14:01.368 "uuid": "13e7abde-f48c-4467-8e27-a0f89a69990d", 00:14:01.368 "assigned_rate_limits": { 00:14:01.368 "rw_ios_per_sec": 0, 00:14:01.368 "rw_mbytes_per_sec": 0, 00:14:01.368 "r_mbytes_per_sec": 0, 00:14:01.368 "w_mbytes_per_sec": 0 00:14:01.368 }, 00:14:01.368 "claimed": false, 00:14:01.368 "zoned": false, 00:14:01.368 "supported_io_types": { 00:14:01.368 "read": true, 00:14:01.368 "write": true, 00:14:01.368 "unmap": true, 00:14:01.368 "flush": true, 00:14:01.368 "reset": true, 00:14:01.368 "nvme_admin": false, 00:14:01.368 "nvme_io": false, 00:14:01.368 "nvme_io_md": false, 00:14:01.368 "write_zeroes": true, 00:14:01.368 "zcopy": true, 00:14:01.368 "get_zone_info": false, 00:14:01.368 "zone_management": false, 00:14:01.368 "zone_append": false, 00:14:01.368 "compare": false, 00:14:01.368 "compare_and_write": false, 00:14:01.368 "abort": true, 00:14:01.368 "seek_hole": false, 00:14:01.368 "seek_data": false, 00:14:01.368 "copy": 
true, 00:14:01.368 "nvme_iov_md": false 00:14:01.368 }, 00:14:01.368 "memory_domains": [ 00:14:01.368 { 00:14:01.368 "dma_device_id": "system", 00:14:01.368 "dma_device_type": 1 00:14:01.368 }, 00:14:01.368 { 00:14:01.368 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:01.368 "dma_device_type": 2 00:14:01.368 } 00:14:01.368 ], 00:14:01.368 "driver_specific": {} 00:14:01.368 } 00:14:01.368 ] 00:14:01.368 17:26:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:01.368 17:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:01.368 17:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:01.368 17:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:01.629 BaseBdev3 00:14:01.629 17:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:01.629 17:26:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:01.629 17:26:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:01.629 17:26:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:01.629 17:26:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:01.629 17:26:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:01.629 17:26:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:01.889 17:26:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:02.185 [ 00:14:02.185 { 00:14:02.185 "name": "BaseBdev3", 00:14:02.185 "aliases": [ 00:14:02.185 "14cf507c-9c3e-4301-94be-68555cf84106" 00:14:02.185 ], 00:14:02.185 "product_name": "Malloc disk", 00:14:02.185 "block_size": 512, 00:14:02.185 "num_blocks": 65536, 00:14:02.185 "uuid": "14cf507c-9c3e-4301-94be-68555cf84106", 00:14:02.185 "assigned_rate_limits": { 00:14:02.185 "rw_ios_per_sec": 0, 00:14:02.185 "rw_mbytes_per_sec": 0, 00:14:02.185 "r_mbytes_per_sec": 0, 00:14:02.185 "w_mbytes_per_sec": 0 00:14:02.185 }, 00:14:02.185 "claimed": false, 00:14:02.185 "zoned": false, 00:14:02.185 "supported_io_types": { 00:14:02.185 "read": true, 00:14:02.185 "write": true, 00:14:02.185 "unmap": true, 00:14:02.185 "flush": true, 00:14:02.185 "reset": true, 00:14:02.185 "nvme_admin": false, 00:14:02.185 "nvme_io": false, 00:14:02.185 "nvme_io_md": false, 00:14:02.185 "write_zeroes": true, 00:14:02.185 "zcopy": true, 00:14:02.185 "get_zone_info": false, 00:14:02.185 "zone_management": false, 00:14:02.185 "zone_append": false, 00:14:02.185 "compare": false, 00:14:02.185 "compare_and_write": false, 00:14:02.185 "abort": true, 00:14:02.185 "seek_hole": false, 00:14:02.185 "seek_data": false, 00:14:02.185 "copy": true, 00:14:02.185 "nvme_iov_md": false 00:14:02.185 }, 00:14:02.185 "memory_domains": [ 00:14:02.185 { 00:14:02.185 "dma_device_id": "system", 00:14:02.185 "dma_device_type": 1 00:14:02.185 }, 00:14:02.185 { 00:14:02.185 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:14:02.185 "dma_device_type": 2 00:14:02.185 } 00:14:02.185 ], 00:14:02.185 "driver_specific": {} 00:14:02.185 } 00:14:02.185 ] 00:14:02.185 17:26:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:02.185 17:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:02.185 17:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:02.185 17:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:02.185 [2024-07-15 17:26:13.382732] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:02.185 [2024-07-15 17:26:13.382769] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:02.185 [2024-07-15 17:26:13.382787] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:02.185 [2024-07-15 17:26:13.383849] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:02.185 17:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:02.185 17:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:02.185 17:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:02.185 17:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:02.185 17:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:02.185 17:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:02.185 17:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:02.185 17:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:02.185 17:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:02.185 17:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:02.185 17:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.185 17:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:02.445 17:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:02.445 "name": "Existed_Raid", 00:14:02.445 "uuid": "a30f2d13-32c1-4897-b4cc-de594e46dd1c", 00:14:02.445 "strip_size_kb": 64, 00:14:02.445 "state": "configuring", 00:14:02.445 "raid_level": "concat", 00:14:02.445 "superblock": true, 00:14:02.445 "num_base_bdevs": 3, 00:14:02.445 "num_base_bdevs_discovered": 2, 00:14:02.445 "num_base_bdevs_operational": 3, 00:14:02.445 "base_bdevs_list": [ 00:14:02.445 { 00:14:02.445 "name": "BaseBdev1", 00:14:02.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:02.445 "is_configured": false, 00:14:02.445 "data_offset": 0, 00:14:02.445 "data_size": 0 00:14:02.445 }, 00:14:02.445 { 00:14:02.445 "name": 
"BaseBdev2", 00:14:02.445 "uuid": "13e7abde-f48c-4467-8e27-a0f89a69990d", 00:14:02.445 "is_configured": true, 00:14:02.445 "data_offset": 2048, 00:14:02.445 "data_size": 63488 00:14:02.445 }, 00:14:02.445 { 00:14:02.445 "name": "BaseBdev3", 00:14:02.445 "uuid": "14cf507c-9c3e-4301-94be-68555cf84106", 00:14:02.445 "is_configured": true, 00:14:02.445 "data_offset": 2048, 00:14:02.445 "data_size": 63488 00:14:02.445 } 00:14:02.445 ] 00:14:02.445 }' 00:14:02.445 17:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:02.445 17:26:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:03.016 17:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:03.276 [2024-07-15 17:26:14.329105] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:03.276 17:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:03.276 17:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:03.276 17:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:03.276 17:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:03.276 17:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:03.276 17:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:03.276 17:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:03.276 17:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:03.276 17:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:03.276 17:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:03.276 17:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:03.276 17:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:03.276 17:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:03.276 "name": "Existed_Raid", 00:14:03.276 "uuid": "a30f2d13-32c1-4897-b4cc-de594e46dd1c", 00:14:03.276 "strip_size_kb": 64, 00:14:03.276 "state": "configuring", 00:14:03.276 "raid_level": "concat", 00:14:03.276 "superblock": true, 00:14:03.276 "num_base_bdevs": 3, 00:14:03.276 "num_base_bdevs_discovered": 1, 00:14:03.276 "num_base_bdevs_operational": 3, 00:14:03.276 "base_bdevs_list": [ 00:14:03.276 { 00:14:03.276 "name": "BaseBdev1", 00:14:03.276 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:03.276 "is_configured": false, 00:14:03.276 "data_offset": 0, 00:14:03.276 "data_size": 0 00:14:03.276 }, 00:14:03.276 { 00:14:03.276 "name": null, 00:14:03.276 "uuid": "13e7abde-f48c-4467-8e27-a0f89a69990d", 00:14:03.276 "is_configured": false, 00:14:03.276 "data_offset": 2048, 00:14:03.276 "data_size": 63488 00:14:03.276 }, 00:14:03.276 { 00:14:03.276 "name": "BaseBdev3", 00:14:03.276 "uuid": 
"14cf507c-9c3e-4301-94be-68555cf84106", 00:14:03.276 "is_configured": true, 00:14:03.276 "data_offset": 2048, 00:14:03.276 "data_size": 63488 00:14:03.276 } 00:14:03.276 ] 00:14:03.276 }' 00:14:03.276 17:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:03.276 17:26:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:03.848 17:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:03.848 17:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:04.109 17:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:04.109 17:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:04.370 [2024-07-15 17:26:15.452905] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:04.370 BaseBdev1 00:14:04.370 17:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:04.370 17:26:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:04.370 17:26:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:04.370 17:26:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:04.370 17:26:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:04.370 17:26:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:04.370 17:26:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:04.370 17:26:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:04.630 [ 00:14:04.630 { 00:14:04.630 "name": "BaseBdev1", 00:14:04.630 "aliases": [ 00:14:04.630 "a5d6c0aa-401a-4d73-aa7d-4acf579bde9f" 00:14:04.630 ], 00:14:04.630 "product_name": "Malloc disk", 00:14:04.630 "block_size": 512, 00:14:04.630 "num_blocks": 65536, 00:14:04.630 "uuid": "a5d6c0aa-401a-4d73-aa7d-4acf579bde9f", 00:14:04.630 "assigned_rate_limits": { 00:14:04.630 "rw_ios_per_sec": 0, 00:14:04.630 "rw_mbytes_per_sec": 0, 00:14:04.630 "r_mbytes_per_sec": 0, 00:14:04.630 "w_mbytes_per_sec": 0 00:14:04.630 }, 00:14:04.630 "claimed": true, 00:14:04.630 "claim_type": "exclusive_write", 00:14:04.630 "zoned": false, 00:14:04.630 "supported_io_types": { 00:14:04.630 "read": true, 00:14:04.630 "write": true, 00:14:04.630 "unmap": true, 00:14:04.630 "flush": true, 00:14:04.630 "reset": true, 00:14:04.630 "nvme_admin": false, 00:14:04.630 "nvme_io": false, 00:14:04.630 "nvme_io_md": false, 00:14:04.630 "write_zeroes": true, 00:14:04.630 "zcopy": true, 00:14:04.630 "get_zone_info": false, 00:14:04.630 "zone_management": false, 00:14:04.630 "zone_append": false, 00:14:04.630 "compare": false, 00:14:04.630 "compare_and_write": false, 00:14:04.630 "abort": true, 00:14:04.630 "seek_hole": false, 
00:14:04.630 "seek_data": false, 00:14:04.630 "copy": true, 00:14:04.630 "nvme_iov_md": false 00:14:04.630 }, 00:14:04.630 "memory_domains": [ 00:14:04.630 { 00:14:04.630 "dma_device_id": "system", 00:14:04.630 "dma_device_type": 1 00:14:04.630 }, 00:14:04.630 { 00:14:04.630 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:04.630 "dma_device_type": 2 00:14:04.630 } 00:14:04.630 ], 00:14:04.630 "driver_specific": {} 00:14:04.630 } 00:14:04.630 ] 00:14:04.630 17:26:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:04.630 17:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:04.630 17:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:04.630 17:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:04.630 17:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:04.630 17:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:04.630 17:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:04.630 17:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:04.630 17:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:04.630 17:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:04.630 17:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:04.630 17:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.630 17:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:04.891 17:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:04.891 "name": "Existed_Raid", 00:14:04.891 "uuid": "a30f2d13-32c1-4897-b4cc-de594e46dd1c", 00:14:04.891 "strip_size_kb": 64, 00:14:04.891 "state": "configuring", 00:14:04.891 "raid_level": "concat", 00:14:04.891 "superblock": true, 00:14:04.891 "num_base_bdevs": 3, 00:14:04.891 "num_base_bdevs_discovered": 2, 00:14:04.891 "num_base_bdevs_operational": 3, 00:14:04.891 "base_bdevs_list": [ 00:14:04.891 { 00:14:04.891 "name": "BaseBdev1", 00:14:04.891 "uuid": "a5d6c0aa-401a-4d73-aa7d-4acf579bde9f", 00:14:04.891 "is_configured": true, 00:14:04.891 "data_offset": 2048, 00:14:04.891 "data_size": 63488 00:14:04.891 }, 00:14:04.891 { 00:14:04.891 "name": null, 00:14:04.891 "uuid": "13e7abde-f48c-4467-8e27-a0f89a69990d", 00:14:04.891 "is_configured": false, 00:14:04.891 "data_offset": 2048, 00:14:04.891 "data_size": 63488 00:14:04.891 }, 00:14:04.891 { 00:14:04.891 "name": "BaseBdev3", 00:14:04.891 "uuid": "14cf507c-9c3e-4301-94be-68555cf84106", 00:14:04.891 "is_configured": true, 00:14:04.891 "data_offset": 2048, 00:14:04.891 "data_size": 63488 00:14:04.891 } 00:14:04.891 ] 00:14:04.891 }' 00:14:04.891 17:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:04.891 17:26:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:05.462 17:26:16 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:05.462 17:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:05.722 17:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:05.722 17:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:05.722 [2024-07-15 17:26:16.964743] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:05.722 17:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:05.722 17:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:05.722 17:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:05.722 17:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:05.722 17:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:05.722 17:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:05.722 17:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:05.722 17:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:05.722 17:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:05.722 17:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:05.722 17:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:05.722 17:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:05.983 17:26:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:05.983 "name": "Existed_Raid", 00:14:05.983 "uuid": "a30f2d13-32c1-4897-b4cc-de594e46dd1c", 00:14:05.983 "strip_size_kb": 64, 00:14:05.983 "state": "configuring", 00:14:05.983 "raid_level": "concat", 00:14:05.983 "superblock": true, 00:14:05.983 "num_base_bdevs": 3, 00:14:05.983 "num_base_bdevs_discovered": 1, 00:14:05.983 "num_base_bdevs_operational": 3, 00:14:05.983 "base_bdevs_list": [ 00:14:05.983 { 00:14:05.983 "name": "BaseBdev1", 00:14:05.983 "uuid": "a5d6c0aa-401a-4d73-aa7d-4acf579bde9f", 00:14:05.983 "is_configured": true, 00:14:05.983 "data_offset": 2048, 00:14:05.983 "data_size": 63488 00:14:05.983 }, 00:14:05.983 { 00:14:05.983 "name": null, 00:14:05.983 "uuid": "13e7abde-f48c-4467-8e27-a0f89a69990d", 00:14:05.983 "is_configured": false, 00:14:05.983 "data_offset": 2048, 00:14:05.983 "data_size": 63488 00:14:05.983 }, 00:14:05.983 { 00:14:05.983 "name": null, 00:14:05.983 "uuid": "14cf507c-9c3e-4301-94be-68555cf84106", 00:14:05.983 "is_configured": false, 00:14:05.983 "data_offset": 2048, 00:14:05.983 "data_size": 63488 00:14:05.983 } 00:14:05.983 ] 00:14:05.983 }' 00:14:05.983 17:26:17 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:05.983 17:26:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:06.555 17:26:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:06.555 17:26:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:06.815 17:26:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:06.815 17:26:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:06.815 [2024-07-15 17:26:18.099629] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:07.075 17:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:07.075 17:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:07.075 17:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:07.075 17:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:07.075 17:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:07.075 17:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:07.075 17:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:07.075 17:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:07.075 17:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:07.075 17:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:07.075 17:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:07.075 17:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:07.075 17:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:07.075 "name": "Existed_Raid", 00:14:07.075 "uuid": "a30f2d13-32c1-4897-b4cc-de594e46dd1c", 00:14:07.075 "strip_size_kb": 64, 00:14:07.075 "state": "configuring", 00:14:07.075 "raid_level": "concat", 00:14:07.075 "superblock": true, 00:14:07.075 "num_base_bdevs": 3, 00:14:07.075 "num_base_bdevs_discovered": 2, 00:14:07.075 "num_base_bdevs_operational": 3, 00:14:07.075 "base_bdevs_list": [ 00:14:07.075 { 00:14:07.075 "name": "BaseBdev1", 00:14:07.075 "uuid": "a5d6c0aa-401a-4d73-aa7d-4acf579bde9f", 00:14:07.075 "is_configured": true, 00:14:07.075 "data_offset": 2048, 00:14:07.075 "data_size": 63488 00:14:07.075 }, 00:14:07.075 { 00:14:07.075 "name": null, 00:14:07.075 "uuid": "13e7abde-f48c-4467-8e27-a0f89a69990d", 00:14:07.075 "is_configured": false, 00:14:07.075 "data_offset": 2048, 00:14:07.075 "data_size": 63488 00:14:07.075 }, 00:14:07.075 { 00:14:07.075 "name": "BaseBdev3", 00:14:07.075 "uuid": "14cf507c-9c3e-4301-94be-68555cf84106", 
00:14:07.075 "is_configured": true, 00:14:07.075 "data_offset": 2048, 00:14:07.075 "data_size": 63488 00:14:07.075 } 00:14:07.075 ] 00:14:07.075 }' 00:14:07.075 17:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:07.075 17:26:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:07.645 17:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:07.645 17:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:07.905 17:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:07.905 17:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:08.165 [2024-07-15 17:26:19.234517] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:08.165 17:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:08.165 17:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:08.165 17:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:08.165 17:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:08.165 17:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:08.165 17:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:08.165 17:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:08.165 17:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:08.165 17:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:08.165 17:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:08.165 17:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:08.165 17:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:08.165 17:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:08.165 "name": "Existed_Raid", 00:14:08.165 "uuid": "a30f2d13-32c1-4897-b4cc-de594e46dd1c", 00:14:08.165 "strip_size_kb": 64, 00:14:08.165 "state": "configuring", 00:14:08.165 "raid_level": "concat", 00:14:08.165 "superblock": true, 00:14:08.165 "num_base_bdevs": 3, 00:14:08.165 "num_base_bdevs_discovered": 1, 00:14:08.165 "num_base_bdevs_operational": 3, 00:14:08.165 "base_bdevs_list": [ 00:14:08.165 { 00:14:08.165 "name": null, 00:14:08.165 "uuid": "a5d6c0aa-401a-4d73-aa7d-4acf579bde9f", 00:14:08.165 "is_configured": false, 00:14:08.165 "data_offset": 2048, 00:14:08.165 "data_size": 63488 00:14:08.165 }, 00:14:08.165 { 00:14:08.165 "name": null, 00:14:08.165 "uuid": "13e7abde-f48c-4467-8e27-a0f89a69990d", 00:14:08.165 "is_configured": false, 00:14:08.165 "data_offset": 2048, 
00:14:08.165 "data_size": 63488 00:14:08.165 }, 00:14:08.165 { 00:14:08.165 "name": "BaseBdev3", 00:14:08.165 "uuid": "14cf507c-9c3e-4301-94be-68555cf84106", 00:14:08.165 "is_configured": true, 00:14:08.165 "data_offset": 2048, 00:14:08.165 "data_size": 63488 00:14:08.165 } 00:14:08.165 ] 00:14:08.165 }' 00:14:08.165 17:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:08.165 17:26:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:08.736 17:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:08.736 17:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:08.996 17:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:08.996 17:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:09.259 [2024-07-15 17:26:20.319390] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:09.259 17:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:09.259 17:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:09.259 17:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:09.259 17:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:09.259 17:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:09.259 17:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:09.259 17:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:09.259 17:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:09.259 17:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:09.259 17:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:09.259 17:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:09.259 17:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:09.259 17:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:09.259 "name": "Existed_Raid", 00:14:09.259 "uuid": "a30f2d13-32c1-4897-b4cc-de594e46dd1c", 00:14:09.259 "strip_size_kb": 64, 00:14:09.259 "state": "configuring", 00:14:09.259 "raid_level": "concat", 00:14:09.259 "superblock": true, 00:14:09.259 "num_base_bdevs": 3, 00:14:09.259 "num_base_bdevs_discovered": 2, 00:14:09.259 "num_base_bdevs_operational": 3, 00:14:09.259 "base_bdevs_list": [ 00:14:09.259 { 00:14:09.259 "name": null, 00:14:09.259 "uuid": "a5d6c0aa-401a-4d73-aa7d-4acf579bde9f", 00:14:09.259 "is_configured": false, 00:14:09.259 "data_offset": 2048, 00:14:09.259 
"data_size": 63488 00:14:09.259 }, 00:14:09.259 { 00:14:09.259 "name": "BaseBdev2", 00:14:09.259 "uuid": "13e7abde-f48c-4467-8e27-a0f89a69990d", 00:14:09.259 "is_configured": true, 00:14:09.259 "data_offset": 2048, 00:14:09.259 "data_size": 63488 00:14:09.259 }, 00:14:09.259 { 00:14:09.259 "name": "BaseBdev3", 00:14:09.259 "uuid": "14cf507c-9c3e-4301-94be-68555cf84106", 00:14:09.259 "is_configured": true, 00:14:09.259 "data_offset": 2048, 00:14:09.259 "data_size": 63488 00:14:09.259 } 00:14:09.259 ] 00:14:09.259 }' 00:14:09.259 17:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:09.259 17:26:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:09.897 17:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:09.897 17:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:10.157 17:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:10.157 17:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:10.157 17:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:10.157 17:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u a5d6c0aa-401a-4d73-aa7d-4acf579bde9f 00:14:10.417 [2024-07-15 17:26:21.599634] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:10.417 [2024-07-15 17:26:21.599749] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18746f0 00:14:10.417 [2024-07-15 17:26:21.599758] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:10.417 [2024-07-15 17:26:21.599896] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16d1440 00:14:10.417 [2024-07-15 17:26:21.599981] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18746f0 00:14:10.417 [2024-07-15 17:26:21.599987] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x18746f0 00:14:10.417 [2024-07-15 17:26:21.600054] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:10.417 NewBaseBdev 00:14:10.417 17:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:10.417 17:26:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:10.417 17:26:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:10.417 17:26:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:10.417 17:26:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:10.417 17:26:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:10.417 17:26:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:10.678 17:26:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:10.938 [ 00:14:10.938 { 00:14:10.938 "name": "NewBaseBdev", 00:14:10.938 "aliases": [ 00:14:10.938 "a5d6c0aa-401a-4d73-aa7d-4acf579bde9f" 00:14:10.938 ], 00:14:10.938 "product_name": "Malloc disk", 00:14:10.938 "block_size": 512, 00:14:10.938 "num_blocks": 65536, 00:14:10.938 "uuid": "a5d6c0aa-401a-4d73-aa7d-4acf579bde9f", 00:14:10.938 "assigned_rate_limits": { 00:14:10.938 "rw_ios_per_sec": 0, 00:14:10.938 "rw_mbytes_per_sec": 0, 00:14:10.938 "r_mbytes_per_sec": 0, 00:14:10.938 "w_mbytes_per_sec": 0 00:14:10.938 }, 00:14:10.938 "claimed": true, 00:14:10.938 "claim_type": "exclusive_write", 00:14:10.938 "zoned": false, 00:14:10.938 "supported_io_types": { 00:14:10.938 "read": true, 00:14:10.938 "write": true, 00:14:10.938 "unmap": true, 00:14:10.938 "flush": true, 00:14:10.938 "reset": true, 00:14:10.938 "nvme_admin": false, 00:14:10.938 "nvme_io": false, 00:14:10.938 "nvme_io_md": false, 00:14:10.938 "write_zeroes": true, 00:14:10.938 "zcopy": true, 00:14:10.938 "get_zone_info": false, 00:14:10.938 "zone_management": false, 00:14:10.938 "zone_append": false, 00:14:10.938 "compare": false, 00:14:10.938 "compare_and_write": false, 00:14:10.938 "abort": true, 00:14:10.938 "seek_hole": false, 00:14:10.938 "seek_data": false, 00:14:10.938 "copy": true, 00:14:10.938 "nvme_iov_md": false 00:14:10.938 }, 00:14:10.938 "memory_domains": [ 00:14:10.938 { 00:14:10.938 "dma_device_id": "system", 00:14:10.938 "dma_device_type": 1 00:14:10.938 }, 00:14:10.938 { 00:14:10.938 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:10.938 "dma_device_type": 2 00:14:10.938 } 00:14:10.938 ], 00:14:10.938 "driver_specific": {} 00:14:10.938 } 00:14:10.938 ] 00:14:10.938 17:26:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:10.938 17:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:14:10.938 17:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:10.938 17:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:10.938 17:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:10.938 17:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:10.938 17:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:10.938 17:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:10.938 17:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:10.938 17:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:10.938 17:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:10.938 17:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:10.938 17:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:14:10.938 17:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:10.938 "name": "Existed_Raid", 00:14:10.938 "uuid": "a30f2d13-32c1-4897-b4cc-de594e46dd1c", 00:14:10.938 "strip_size_kb": 64, 00:14:10.938 "state": "online", 00:14:10.938 "raid_level": "concat", 00:14:10.938 "superblock": true, 00:14:10.938 "num_base_bdevs": 3, 00:14:10.938 "num_base_bdevs_discovered": 3, 00:14:10.938 "num_base_bdevs_operational": 3, 00:14:10.938 "base_bdevs_list": [ 00:14:10.938 { 00:14:10.938 "name": "NewBaseBdev", 00:14:10.938 "uuid": "a5d6c0aa-401a-4d73-aa7d-4acf579bde9f", 00:14:10.938 "is_configured": true, 00:14:10.938 "data_offset": 2048, 00:14:10.938 "data_size": 63488 00:14:10.938 }, 00:14:10.938 { 00:14:10.938 "name": "BaseBdev2", 00:14:10.938 "uuid": "13e7abde-f48c-4467-8e27-a0f89a69990d", 00:14:10.938 "is_configured": true, 00:14:10.938 "data_offset": 2048, 00:14:10.938 "data_size": 63488 00:14:10.938 }, 00:14:10.938 { 00:14:10.938 "name": "BaseBdev3", 00:14:10.938 "uuid": "14cf507c-9c3e-4301-94be-68555cf84106", 00:14:10.938 "is_configured": true, 00:14:10.938 "data_offset": 2048, 00:14:10.938 "data_size": 63488 00:14:10.938 } 00:14:10.938 ] 00:14:10.938 }' 00:14:10.938 17:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:10.938 17:26:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:11.510 17:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:11.510 17:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:11.510 17:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:11.510 17:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:11.510 17:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:11.510 17:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:11.510 17:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:11.510 17:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:11.770 [2024-07-15 17:26:22.915203] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:11.770 17:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:11.770 "name": "Existed_Raid", 00:14:11.770 "aliases": [ 00:14:11.770 "a30f2d13-32c1-4897-b4cc-de594e46dd1c" 00:14:11.770 ], 00:14:11.770 "product_name": "Raid Volume", 00:14:11.770 "block_size": 512, 00:14:11.770 "num_blocks": 190464, 00:14:11.770 "uuid": "a30f2d13-32c1-4897-b4cc-de594e46dd1c", 00:14:11.770 "assigned_rate_limits": { 00:14:11.770 "rw_ios_per_sec": 0, 00:14:11.770 "rw_mbytes_per_sec": 0, 00:14:11.770 "r_mbytes_per_sec": 0, 00:14:11.770 "w_mbytes_per_sec": 0 00:14:11.770 }, 00:14:11.770 "claimed": false, 00:14:11.770 "zoned": false, 00:14:11.770 "supported_io_types": { 00:14:11.770 "read": true, 00:14:11.770 "write": true, 00:14:11.770 "unmap": true, 00:14:11.770 "flush": true, 00:14:11.770 "reset": true, 00:14:11.770 "nvme_admin": false, 00:14:11.770 "nvme_io": false, 00:14:11.770 "nvme_io_md": false, 00:14:11.770 "write_zeroes": true, 00:14:11.770 
"zcopy": false, 00:14:11.770 "get_zone_info": false, 00:14:11.770 "zone_management": false, 00:14:11.770 "zone_append": false, 00:14:11.770 "compare": false, 00:14:11.770 "compare_and_write": false, 00:14:11.770 "abort": false, 00:14:11.770 "seek_hole": false, 00:14:11.770 "seek_data": false, 00:14:11.770 "copy": false, 00:14:11.770 "nvme_iov_md": false 00:14:11.770 }, 00:14:11.770 "memory_domains": [ 00:14:11.770 { 00:14:11.770 "dma_device_id": "system", 00:14:11.770 "dma_device_type": 1 00:14:11.770 }, 00:14:11.770 { 00:14:11.770 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.770 "dma_device_type": 2 00:14:11.770 }, 00:14:11.770 { 00:14:11.770 "dma_device_id": "system", 00:14:11.770 "dma_device_type": 1 00:14:11.770 }, 00:14:11.770 { 00:14:11.770 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.770 "dma_device_type": 2 00:14:11.770 }, 00:14:11.770 { 00:14:11.770 "dma_device_id": "system", 00:14:11.770 "dma_device_type": 1 00:14:11.770 }, 00:14:11.770 { 00:14:11.770 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.770 "dma_device_type": 2 00:14:11.770 } 00:14:11.770 ], 00:14:11.770 "driver_specific": { 00:14:11.770 "raid": { 00:14:11.770 "uuid": "a30f2d13-32c1-4897-b4cc-de594e46dd1c", 00:14:11.770 "strip_size_kb": 64, 00:14:11.770 "state": "online", 00:14:11.770 "raid_level": "concat", 00:14:11.770 "superblock": true, 00:14:11.770 "num_base_bdevs": 3, 00:14:11.770 "num_base_bdevs_discovered": 3, 00:14:11.770 "num_base_bdevs_operational": 3, 00:14:11.770 "base_bdevs_list": [ 00:14:11.770 { 00:14:11.770 "name": "NewBaseBdev", 00:14:11.770 "uuid": "a5d6c0aa-401a-4d73-aa7d-4acf579bde9f", 00:14:11.770 "is_configured": true, 00:14:11.770 "data_offset": 2048, 00:14:11.770 "data_size": 63488 00:14:11.770 }, 00:14:11.771 { 00:14:11.771 "name": "BaseBdev2", 00:14:11.771 "uuid": "13e7abde-f48c-4467-8e27-a0f89a69990d", 00:14:11.771 "is_configured": true, 00:14:11.771 "data_offset": 2048, 00:14:11.771 "data_size": 63488 00:14:11.771 }, 00:14:11.771 { 00:14:11.771 "name": "BaseBdev3", 00:14:11.771 "uuid": "14cf507c-9c3e-4301-94be-68555cf84106", 00:14:11.771 "is_configured": true, 00:14:11.771 "data_offset": 2048, 00:14:11.771 "data_size": 63488 00:14:11.771 } 00:14:11.771 ] 00:14:11.771 } 00:14:11.771 } 00:14:11.771 }' 00:14:11.771 17:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:11.771 17:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:11.771 BaseBdev2 00:14:11.771 BaseBdev3' 00:14:11.771 17:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:11.771 17:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:11.771 17:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:12.037 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:12.037 "name": "NewBaseBdev", 00:14:12.037 "aliases": [ 00:14:12.037 "a5d6c0aa-401a-4d73-aa7d-4acf579bde9f" 00:14:12.037 ], 00:14:12.037 "product_name": "Malloc disk", 00:14:12.037 "block_size": 512, 00:14:12.037 "num_blocks": 65536, 00:14:12.037 "uuid": "a5d6c0aa-401a-4d73-aa7d-4acf579bde9f", 00:14:12.037 "assigned_rate_limits": { 00:14:12.037 "rw_ios_per_sec": 0, 00:14:12.037 "rw_mbytes_per_sec": 0, 
00:14:12.037 "r_mbytes_per_sec": 0, 00:14:12.037 "w_mbytes_per_sec": 0 00:14:12.037 }, 00:14:12.037 "claimed": true, 00:14:12.037 "claim_type": "exclusive_write", 00:14:12.037 "zoned": false, 00:14:12.037 "supported_io_types": { 00:14:12.037 "read": true, 00:14:12.037 "write": true, 00:14:12.037 "unmap": true, 00:14:12.037 "flush": true, 00:14:12.037 "reset": true, 00:14:12.037 "nvme_admin": false, 00:14:12.037 "nvme_io": false, 00:14:12.037 "nvme_io_md": false, 00:14:12.037 "write_zeroes": true, 00:14:12.037 "zcopy": true, 00:14:12.037 "get_zone_info": false, 00:14:12.037 "zone_management": false, 00:14:12.037 "zone_append": false, 00:14:12.037 "compare": false, 00:14:12.037 "compare_and_write": false, 00:14:12.037 "abort": true, 00:14:12.037 "seek_hole": false, 00:14:12.037 "seek_data": false, 00:14:12.037 "copy": true, 00:14:12.037 "nvme_iov_md": false 00:14:12.037 }, 00:14:12.037 "memory_domains": [ 00:14:12.037 { 00:14:12.037 "dma_device_id": "system", 00:14:12.037 "dma_device_type": 1 00:14:12.037 }, 00:14:12.037 { 00:14:12.037 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:12.037 "dma_device_type": 2 00:14:12.037 } 00:14:12.037 ], 00:14:12.037 "driver_specific": {} 00:14:12.037 }' 00:14:12.037 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.037 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.037 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:12.037 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.037 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.320 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:12.320 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.320 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.320 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:12.320 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:12.320 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:12.320 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:12.320 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:12.320 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:12.320 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:12.579 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:12.579 "name": "BaseBdev2", 00:14:12.579 "aliases": [ 00:14:12.579 "13e7abde-f48c-4467-8e27-a0f89a69990d" 00:14:12.579 ], 00:14:12.579 "product_name": "Malloc disk", 00:14:12.579 "block_size": 512, 00:14:12.579 "num_blocks": 65536, 00:14:12.579 "uuid": "13e7abde-f48c-4467-8e27-a0f89a69990d", 00:14:12.579 "assigned_rate_limits": { 00:14:12.580 "rw_ios_per_sec": 0, 00:14:12.580 "rw_mbytes_per_sec": 0, 00:14:12.580 "r_mbytes_per_sec": 0, 00:14:12.580 "w_mbytes_per_sec": 0 00:14:12.580 }, 00:14:12.580 "claimed": true, 00:14:12.580 
"claim_type": "exclusive_write", 00:14:12.580 "zoned": false, 00:14:12.580 "supported_io_types": { 00:14:12.580 "read": true, 00:14:12.580 "write": true, 00:14:12.580 "unmap": true, 00:14:12.580 "flush": true, 00:14:12.580 "reset": true, 00:14:12.580 "nvme_admin": false, 00:14:12.580 "nvme_io": false, 00:14:12.580 "nvme_io_md": false, 00:14:12.580 "write_zeroes": true, 00:14:12.580 "zcopy": true, 00:14:12.580 "get_zone_info": false, 00:14:12.580 "zone_management": false, 00:14:12.580 "zone_append": false, 00:14:12.580 "compare": false, 00:14:12.580 "compare_and_write": false, 00:14:12.580 "abort": true, 00:14:12.580 "seek_hole": false, 00:14:12.580 "seek_data": false, 00:14:12.580 "copy": true, 00:14:12.580 "nvme_iov_md": false 00:14:12.580 }, 00:14:12.580 "memory_domains": [ 00:14:12.580 { 00:14:12.580 "dma_device_id": "system", 00:14:12.580 "dma_device_type": 1 00:14:12.580 }, 00:14:12.580 { 00:14:12.580 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:12.580 "dma_device_type": 2 00:14:12.580 } 00:14:12.580 ], 00:14:12.580 "driver_specific": {} 00:14:12.580 }' 00:14:12.580 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.580 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.580 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:12.580 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.580 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.840 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:12.840 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.840 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.840 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:12.840 17:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:12.840 17:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:12.840 17:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:12.840 17:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:12.840 17:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:12.840 17:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:13.100 17:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:13.100 "name": "BaseBdev3", 00:14:13.100 "aliases": [ 00:14:13.100 "14cf507c-9c3e-4301-94be-68555cf84106" 00:14:13.100 ], 00:14:13.100 "product_name": "Malloc disk", 00:14:13.100 "block_size": 512, 00:14:13.100 "num_blocks": 65536, 00:14:13.100 "uuid": "14cf507c-9c3e-4301-94be-68555cf84106", 00:14:13.100 "assigned_rate_limits": { 00:14:13.100 "rw_ios_per_sec": 0, 00:14:13.100 "rw_mbytes_per_sec": 0, 00:14:13.100 "r_mbytes_per_sec": 0, 00:14:13.100 "w_mbytes_per_sec": 0 00:14:13.100 }, 00:14:13.100 "claimed": true, 00:14:13.100 "claim_type": "exclusive_write", 00:14:13.100 "zoned": false, 00:14:13.100 "supported_io_types": { 00:14:13.100 "read": true, 
00:14:13.100 "write": true, 00:14:13.100 "unmap": true, 00:14:13.100 "flush": true, 00:14:13.100 "reset": true, 00:14:13.100 "nvme_admin": false, 00:14:13.100 "nvme_io": false, 00:14:13.100 "nvme_io_md": false, 00:14:13.100 "write_zeroes": true, 00:14:13.100 "zcopy": true, 00:14:13.100 "get_zone_info": false, 00:14:13.100 "zone_management": false, 00:14:13.100 "zone_append": false, 00:14:13.100 "compare": false, 00:14:13.100 "compare_and_write": false, 00:14:13.100 "abort": true, 00:14:13.100 "seek_hole": false, 00:14:13.100 "seek_data": false, 00:14:13.100 "copy": true, 00:14:13.100 "nvme_iov_md": false 00:14:13.100 }, 00:14:13.100 "memory_domains": [ 00:14:13.100 { 00:14:13.100 "dma_device_id": "system", 00:14:13.100 "dma_device_type": 1 00:14:13.100 }, 00:14:13.100 { 00:14:13.100 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:13.100 "dma_device_type": 2 00:14:13.100 } 00:14:13.100 ], 00:14:13.100 "driver_specific": {} 00:14:13.100 }' 00:14:13.100 17:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:13.100 17:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:13.100 17:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:13.100 17:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:13.100 17:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:13.360 17:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:13.360 17:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:13.360 17:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:13.360 17:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:13.360 17:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:13.360 17:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:13.360 17:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:13.360 17:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:13.621 [2024-07-15 17:26:24.747620] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:13.621 [2024-07-15 17:26:24.747636] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:13.621 [2024-07-15 17:26:24.747668] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:13.621 [2024-07-15 17:26:24.747704] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:13.621 [2024-07-15 17:26:24.747719] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18746f0 name Existed_Raid, state offline 00:14:13.621 17:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2781734 00:14:13.621 17:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2781734 ']' 00:14:13.621 17:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2781734 00:14:13.621 17:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 
00:14:13.621 17:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:13.621 17:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2781734 00:14:13.621 17:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:13.621 17:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:13.621 17:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2781734' 00:14:13.621 killing process with pid 2781734 00:14:13.621 17:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2781734 00:14:13.621 [2024-07-15 17:26:24.816363] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:13.621 17:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2781734 00:14:13.621 [2024-07-15 17:26:24.830971] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:13.882 17:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:14:13.882 00:14:13.882 real 0m23.970s 00:14:13.882 user 0m44.955s 00:14:13.882 sys 0m3.555s 00:14:13.882 17:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:13.882 17:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:13.882 ************************************ 00:14:13.882 END TEST raid_state_function_test_sb 00:14:13.882 ************************************ 00:14:13.882 17:26:24 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:13.882 17:26:24 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:14:13.882 17:26:24 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:14:13.882 17:26:24 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:13.883 17:26:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:13.883 ************************************ 00:14:13.883 START TEST raid_superblock_test 00:14:13.883 ************************************ 00:14:13.883 17:26:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 3 00:14:13.883 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:14:13.883 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:14:13.883 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:14:13.883 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:14:13.883 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:14:13.883 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:14:13.883 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:14:13.883 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:14:13.883 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:14:13.883 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:14:13.883 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local 
strip_size_create_arg 00:14:13.883 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:14:13.883 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:14:13.883 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:14:13.883 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:14:13.883 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:14:13.883 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2786210 00:14:13.883 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2786210 /var/tmp/spdk-raid.sock 00:14:13.883 17:26:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2786210 ']' 00:14:13.883 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:14:13.883 17:26:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:13.883 17:26:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:13.883 17:26:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:13.883 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:13.883 17:26:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:13.883 17:26:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:13.883 [2024-07-15 17:26:25.093081] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
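At this point raid_superblock_test has launched a bare bdev_svc app on a private RPC socket (/var/tmp/spdk-raid.sock) with bdev_raid debug logging enabled, and the test steps that follow in the trace are rpc.py calls against that socket. A condensed sketch of the setup it performs, built only from commands that appear in this log (socket path, sizes and names are the ones used by this run):

    sock=/var/tmp/spdk-raid.sock
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py

    # three 32 MB, 512 B-block malloc bdevs, each wrapped in a passthru bdev pt1..pt3
    for i in 1 2 3; do
        "$rpc" -s "$sock" bdev_malloc_create 32 512 -b "malloc$i"
        "$rpc" -s "$sock" bdev_passthru_create -b "malloc$i" -p "pt$i" \
            -u "00000000-0000-0000-0000-00000000000$i"
    done

    # assemble a concat raid with a 64 KB strip size and an on-disk superblock (-s)
    "$rpc" -s "$sock" bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s

    # inspect the result the same way verify_raid_bdev_state does
    "$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'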
00:14:13.883 [2024-07-15 17:26:25.093142] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2786210 ] 00:14:14.144 [2024-07-15 17:26:25.180935] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:14.144 [2024-07-15 17:26:25.248441] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:14.144 [2024-07-15 17:26:25.293720] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:14.144 [2024-07-15 17:26:25.293742] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:14.713 17:26:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:14.713 17:26:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:14:14.713 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:14:14.713 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:14.713 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:14:14.713 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:14:14.713 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:14.713 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:14.713 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:14.713 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:14.714 17:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:14:14.974 malloc1 00:14:14.974 17:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:15.235 [2024-07-15 17:26:26.288442] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:15.235 [2024-07-15 17:26:26.288476] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:15.235 [2024-07-15 17:26:26.288487] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1caea20 00:14:15.235 [2024-07-15 17:26:26.288493] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:15.235 [2024-07-15 17:26:26.289800] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:15.235 [2024-07-15 17:26:26.289820] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:15.235 pt1 00:14:15.235 17:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:15.235 17:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:15.235 17:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:14:15.235 17:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:14:15.235 17:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:15.235 17:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:15.235 17:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:15.235 17:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:15.235 17:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:15.235 malloc2 00:14:15.235 17:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:15.495 [2024-07-15 17:26:26.659523] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:15.495 [2024-07-15 17:26:26.659550] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:15.495 [2024-07-15 17:26:26.659560] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1caf040 00:14:15.495 [2024-07-15 17:26:26.659566] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:15.495 [2024-07-15 17:26:26.660740] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:15.495 [2024-07-15 17:26:26.660757] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:15.495 pt2 00:14:15.495 17:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:15.495 17:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:15.495 17:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:14:15.495 17:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:14:15.495 17:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:14:15.495 17:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:15.495 17:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:15.495 17:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:15.495 17:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:14:15.756 malloc3 00:14:15.756 17:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:15.756 [2024-07-15 17:26:27.046402] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:15.756 [2024-07-15 17:26:27.046429] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:15.756 [2024-07-15 17:26:27.046442] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1caf540 00:14:15.756 [2024-07-15 17:26:27.046449] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:15.756 [2024-07-15 17:26:27.047640] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:15.756 [2024-07-15 17:26:27.047659] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:15.756 pt3 00:14:16.017 17:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:16.017 17:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:16.017 17:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:14:16.017 [2024-07-15 17:26:27.222859] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:16.017 [2024-07-15 17:26:27.223853] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:16.017 [2024-07-15 17:26:27.223892] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:16.017 [2024-07-15 17:26:27.224006] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e5ba90 00:14:16.017 [2024-07-15 17:26:27.224013] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:16.017 [2024-07-15 17:26:27.224159] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e57c50 00:14:16.017 [2024-07-15 17:26:27.224266] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e5ba90 00:14:16.017 [2024-07-15 17:26:27.224271] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e5ba90 00:14:16.017 [2024-07-15 17:26:27.224338] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:16.017 17:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:14:16.017 17:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:16.017 17:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:16.017 17:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:16.017 17:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:16.017 17:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:16.017 17:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:16.017 17:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:16.017 17:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:16.017 17:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:16.017 17:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:16.017 17:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:16.277 17:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:16.277 "name": "raid_bdev1", 00:14:16.277 "uuid": "8363a717-86a8-4e2a-b89f-e08c4265269a", 00:14:16.277 "strip_size_kb": 64, 00:14:16.277 "state": "online", 00:14:16.277 "raid_level": "concat", 00:14:16.277 "superblock": true, 00:14:16.277 "num_base_bdevs": 3, 
00:14:16.277 "num_base_bdevs_discovered": 3, 00:14:16.277 "num_base_bdevs_operational": 3, 00:14:16.277 "base_bdevs_list": [ 00:14:16.277 { 00:14:16.277 "name": "pt1", 00:14:16.277 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:16.277 "is_configured": true, 00:14:16.277 "data_offset": 2048, 00:14:16.277 "data_size": 63488 00:14:16.277 }, 00:14:16.277 { 00:14:16.277 "name": "pt2", 00:14:16.277 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:16.277 "is_configured": true, 00:14:16.277 "data_offset": 2048, 00:14:16.277 "data_size": 63488 00:14:16.277 }, 00:14:16.277 { 00:14:16.277 "name": "pt3", 00:14:16.277 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:16.277 "is_configured": true, 00:14:16.277 "data_offset": 2048, 00:14:16.277 "data_size": 63488 00:14:16.277 } 00:14:16.277 ] 00:14:16.277 }' 00:14:16.277 17:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:16.277 17:26:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:16.848 17:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:14:16.848 17:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:16.848 17:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:16.848 17:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:16.848 17:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:16.848 17:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:16.848 17:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:16.848 17:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:17.110 [2024-07-15 17:26:28.169444] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:17.110 17:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:17.110 "name": "raid_bdev1", 00:14:17.110 "aliases": [ 00:14:17.110 "8363a717-86a8-4e2a-b89f-e08c4265269a" 00:14:17.110 ], 00:14:17.110 "product_name": "Raid Volume", 00:14:17.110 "block_size": 512, 00:14:17.110 "num_blocks": 190464, 00:14:17.110 "uuid": "8363a717-86a8-4e2a-b89f-e08c4265269a", 00:14:17.110 "assigned_rate_limits": { 00:14:17.110 "rw_ios_per_sec": 0, 00:14:17.110 "rw_mbytes_per_sec": 0, 00:14:17.110 "r_mbytes_per_sec": 0, 00:14:17.110 "w_mbytes_per_sec": 0 00:14:17.110 }, 00:14:17.110 "claimed": false, 00:14:17.110 "zoned": false, 00:14:17.110 "supported_io_types": { 00:14:17.110 "read": true, 00:14:17.110 "write": true, 00:14:17.110 "unmap": true, 00:14:17.110 "flush": true, 00:14:17.110 "reset": true, 00:14:17.110 "nvme_admin": false, 00:14:17.110 "nvme_io": false, 00:14:17.110 "nvme_io_md": false, 00:14:17.110 "write_zeroes": true, 00:14:17.110 "zcopy": false, 00:14:17.110 "get_zone_info": false, 00:14:17.110 "zone_management": false, 00:14:17.110 "zone_append": false, 00:14:17.110 "compare": false, 00:14:17.110 "compare_and_write": false, 00:14:17.110 "abort": false, 00:14:17.110 "seek_hole": false, 00:14:17.110 "seek_data": false, 00:14:17.110 "copy": false, 00:14:17.110 "nvme_iov_md": false 00:14:17.110 }, 00:14:17.110 "memory_domains": [ 00:14:17.110 { 00:14:17.110 "dma_device_id": "system", 00:14:17.110 "dma_device_type": 1 
00:14:17.110 }, 00:14:17.110 { 00:14:17.110 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.110 "dma_device_type": 2 00:14:17.110 }, 00:14:17.110 { 00:14:17.110 "dma_device_id": "system", 00:14:17.110 "dma_device_type": 1 00:14:17.110 }, 00:14:17.110 { 00:14:17.110 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.110 "dma_device_type": 2 00:14:17.110 }, 00:14:17.110 { 00:14:17.110 "dma_device_id": "system", 00:14:17.110 "dma_device_type": 1 00:14:17.110 }, 00:14:17.110 { 00:14:17.110 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.110 "dma_device_type": 2 00:14:17.110 } 00:14:17.110 ], 00:14:17.110 "driver_specific": { 00:14:17.110 "raid": { 00:14:17.110 "uuid": "8363a717-86a8-4e2a-b89f-e08c4265269a", 00:14:17.110 "strip_size_kb": 64, 00:14:17.110 "state": "online", 00:14:17.110 "raid_level": "concat", 00:14:17.110 "superblock": true, 00:14:17.110 "num_base_bdevs": 3, 00:14:17.110 "num_base_bdevs_discovered": 3, 00:14:17.110 "num_base_bdevs_operational": 3, 00:14:17.110 "base_bdevs_list": [ 00:14:17.110 { 00:14:17.110 "name": "pt1", 00:14:17.110 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:17.110 "is_configured": true, 00:14:17.110 "data_offset": 2048, 00:14:17.110 "data_size": 63488 00:14:17.110 }, 00:14:17.110 { 00:14:17.110 "name": "pt2", 00:14:17.110 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:17.110 "is_configured": true, 00:14:17.110 "data_offset": 2048, 00:14:17.110 "data_size": 63488 00:14:17.110 }, 00:14:17.110 { 00:14:17.110 "name": "pt3", 00:14:17.110 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:17.110 "is_configured": true, 00:14:17.110 "data_offset": 2048, 00:14:17.110 "data_size": 63488 00:14:17.110 } 00:14:17.110 ] 00:14:17.110 } 00:14:17.110 } 00:14:17.110 }' 00:14:17.110 17:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:17.110 17:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:17.110 pt2 00:14:17.110 pt3' 00:14:17.110 17:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:17.110 17:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:17.110 17:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:17.370 17:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:17.370 "name": "pt1", 00:14:17.370 "aliases": [ 00:14:17.370 "00000000-0000-0000-0000-000000000001" 00:14:17.370 ], 00:14:17.370 "product_name": "passthru", 00:14:17.370 "block_size": 512, 00:14:17.370 "num_blocks": 65536, 00:14:17.370 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:17.370 "assigned_rate_limits": { 00:14:17.370 "rw_ios_per_sec": 0, 00:14:17.370 "rw_mbytes_per_sec": 0, 00:14:17.370 "r_mbytes_per_sec": 0, 00:14:17.370 "w_mbytes_per_sec": 0 00:14:17.370 }, 00:14:17.370 "claimed": true, 00:14:17.370 "claim_type": "exclusive_write", 00:14:17.370 "zoned": false, 00:14:17.370 "supported_io_types": { 00:14:17.370 "read": true, 00:14:17.370 "write": true, 00:14:17.370 "unmap": true, 00:14:17.370 "flush": true, 00:14:17.370 "reset": true, 00:14:17.370 "nvme_admin": false, 00:14:17.370 "nvme_io": false, 00:14:17.370 "nvme_io_md": false, 00:14:17.370 "write_zeroes": true, 00:14:17.371 "zcopy": true, 00:14:17.371 "get_zone_info": false, 00:14:17.371 "zone_management": false, 
00:14:17.371 "zone_append": false, 00:14:17.371 "compare": false, 00:14:17.371 "compare_and_write": false, 00:14:17.371 "abort": true, 00:14:17.371 "seek_hole": false, 00:14:17.371 "seek_data": false, 00:14:17.371 "copy": true, 00:14:17.371 "nvme_iov_md": false 00:14:17.371 }, 00:14:17.371 "memory_domains": [ 00:14:17.371 { 00:14:17.371 "dma_device_id": "system", 00:14:17.371 "dma_device_type": 1 00:14:17.371 }, 00:14:17.371 { 00:14:17.371 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.371 "dma_device_type": 2 00:14:17.371 } 00:14:17.371 ], 00:14:17.371 "driver_specific": { 00:14:17.371 "passthru": { 00:14:17.371 "name": "pt1", 00:14:17.371 "base_bdev_name": "malloc1" 00:14:17.371 } 00:14:17.371 } 00:14:17.371 }' 00:14:17.371 17:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:17.371 17:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:17.371 17:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:17.371 17:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:17.371 17:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:17.371 17:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:17.371 17:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:17.371 17:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:17.371 17:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:17.371 17:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:17.631 17:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:17.631 17:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:17.631 17:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:17.631 17:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:17.631 17:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:17.904 17:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:17.904 "name": "pt2", 00:14:17.904 "aliases": [ 00:14:17.904 "00000000-0000-0000-0000-000000000002" 00:14:17.904 ], 00:14:17.904 "product_name": "passthru", 00:14:17.904 "block_size": 512, 00:14:17.904 "num_blocks": 65536, 00:14:17.904 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:17.904 "assigned_rate_limits": { 00:14:17.904 "rw_ios_per_sec": 0, 00:14:17.904 "rw_mbytes_per_sec": 0, 00:14:17.904 "r_mbytes_per_sec": 0, 00:14:17.904 "w_mbytes_per_sec": 0 00:14:17.904 }, 00:14:17.904 "claimed": true, 00:14:17.904 "claim_type": "exclusive_write", 00:14:17.904 "zoned": false, 00:14:17.904 "supported_io_types": { 00:14:17.904 "read": true, 00:14:17.904 "write": true, 00:14:17.904 "unmap": true, 00:14:17.904 "flush": true, 00:14:17.904 "reset": true, 00:14:17.904 "nvme_admin": false, 00:14:17.904 "nvme_io": false, 00:14:17.904 "nvme_io_md": false, 00:14:17.904 "write_zeroes": true, 00:14:17.904 "zcopy": true, 00:14:17.904 "get_zone_info": false, 00:14:17.904 "zone_management": false, 00:14:17.904 "zone_append": false, 00:14:17.904 "compare": false, 00:14:17.904 "compare_and_write": false, 00:14:17.904 "abort": true, 
00:14:17.904 "seek_hole": false, 00:14:17.904 "seek_data": false, 00:14:17.904 "copy": true, 00:14:17.904 "nvme_iov_md": false 00:14:17.904 }, 00:14:17.904 "memory_domains": [ 00:14:17.904 { 00:14:17.904 "dma_device_id": "system", 00:14:17.904 "dma_device_type": 1 00:14:17.904 }, 00:14:17.904 { 00:14:17.904 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.904 "dma_device_type": 2 00:14:17.904 } 00:14:17.904 ], 00:14:17.904 "driver_specific": { 00:14:17.904 "passthru": { 00:14:17.904 "name": "pt2", 00:14:17.904 "base_bdev_name": "malloc2" 00:14:17.904 } 00:14:17.904 } 00:14:17.904 }' 00:14:17.904 17:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:17.904 17:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:17.904 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:17.904 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:17.904 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:17.904 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:17.904 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:17.904 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:18.164 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:18.164 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:18.164 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:18.164 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:18.164 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:18.164 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:18.164 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:18.424 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:18.424 "name": "pt3", 00:14:18.424 "aliases": [ 00:14:18.424 "00000000-0000-0000-0000-000000000003" 00:14:18.424 ], 00:14:18.424 "product_name": "passthru", 00:14:18.424 "block_size": 512, 00:14:18.424 "num_blocks": 65536, 00:14:18.424 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:18.424 "assigned_rate_limits": { 00:14:18.424 "rw_ios_per_sec": 0, 00:14:18.424 "rw_mbytes_per_sec": 0, 00:14:18.424 "r_mbytes_per_sec": 0, 00:14:18.424 "w_mbytes_per_sec": 0 00:14:18.424 }, 00:14:18.424 "claimed": true, 00:14:18.424 "claim_type": "exclusive_write", 00:14:18.424 "zoned": false, 00:14:18.424 "supported_io_types": { 00:14:18.424 "read": true, 00:14:18.424 "write": true, 00:14:18.424 "unmap": true, 00:14:18.424 "flush": true, 00:14:18.424 "reset": true, 00:14:18.424 "nvme_admin": false, 00:14:18.424 "nvme_io": false, 00:14:18.424 "nvme_io_md": false, 00:14:18.424 "write_zeroes": true, 00:14:18.424 "zcopy": true, 00:14:18.424 "get_zone_info": false, 00:14:18.424 "zone_management": false, 00:14:18.424 "zone_append": false, 00:14:18.424 "compare": false, 00:14:18.424 "compare_and_write": false, 00:14:18.424 "abort": true, 00:14:18.424 "seek_hole": false, 00:14:18.424 "seek_data": false, 00:14:18.424 "copy": true, 00:14:18.424 "nvme_iov_md": false 
00:14:18.424 }, 00:14:18.424 "memory_domains": [ 00:14:18.424 { 00:14:18.424 "dma_device_id": "system", 00:14:18.424 "dma_device_type": 1 00:14:18.424 }, 00:14:18.424 { 00:14:18.424 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:18.424 "dma_device_type": 2 00:14:18.424 } 00:14:18.424 ], 00:14:18.424 "driver_specific": { 00:14:18.424 "passthru": { 00:14:18.424 "name": "pt3", 00:14:18.424 "base_bdev_name": "malloc3" 00:14:18.424 } 00:14:18.424 } 00:14:18.424 }' 00:14:18.424 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:18.424 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:18.424 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:18.424 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:18.424 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:18.424 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:18.424 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:18.684 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:18.684 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:18.684 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:18.684 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:18.684 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:18.684 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:18.684 17:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:14:18.944 [2024-07-15 17:26:30.022134] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:18.944 17:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=8363a717-86a8-4e2a-b89f-e08c4265269a 00:14:18.944 17:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 8363a717-86a8-4e2a-b89f-e08c4265269a ']' 00:14:18.944 17:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:18.944 [2024-07-15 17:26:30.214402] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:18.944 [2024-07-15 17:26:30.214416] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:18.944 [2024-07-15 17:26:30.214449] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:18.944 [2024-07-15 17:26:30.214487] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:18.944 [2024-07-15 17:26:30.214493] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e5ba90 name raid_bdev1, state offline 00:14:18.944 17:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:18.944 17:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:14:19.203 17:26:30 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:14:19.203 17:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:14:19.203 17:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:19.203 17:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:19.463 17:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:19.463 17:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:19.722 17:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:19.722 17:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:19.722 17:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:14:19.722 17:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:14:19.981 17:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:14:19.981 17:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:19.981 17:26:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:14:19.981 17:26:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:19.981 17:26:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:19.981 17:26:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:19.981 17:26:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:19.981 17:26:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:19.981 17:26:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:19.981 17:26:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:19.981 17:26:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:19.981 17:26:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:14:19.981 17:26:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 
'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:20.239 [2024-07-15 17:26:31.373289] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:14:20.239 [2024-07-15 17:26:31.374350] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:14:20.239 [2024-07-15 17:26:31.374384] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:14:20.240 [2024-07-15 17:26:31.374418] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:14:20.240 [2024-07-15 17:26:31.374444] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:14:20.240 [2024-07-15 17:26:31.374458] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:14:20.240 [2024-07-15 17:26:31.374468] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:20.240 [2024-07-15 17:26:31.374473] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e57bf0 name raid_bdev1, state configuring 00:14:20.240 request: 00:14:20.240 { 00:14:20.240 "name": "raid_bdev1", 00:14:20.240 "raid_level": "concat", 00:14:20.240 "base_bdevs": [ 00:14:20.240 "malloc1", 00:14:20.240 "malloc2", 00:14:20.240 "malloc3" 00:14:20.240 ], 00:14:20.240 "strip_size_kb": 64, 00:14:20.240 "superblock": false, 00:14:20.240 "method": "bdev_raid_create", 00:14:20.240 "req_id": 1 00:14:20.240 } 00:14:20.240 Got JSON-RPC error response 00:14:20.240 response: 00:14:20.240 { 00:14:20.240 "code": -17, 00:14:20.240 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:14:20.240 } 00:14:20.240 17:26:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:14:20.240 17:26:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:14:20.240 17:26:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:14:20.240 17:26:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:14:20.240 17:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:20.240 17:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:14:20.499 17:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:14:20.499 17:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:14:20.499 17:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:20.499 [2024-07-15 17:26:31.758208] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:20.499 [2024-07-15 17:26:31.758228] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:20.499 [2024-07-15 17:26:31.758238] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cafe00 00:14:20.499 [2024-07-15 17:26:31.758244] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:20.499 [2024-07-15 17:26:31.759486] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:20.499 [2024-07-15 17:26:31.759505] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:20.499 [2024-07-15 17:26:31.759547] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:20.499 [2024-07-15 17:26:31.759565] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:20.499 pt1 00:14:20.499 17:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:14:20.499 17:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:20.499 17:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:20.499 17:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:20.499 17:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:20.499 17:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:20.499 17:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:20.499 17:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:20.499 17:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:20.499 17:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:20.499 17:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:20.499 17:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:20.759 17:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:20.759 "name": "raid_bdev1", 00:14:20.759 "uuid": "8363a717-86a8-4e2a-b89f-e08c4265269a", 00:14:20.759 "strip_size_kb": 64, 00:14:20.759 "state": "configuring", 00:14:20.759 "raid_level": "concat", 00:14:20.759 "superblock": true, 00:14:20.759 "num_base_bdevs": 3, 00:14:20.759 "num_base_bdevs_discovered": 1, 00:14:20.759 "num_base_bdevs_operational": 3, 00:14:20.759 "base_bdevs_list": [ 00:14:20.759 { 00:14:20.759 "name": "pt1", 00:14:20.759 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:20.759 "is_configured": true, 00:14:20.759 "data_offset": 2048, 00:14:20.759 "data_size": 63488 00:14:20.759 }, 00:14:20.759 { 00:14:20.759 "name": null, 00:14:20.759 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:20.759 "is_configured": false, 00:14:20.759 "data_offset": 2048, 00:14:20.759 "data_size": 63488 00:14:20.759 }, 00:14:20.759 { 00:14:20.759 "name": null, 00:14:20.759 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:20.759 "is_configured": false, 00:14:20.759 "data_offset": 2048, 00:14:20.759 "data_size": 63488 00:14:20.759 } 00:14:20.759 ] 00:14:20.759 }' 00:14:20.759 17:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:20.759 17:26:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:21.327 17:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:14:21.327 17:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:21.588 [2024-07-15 
17:26:32.676537] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:21.588 [2024-07-15 17:26:32.676569] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:21.588 [2024-07-15 17:26:32.676579] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e5df60 00:14:21.588 [2024-07-15 17:26:32.676586] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:21.588 [2024-07-15 17:26:32.676851] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:21.588 [2024-07-15 17:26:32.676861] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:21.588 [2024-07-15 17:26:32.676902] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:21.588 [2024-07-15 17:26:32.676914] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:21.588 pt2 00:14:21.588 17:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:21.588 [2024-07-15 17:26:32.873039] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:14:21.849 17:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:14:21.849 17:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:21.849 17:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:21.849 17:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:21.849 17:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:21.849 17:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:21.849 17:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:21.849 17:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:21.849 17:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:21.849 17:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:21.849 17:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:21.849 17:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:21.849 17:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:21.849 "name": "raid_bdev1", 00:14:21.849 "uuid": "8363a717-86a8-4e2a-b89f-e08c4265269a", 00:14:21.849 "strip_size_kb": 64, 00:14:21.849 "state": "configuring", 00:14:21.849 "raid_level": "concat", 00:14:21.849 "superblock": true, 00:14:21.849 "num_base_bdevs": 3, 00:14:21.849 "num_base_bdevs_discovered": 1, 00:14:21.849 "num_base_bdevs_operational": 3, 00:14:21.849 "base_bdevs_list": [ 00:14:21.849 { 00:14:21.849 "name": "pt1", 00:14:21.849 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:21.849 "is_configured": true, 00:14:21.849 "data_offset": 2048, 00:14:21.849 "data_size": 63488 00:14:21.849 }, 00:14:21.849 { 00:14:21.849 "name": null, 00:14:21.849 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:21.849 "is_configured": false, 
00:14:21.849 "data_offset": 2048, 00:14:21.849 "data_size": 63488 00:14:21.849 }, 00:14:21.849 { 00:14:21.849 "name": null, 00:14:21.849 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:21.849 "is_configured": false, 00:14:21.849 "data_offset": 2048, 00:14:21.849 "data_size": 63488 00:14:21.849 } 00:14:21.849 ] 00:14:21.849 }' 00:14:21.849 17:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:21.849 17:26:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:22.425 17:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:14:22.425 17:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:22.425 17:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:22.684 [2024-07-15 17:26:33.819427] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:22.684 [2024-07-15 17:26:33.819455] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:22.684 [2024-07-15 17:26:33.819465] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e58aa0 00:14:22.684 [2024-07-15 17:26:33.819471] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:22.684 [2024-07-15 17:26:33.819732] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:22.684 [2024-07-15 17:26:33.819743] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:22.684 [2024-07-15 17:26:33.819783] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:22.684 [2024-07-15 17:26:33.819795] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:22.684 pt2 00:14:22.684 17:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:22.684 17:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:22.684 17:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:22.944 [2024-07-15 17:26:34.015926] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:22.944 [2024-07-15 17:26:34.015945] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:22.944 [2024-07-15 17:26:34.015953] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e60410 00:14:22.944 [2024-07-15 17:26:34.015958] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:22.944 [2024-07-15 17:26:34.016171] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:22.944 [2024-07-15 17:26:34.016181] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:22.944 [2024-07-15 17:26:34.016218] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:14:22.944 [2024-07-15 17:26:34.016228] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:22.944 [2024-07-15 17:26:34.016309] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e58dc0 00:14:22.944 [2024-07-15 17:26:34.016314] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:22.944 [2024-07-15 17:26:34.016451] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e60230 00:14:22.944 [2024-07-15 17:26:34.016546] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e58dc0 00:14:22.944 [2024-07-15 17:26:34.016551] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e58dc0 00:14:22.944 [2024-07-15 17:26:34.016622] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:22.944 pt3 00:14:22.944 17:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:22.944 17:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:22.944 17:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:14:22.944 17:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:22.944 17:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:22.944 17:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:22.944 17:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:22.944 17:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:22.944 17:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:22.944 17:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:22.944 17:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:22.944 17:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:22.944 17:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:22.944 17:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:22.944 17:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:22.944 "name": "raid_bdev1", 00:14:22.944 "uuid": "8363a717-86a8-4e2a-b89f-e08c4265269a", 00:14:22.944 "strip_size_kb": 64, 00:14:22.944 "state": "online", 00:14:22.944 "raid_level": "concat", 00:14:22.944 "superblock": true, 00:14:22.944 "num_base_bdevs": 3, 00:14:22.944 "num_base_bdevs_discovered": 3, 00:14:22.944 "num_base_bdevs_operational": 3, 00:14:22.944 "base_bdevs_list": [ 00:14:22.944 { 00:14:22.944 "name": "pt1", 00:14:22.944 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:22.944 "is_configured": true, 00:14:22.944 "data_offset": 2048, 00:14:22.944 "data_size": 63488 00:14:22.944 }, 00:14:22.944 { 00:14:22.944 "name": "pt2", 00:14:22.944 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:22.944 "is_configured": true, 00:14:22.945 "data_offset": 2048, 00:14:22.945 "data_size": 63488 00:14:22.945 }, 00:14:22.945 { 00:14:22.945 "name": "pt3", 00:14:22.945 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:22.945 "is_configured": true, 00:14:22.945 "data_offset": 2048, 00:14:22.945 "data_size": 63488 00:14:22.945 } 00:14:22.945 ] 00:14:22.945 }' 00:14:22.945 17:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:22.945 17:26:34 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:23.556 17:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:14:23.556 17:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:23.556 17:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:23.556 17:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:23.556 17:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:23.556 17:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:23.556 17:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:23.556 17:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:23.816 [2024-07-15 17:26:34.978689] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:23.816 17:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:23.816 "name": "raid_bdev1", 00:14:23.816 "aliases": [ 00:14:23.816 "8363a717-86a8-4e2a-b89f-e08c4265269a" 00:14:23.816 ], 00:14:23.816 "product_name": "Raid Volume", 00:14:23.816 "block_size": 512, 00:14:23.816 "num_blocks": 190464, 00:14:23.816 "uuid": "8363a717-86a8-4e2a-b89f-e08c4265269a", 00:14:23.816 "assigned_rate_limits": { 00:14:23.816 "rw_ios_per_sec": 0, 00:14:23.816 "rw_mbytes_per_sec": 0, 00:14:23.816 "r_mbytes_per_sec": 0, 00:14:23.816 "w_mbytes_per_sec": 0 00:14:23.816 }, 00:14:23.816 "claimed": false, 00:14:23.816 "zoned": false, 00:14:23.816 "supported_io_types": { 00:14:23.816 "read": true, 00:14:23.816 "write": true, 00:14:23.816 "unmap": true, 00:14:23.816 "flush": true, 00:14:23.816 "reset": true, 00:14:23.816 "nvme_admin": false, 00:14:23.816 "nvme_io": false, 00:14:23.816 "nvme_io_md": false, 00:14:23.816 "write_zeroes": true, 00:14:23.816 "zcopy": false, 00:14:23.816 "get_zone_info": false, 00:14:23.816 "zone_management": false, 00:14:23.816 "zone_append": false, 00:14:23.816 "compare": false, 00:14:23.816 "compare_and_write": false, 00:14:23.816 "abort": false, 00:14:23.816 "seek_hole": false, 00:14:23.816 "seek_data": false, 00:14:23.816 "copy": false, 00:14:23.816 "nvme_iov_md": false 00:14:23.816 }, 00:14:23.816 "memory_domains": [ 00:14:23.816 { 00:14:23.816 "dma_device_id": "system", 00:14:23.816 "dma_device_type": 1 00:14:23.816 }, 00:14:23.816 { 00:14:23.816 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:23.816 "dma_device_type": 2 00:14:23.816 }, 00:14:23.816 { 00:14:23.816 "dma_device_id": "system", 00:14:23.816 "dma_device_type": 1 00:14:23.816 }, 00:14:23.816 { 00:14:23.816 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:23.816 "dma_device_type": 2 00:14:23.816 }, 00:14:23.816 { 00:14:23.816 "dma_device_id": "system", 00:14:23.816 "dma_device_type": 1 00:14:23.816 }, 00:14:23.816 { 00:14:23.816 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:23.816 "dma_device_type": 2 00:14:23.816 } 00:14:23.816 ], 00:14:23.816 "driver_specific": { 00:14:23.816 "raid": { 00:14:23.816 "uuid": "8363a717-86a8-4e2a-b89f-e08c4265269a", 00:14:23.816 "strip_size_kb": 64, 00:14:23.816 "state": "online", 00:14:23.816 "raid_level": "concat", 00:14:23.817 "superblock": true, 00:14:23.817 "num_base_bdevs": 3, 00:14:23.817 "num_base_bdevs_discovered": 3, 
00:14:23.817 "num_base_bdevs_operational": 3, 00:14:23.817 "base_bdevs_list": [ 00:14:23.817 { 00:14:23.817 "name": "pt1", 00:14:23.817 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:23.817 "is_configured": true, 00:14:23.817 "data_offset": 2048, 00:14:23.817 "data_size": 63488 00:14:23.817 }, 00:14:23.817 { 00:14:23.817 "name": "pt2", 00:14:23.817 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:23.817 "is_configured": true, 00:14:23.817 "data_offset": 2048, 00:14:23.817 "data_size": 63488 00:14:23.817 }, 00:14:23.817 { 00:14:23.817 "name": "pt3", 00:14:23.817 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:23.817 "is_configured": true, 00:14:23.817 "data_offset": 2048, 00:14:23.817 "data_size": 63488 00:14:23.817 } 00:14:23.817 ] 00:14:23.817 } 00:14:23.817 } 00:14:23.817 }' 00:14:23.817 17:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:23.817 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:23.817 pt2 00:14:23.817 pt3' 00:14:23.817 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:23.817 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:23.817 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:24.076 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:24.076 "name": "pt1", 00:14:24.076 "aliases": [ 00:14:24.076 "00000000-0000-0000-0000-000000000001" 00:14:24.076 ], 00:14:24.076 "product_name": "passthru", 00:14:24.076 "block_size": 512, 00:14:24.076 "num_blocks": 65536, 00:14:24.076 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:24.076 "assigned_rate_limits": { 00:14:24.076 "rw_ios_per_sec": 0, 00:14:24.076 "rw_mbytes_per_sec": 0, 00:14:24.076 "r_mbytes_per_sec": 0, 00:14:24.076 "w_mbytes_per_sec": 0 00:14:24.076 }, 00:14:24.076 "claimed": true, 00:14:24.076 "claim_type": "exclusive_write", 00:14:24.076 "zoned": false, 00:14:24.076 "supported_io_types": { 00:14:24.076 "read": true, 00:14:24.076 "write": true, 00:14:24.076 "unmap": true, 00:14:24.076 "flush": true, 00:14:24.076 "reset": true, 00:14:24.076 "nvme_admin": false, 00:14:24.076 "nvme_io": false, 00:14:24.076 "nvme_io_md": false, 00:14:24.076 "write_zeroes": true, 00:14:24.076 "zcopy": true, 00:14:24.076 "get_zone_info": false, 00:14:24.076 "zone_management": false, 00:14:24.076 "zone_append": false, 00:14:24.076 "compare": false, 00:14:24.076 "compare_and_write": false, 00:14:24.076 "abort": true, 00:14:24.076 "seek_hole": false, 00:14:24.076 "seek_data": false, 00:14:24.076 "copy": true, 00:14:24.076 "nvme_iov_md": false 00:14:24.076 }, 00:14:24.077 "memory_domains": [ 00:14:24.077 { 00:14:24.077 "dma_device_id": "system", 00:14:24.077 "dma_device_type": 1 00:14:24.077 }, 00:14:24.077 { 00:14:24.077 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:24.077 "dma_device_type": 2 00:14:24.077 } 00:14:24.077 ], 00:14:24.077 "driver_specific": { 00:14:24.077 "passthru": { 00:14:24.077 "name": "pt1", 00:14:24.077 "base_bdev_name": "malloc1" 00:14:24.077 } 00:14:24.077 } 00:14:24.077 }' 00:14:24.077 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:24.077 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:14:24.077 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:24.077 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:24.077 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:24.336 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:24.336 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:24.336 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:24.336 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:24.336 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:24.336 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:24.336 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:24.336 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:24.336 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:24.336 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:24.596 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:24.596 "name": "pt2", 00:14:24.596 "aliases": [ 00:14:24.596 "00000000-0000-0000-0000-000000000002" 00:14:24.596 ], 00:14:24.596 "product_name": "passthru", 00:14:24.596 "block_size": 512, 00:14:24.596 "num_blocks": 65536, 00:14:24.596 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:24.596 "assigned_rate_limits": { 00:14:24.596 "rw_ios_per_sec": 0, 00:14:24.596 "rw_mbytes_per_sec": 0, 00:14:24.596 "r_mbytes_per_sec": 0, 00:14:24.596 "w_mbytes_per_sec": 0 00:14:24.596 }, 00:14:24.596 "claimed": true, 00:14:24.596 "claim_type": "exclusive_write", 00:14:24.596 "zoned": false, 00:14:24.596 "supported_io_types": { 00:14:24.596 "read": true, 00:14:24.596 "write": true, 00:14:24.596 "unmap": true, 00:14:24.596 "flush": true, 00:14:24.596 "reset": true, 00:14:24.596 "nvme_admin": false, 00:14:24.596 "nvme_io": false, 00:14:24.596 "nvme_io_md": false, 00:14:24.596 "write_zeroes": true, 00:14:24.596 "zcopy": true, 00:14:24.596 "get_zone_info": false, 00:14:24.596 "zone_management": false, 00:14:24.596 "zone_append": false, 00:14:24.596 "compare": false, 00:14:24.596 "compare_and_write": false, 00:14:24.596 "abort": true, 00:14:24.596 "seek_hole": false, 00:14:24.596 "seek_data": false, 00:14:24.596 "copy": true, 00:14:24.596 "nvme_iov_md": false 00:14:24.596 }, 00:14:24.596 "memory_domains": [ 00:14:24.596 { 00:14:24.596 "dma_device_id": "system", 00:14:24.596 "dma_device_type": 1 00:14:24.596 }, 00:14:24.596 { 00:14:24.596 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:24.596 "dma_device_type": 2 00:14:24.596 } 00:14:24.596 ], 00:14:24.596 "driver_specific": { 00:14:24.596 "passthru": { 00:14:24.596 "name": "pt2", 00:14:24.596 "base_bdev_name": "malloc2" 00:14:24.596 } 00:14:24.596 } 00:14:24.596 }' 00:14:24.596 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:24.596 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:24.596 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:24.596 17:26:35 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:24.596 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:24.857 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:24.857 17:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:24.857 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:24.857 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:24.857 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:24.857 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:24.857 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:24.857 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:24.857 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:24.857 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:25.117 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:25.117 "name": "pt3", 00:14:25.117 "aliases": [ 00:14:25.117 "00000000-0000-0000-0000-000000000003" 00:14:25.117 ], 00:14:25.117 "product_name": "passthru", 00:14:25.117 "block_size": 512, 00:14:25.117 "num_blocks": 65536, 00:14:25.117 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:25.117 "assigned_rate_limits": { 00:14:25.117 "rw_ios_per_sec": 0, 00:14:25.117 "rw_mbytes_per_sec": 0, 00:14:25.117 "r_mbytes_per_sec": 0, 00:14:25.117 "w_mbytes_per_sec": 0 00:14:25.117 }, 00:14:25.117 "claimed": true, 00:14:25.117 "claim_type": "exclusive_write", 00:14:25.117 "zoned": false, 00:14:25.117 "supported_io_types": { 00:14:25.117 "read": true, 00:14:25.117 "write": true, 00:14:25.117 "unmap": true, 00:14:25.117 "flush": true, 00:14:25.117 "reset": true, 00:14:25.117 "nvme_admin": false, 00:14:25.117 "nvme_io": false, 00:14:25.117 "nvme_io_md": false, 00:14:25.117 "write_zeroes": true, 00:14:25.117 "zcopy": true, 00:14:25.117 "get_zone_info": false, 00:14:25.117 "zone_management": false, 00:14:25.117 "zone_append": false, 00:14:25.117 "compare": false, 00:14:25.117 "compare_and_write": false, 00:14:25.117 "abort": true, 00:14:25.117 "seek_hole": false, 00:14:25.117 "seek_data": false, 00:14:25.117 "copy": true, 00:14:25.117 "nvme_iov_md": false 00:14:25.117 }, 00:14:25.117 "memory_domains": [ 00:14:25.117 { 00:14:25.117 "dma_device_id": "system", 00:14:25.117 "dma_device_type": 1 00:14:25.117 }, 00:14:25.117 { 00:14:25.117 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:25.117 "dma_device_type": 2 00:14:25.117 } 00:14:25.117 ], 00:14:25.117 "driver_specific": { 00:14:25.117 "passthru": { 00:14:25.117 "name": "pt3", 00:14:25.117 "base_bdev_name": "malloc3" 00:14:25.117 } 00:14:25.117 } 00:14:25.117 }' 00:14:25.117 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:25.117 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:25.117 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:25.377 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:25.377 17:26:36 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:25.377 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:25.377 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:25.377 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:25.377 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:25.377 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:25.377 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:25.637 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:25.637 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:25.637 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:14:25.637 [2024-07-15 17:26:36.863455] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:25.637 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 8363a717-86a8-4e2a-b89f-e08c4265269a '!=' 8363a717-86a8-4e2a-b89f-e08c4265269a ']' 00:14:25.637 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:14:25.637 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:25.637 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:25.637 17:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2786210 00:14:25.637 17:26:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2786210 ']' 00:14:25.638 17:26:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2786210 00:14:25.638 17:26:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:14:25.638 17:26:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:25.638 17:26:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2786210 00:14:25.638 17:26:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:25.638 17:26:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:25.638 17:26:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2786210' 00:14:25.638 killing process with pid 2786210 00:14:25.638 17:26:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2786210 00:14:25.638 [2024-07-15 17:26:36.933149] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:25.638 [2024-07-15 17:26:36.933186] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:25.638 [2024-07-15 17:26:36.933222] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:25.638 [2024-07-15 17:26:36.933227] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e58dc0 name raid_bdev1, state offline 00:14:25.638 17:26:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2786210 00:14:25.898 [2024-07-15 17:26:36.948099] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:25.898 17:26:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:14:25.898 00:14:25.898 real 0m12.040s 00:14:25.898 user 0m22.126s 00:14:25.898 sys 0m1.759s 00:14:25.898 17:26:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:25.898 17:26:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:25.898 ************************************ 00:14:25.898 END TEST raid_superblock_test 00:14:25.898 ************************************ 00:14:25.898 17:26:37 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:25.898 17:26:37 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:14:25.898 17:26:37 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:25.898 17:26:37 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:25.898 17:26:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:25.898 ************************************ 00:14:25.898 START TEST raid_read_error_test 00:14:25.898 ************************************ 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 read 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:14:25.898 17:26:37 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.Ua024xoX5M 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2788561 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2788561 /var/tmp/spdk-raid.sock 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2788561 ']' 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:25.898 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:25.898 17:26:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:26.158 [2024-07-15 17:26:37.207167] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
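[editor note] With raid_superblock_test finished, raid_read_error_test starts bdevperf in RPC-server mode (-z, -r /var/tmp/spdk-raid.sock) and waits for the socket; the SPDK/DPDK initialization banner above is that process coming up. The RPC sequence the test drives next is condensed below as an illustrative sketch assembled from the calls visible later in this trace; it is not a verbatim excerpt of bdev_raid.sh.

    # Condensed sketch of the error-injection flow used by raid_io_error_test.
    # Assumes $SPDK_DIR is the SPDK checkout and bdevperf was started with
    # "-z -r $SOCK", so it sits idle until configured over RPC.
    SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
    SOCK=/var/tmp/spdk-raid.sock
    rpc() { "$SPDK_DIR/scripts/rpc.py" -s "$SOCK" "$@"; }

    for i in 1 2 3; do
        rpc bdev_malloc_create 32 512 -b "BaseBdev${i}_malloc"      # backing store
        rpc bdev_error_create "BaseBdev${i}_malloc"                 # error-injection wrapper -> EE_BaseBdev${i}_malloc
        rpc bdev_passthru_create -b "EE_BaseBdev${i}_malloc" -p "BaseBdev${i}"
    done

    # Assemble the concat volume with a superblock (-s) over the passthru bdevs.
    rpc bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s

    # Inject read failures on the first base bdev, then run I/O through bdevperf.
    rpc bdev_error_inject_error EE_BaseBdev1_malloc read failure
    "$SPDK_DIR/examples/bdev/bdevperf/bdevperf.py" -s "$SOCK" perform_tests
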
00:14:26.158 [2024-07-15 17:26:37.207209] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2788561 ] 00:14:26.158 [2024-07-15 17:26:37.292878] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:26.158 [2024-07-15 17:26:37.354949] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:26.158 [2024-07-15 17:26:37.402739] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:26.158 [2024-07-15 17:26:37.402767] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:26.740 17:26:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:26.740 17:26:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:26.740 17:26:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:26.740 17:26:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:27.001 BaseBdev1_malloc 00:14:27.001 17:26:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:27.261 true 00:14:27.261 17:26:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:27.522 [2024-07-15 17:26:38.594124] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:27.522 [2024-07-15 17:26:38.594153] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:27.522 [2024-07-15 17:26:38.594164] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2925b50 00:14:27.522 [2024-07-15 17:26:38.594170] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:27.522 [2024-07-15 17:26:38.595482] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:27.522 [2024-07-15 17:26:38.595505] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:27.522 BaseBdev1 00:14:27.522 17:26:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:27.522 17:26:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:27.522 BaseBdev2_malloc 00:14:27.522 17:26:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:27.782 true 00:14:27.782 17:26:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:28.041 [2024-07-15 17:26:39.161440] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:28.041 [2024-07-15 17:26:39.161468] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:28.041 [2024-07-15 17:26:39.161480] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2909ea0 00:14:28.041 [2024-07-15 17:26:39.161486] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:28.041 [2024-07-15 17:26:39.162675] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:28.041 [2024-07-15 17:26:39.162694] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:28.041 BaseBdev2 00:14:28.041 17:26:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:28.042 17:26:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:28.301 BaseBdev3_malloc 00:14:28.301 17:26:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:28.301 true 00:14:28.302 17:26:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:28.561 [2024-07-15 17:26:39.736841] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:28.561 [2024-07-15 17:26:39.736867] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:28.561 [2024-07-15 17:26:39.736880] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x290dfb0 00:14:28.561 [2024-07-15 17:26:39.736886] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:28.561 [2024-07-15 17:26:39.738073] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:28.561 [2024-07-15 17:26:39.738092] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:28.561 BaseBdev3 00:14:28.561 17:26:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:28.822 [2024-07-15 17:26:39.925341] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:28.822 [2024-07-15 17:26:39.926349] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:28.822 [2024-07-15 17:26:39.926402] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:28.822 [2024-07-15 17:26:39.926553] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x290f0e0 00:14:28.822 [2024-07-15 17:26:39.926560] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:28.822 [2024-07-15 17:26:39.926703] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2771250 00:14:28.822 [2024-07-15 17:26:39.926825] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x290f0e0 00:14:28.822 [2024-07-15 17:26:39.926831] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x290f0e0 00:14:28.822 [2024-07-15 17:26:39.926911] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:28.822 
17:26:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:14:28.822 17:26:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:28.822 17:26:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:28.822 17:26:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:28.822 17:26:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:28.822 17:26:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:28.822 17:26:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:28.822 17:26:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:28.822 17:26:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:28.822 17:26:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:28.822 17:26:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:28.822 17:26:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:29.083 17:26:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:29.083 "name": "raid_bdev1", 00:14:29.083 "uuid": "3cd16125-8875-4b41-85b0-d6ccbbcd2978", 00:14:29.083 "strip_size_kb": 64, 00:14:29.083 "state": "online", 00:14:29.083 "raid_level": "concat", 00:14:29.083 "superblock": true, 00:14:29.083 "num_base_bdevs": 3, 00:14:29.083 "num_base_bdevs_discovered": 3, 00:14:29.083 "num_base_bdevs_operational": 3, 00:14:29.083 "base_bdevs_list": [ 00:14:29.083 { 00:14:29.083 "name": "BaseBdev1", 00:14:29.083 "uuid": "b8433217-7333-5f3f-9225-db3c59bbb1be", 00:14:29.083 "is_configured": true, 00:14:29.083 "data_offset": 2048, 00:14:29.083 "data_size": 63488 00:14:29.083 }, 00:14:29.083 { 00:14:29.083 "name": "BaseBdev2", 00:14:29.083 "uuid": "acf6f130-08ea-576e-b145-93edb4f31548", 00:14:29.083 "is_configured": true, 00:14:29.083 "data_offset": 2048, 00:14:29.083 "data_size": 63488 00:14:29.083 }, 00:14:29.083 { 00:14:29.083 "name": "BaseBdev3", 00:14:29.083 "uuid": "204289bb-c1a7-5a01-8513-37c64d78209a", 00:14:29.083 "is_configured": true, 00:14:29.083 "data_offset": 2048, 00:14:29.083 "data_size": 63488 00:14:29.083 } 00:14:29.083 ] 00:14:29.083 }' 00:14:29.083 17:26:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:29.083 17:26:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:29.654 17:26:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:29.654 17:26:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:29.654 [2024-07-15 17:26:40.767688] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x290eda0 00:14:30.612 17:26:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:14:30.612 17:26:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local 
expected_num_base_bdevs 00:14:30.612 17:26:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:14:30.612 17:26:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:14:30.612 17:26:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:14:30.612 17:26:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:30.612 17:26:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:30.612 17:26:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:30.612 17:26:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:30.612 17:26:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:30.612 17:26:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:30.612 17:26:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:30.612 17:26:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:30.612 17:26:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:30.612 17:26:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:30.612 17:26:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:30.871 17:26:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:30.871 "name": "raid_bdev1", 00:14:30.871 "uuid": "3cd16125-8875-4b41-85b0-d6ccbbcd2978", 00:14:30.871 "strip_size_kb": 64, 00:14:30.871 "state": "online", 00:14:30.871 "raid_level": "concat", 00:14:30.871 "superblock": true, 00:14:30.871 "num_base_bdevs": 3, 00:14:30.871 "num_base_bdevs_discovered": 3, 00:14:30.871 "num_base_bdevs_operational": 3, 00:14:30.871 "base_bdevs_list": [ 00:14:30.871 { 00:14:30.871 "name": "BaseBdev1", 00:14:30.871 "uuid": "b8433217-7333-5f3f-9225-db3c59bbb1be", 00:14:30.871 "is_configured": true, 00:14:30.871 "data_offset": 2048, 00:14:30.871 "data_size": 63488 00:14:30.871 }, 00:14:30.871 { 00:14:30.871 "name": "BaseBdev2", 00:14:30.871 "uuid": "acf6f130-08ea-576e-b145-93edb4f31548", 00:14:30.871 "is_configured": true, 00:14:30.871 "data_offset": 2048, 00:14:30.871 "data_size": 63488 00:14:30.871 }, 00:14:30.871 { 00:14:30.871 "name": "BaseBdev3", 00:14:30.871 "uuid": "204289bb-c1a7-5a01-8513-37c64d78209a", 00:14:30.871 "is_configured": true, 00:14:30.871 "data_offset": 2048, 00:14:30.871 "data_size": 63488 00:14:30.871 } 00:14:30.871 ] 00:14:30.871 }' 00:14:30.871 17:26:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:30.871 17:26:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:31.441 17:26:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:31.441 [2024-07-15 17:26:42.737951] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:31.441 [2024-07-15 17:26:42.737983] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:31.701 [2024-07-15 
17:26:42.740563] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:31.701 [2024-07-15 17:26:42.740589] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:31.701 [2024-07-15 17:26:42.740612] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:31.701 [2024-07-15 17:26:42.740618] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x290f0e0 name raid_bdev1, state offline 00:14:31.701 0 00:14:31.701 17:26:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2788561 00:14:31.701 17:26:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2788561 ']' 00:14:31.701 17:26:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2788561 00:14:31.701 17:26:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:14:31.701 17:26:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:31.701 17:26:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2788561 00:14:31.701 17:26:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:31.701 17:26:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:31.701 17:26:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2788561' 00:14:31.701 killing process with pid 2788561 00:14:31.701 17:26:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2788561 00:14:31.701 [2024-07-15 17:26:42.824229] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:31.701 17:26:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2788561 00:14:31.701 [2024-07-15 17:26:42.835177] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:31.701 17:26:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.Ua024xoX5M 00:14:31.701 17:26:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:31.701 17:26:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:31.701 17:26:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.51 00:14:31.701 17:26:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:14:31.701 17:26:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:31.701 17:26:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:31.701 17:26:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.51 != \0\.\0\0 ]] 00:14:31.701 00:14:31.701 real 0m5.827s 00:14:31.701 user 0m9.256s 00:14:31.701 sys 0m0.828s 00:14:31.701 17:26:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:31.701 17:26:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:31.701 ************************************ 00:14:31.701 END TEST raid_read_error_test 00:14:31.701 ************************************ 00:14:31.962 17:26:43 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:31.962 17:26:43 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:14:31.962 17:26:43 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 
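The read-error pass above finishes by pulling the raid_bdev1 row out of the bdevperf log and asserting a non-zero failure rate. A condensed sketch of that check, with the mktemp'd log path kept symbolic as $bdevperf_log instead of the literal /raidtest path shown in the trace:

# The column awk prints is what the script stores as fail_per_s (bdev_raid.sh@843).
fail_per_s=$(grep -v Job "$bdevperf_log" | grep raid_bdev1 | awk '{print $6}')
# Read errors were injected on EE_BaseBdev1_malloc, so anything other than 0.00 passes (bdev_raid.sh@847).
[[ $fail_per_s != "0.00" ]]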
00:14:31.962 17:26:43 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:31.962 17:26:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:31.962 ************************************ 00:14:31.962 START TEST raid_write_error_test 00:14:31.962 ************************************ 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 write 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.OOcGa7s2ro 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2789603 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2789603 /var/tmp/spdk-raid.sock 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2789603 ']' 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:31.962 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:31.962 17:26:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:31.962 [2024-07-15 17:26:43.100257] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:14:31.962 [2024-07-15 17:26:43.100310] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2789603 ] 00:14:31.962 [2024-07-15 17:26:43.191665] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:32.221 [2024-07-15 17:26:43.268688] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:32.221 [2024-07-15 17:26:43.321038] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:32.221 [2024-07-15 17:26:43.321065] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:32.480 17:26:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:32.481 17:26:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:32.481 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:32.481 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:32.740 BaseBdev1_malloc 00:14:32.740 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:32.740 true 00:14:32.740 17:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:33.000 [2024-07-15 17:26:44.144289] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:33.000 [2024-07-15 17:26:44.144319] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:33.000 [2024-07-15 17:26:44.144330] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21acb50 00:14:33.000 [2024-07-15 17:26:44.144337] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:33.000 [2024-07-15 17:26:44.145641] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:33.000 [2024-07-15 
17:26:44.145660] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:33.000 BaseBdev1 00:14:33.000 17:26:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:33.000 17:26:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:33.259 BaseBdev2_malloc 00:14:33.259 17:26:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:33.259 true 00:14:33.519 17:26:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:33.519 [2024-07-15 17:26:44.731668] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:33.519 [2024-07-15 17:26:44.731695] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:33.519 [2024-07-15 17:26:44.731707] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2190ea0 00:14:33.519 [2024-07-15 17:26:44.731719] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:33.519 [2024-07-15 17:26:44.732900] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:33.519 [2024-07-15 17:26:44.732919] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:33.519 BaseBdev2 00:14:33.519 17:26:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:33.519 17:26:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:33.779 BaseBdev3_malloc 00:14:33.779 17:26:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:34.039 true 00:14:34.039 17:26:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:34.039 [2024-07-15 17:26:45.294967] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:34.039 [2024-07-15 17:26:45.294994] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:34.039 [2024-07-15 17:26:45.295008] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2194fb0 00:14:34.039 [2024-07-15 17:26:45.295014] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:34.039 [2024-07-15 17:26:45.296195] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:34.039 [2024-07-15 17:26:45.296214] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:34.039 BaseBdev3 00:14:34.039 17:26:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 
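Each base device in this write-error test is a three-layer stack: a malloc bdev, an error bdev wrapped around it, and a passthru bdev on top, after which the concat volume is assembled. A condensed sketch of the RPC sequence the trace above walks through, using the same socket and names:

rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
for n in 1 2 3; do
    rpc bdev_malloc_create 32 512 -b BaseBdev${n}_malloc            # 32 MiB backing store, 512-byte blocks
    rpc bdev_error_create BaseBdev${n}_malloc                       # exposes EE_BaseBdev${n}_malloc for fault injection
    rpc bdev_passthru_create -b EE_BaseBdev${n}_malloc -p BaseBdev${n}
done
rpc bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s   # 64k strip, with superblock

Later the test perturbs the write path with bdev_error_inject_error EE_BaseBdev1_malloc write failure while bdevperf runs against raid_bdev1.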
00:14:34.300 [2024-07-15 17:26:45.483465] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:34.300 [2024-07-15 17:26:45.484474] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:34.300 [2024-07-15 17:26:45.484526] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:34.300 [2024-07-15 17:26:45.484678] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21960e0 00:14:34.300 [2024-07-15 17:26:45.484686] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:34.300 [2024-07-15 17:26:45.484835] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ff8250 00:14:34.300 [2024-07-15 17:26:45.484949] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21960e0 00:14:34.300 [2024-07-15 17:26:45.484955] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21960e0 00:14:34.300 [2024-07-15 17:26:45.485029] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:34.300 17:26:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:14:34.300 17:26:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:34.300 17:26:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:34.300 17:26:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:34.300 17:26:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:34.300 17:26:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:34.300 17:26:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:34.300 17:26:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:34.300 17:26:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:34.300 17:26:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:34.300 17:26:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:34.300 17:26:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:34.561 17:26:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:34.561 "name": "raid_bdev1", 00:14:34.561 "uuid": "e381ff6a-1457-4891-97de-7ffd3c1efaf2", 00:14:34.561 "strip_size_kb": 64, 00:14:34.561 "state": "online", 00:14:34.561 "raid_level": "concat", 00:14:34.561 "superblock": true, 00:14:34.561 "num_base_bdevs": 3, 00:14:34.561 "num_base_bdevs_discovered": 3, 00:14:34.561 "num_base_bdevs_operational": 3, 00:14:34.561 "base_bdevs_list": [ 00:14:34.561 { 00:14:34.561 "name": "BaseBdev1", 00:14:34.561 "uuid": "690e565d-7e0d-55b6-adf3-2a2e7613f3fc", 00:14:34.561 "is_configured": true, 00:14:34.561 "data_offset": 2048, 00:14:34.561 "data_size": 63488 00:14:34.561 }, 00:14:34.561 { 00:14:34.561 "name": "BaseBdev2", 00:14:34.561 "uuid": "6570fb0b-e4e6-5bae-b7cf-cd2ec172f6b3", 00:14:34.561 "is_configured": true, 00:14:34.561 "data_offset": 2048, 00:14:34.561 "data_size": 63488 00:14:34.561 }, 00:14:34.561 { 00:14:34.561 
"name": "BaseBdev3", 00:14:34.561 "uuid": "3ac5e966-7b8d-5def-96e9-d62329bfc1ea", 00:14:34.561 "is_configured": true, 00:14:34.561 "data_offset": 2048, 00:14:34.561 "data_size": 63488 00:14:34.561 } 00:14:34.561 ] 00:14:34.561 }' 00:14:34.561 17:26:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:34.561 17:26:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:35.131 17:26:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:35.131 17:26:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:35.131 [2024-07-15 17:26:46.301746] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2195da0 00:14:36.069 17:26:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:14:36.329 17:26:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:36.329 17:26:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:14:36.329 17:26:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:14:36.329 17:26:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:14:36.329 17:26:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:36.329 17:26:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:36.329 17:26:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:36.329 17:26:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:36.329 17:26:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:36.329 17:26:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:36.329 17:26:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:36.329 17:26:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:36.329 17:26:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:36.330 17:26:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:36.330 17:26:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:36.330 17:26:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:36.330 "name": "raid_bdev1", 00:14:36.330 "uuid": "e381ff6a-1457-4891-97de-7ffd3c1efaf2", 00:14:36.330 "strip_size_kb": 64, 00:14:36.330 "state": "online", 00:14:36.330 "raid_level": "concat", 00:14:36.330 "superblock": true, 00:14:36.330 "num_base_bdevs": 3, 00:14:36.330 "num_base_bdevs_discovered": 3, 00:14:36.330 "num_base_bdevs_operational": 3, 00:14:36.330 "base_bdevs_list": [ 00:14:36.330 { 00:14:36.330 "name": "BaseBdev1", 00:14:36.330 "uuid": "690e565d-7e0d-55b6-adf3-2a2e7613f3fc", 00:14:36.330 "is_configured": true, 00:14:36.330 "data_offset": 2048, 00:14:36.330 "data_size": 
63488 00:14:36.330 }, 00:14:36.330 { 00:14:36.330 "name": "BaseBdev2", 00:14:36.330 "uuid": "6570fb0b-e4e6-5bae-b7cf-cd2ec172f6b3", 00:14:36.330 "is_configured": true, 00:14:36.330 "data_offset": 2048, 00:14:36.330 "data_size": 63488 00:14:36.330 }, 00:14:36.330 { 00:14:36.330 "name": "BaseBdev3", 00:14:36.330 "uuid": "3ac5e966-7b8d-5def-96e9-d62329bfc1ea", 00:14:36.330 "is_configured": true, 00:14:36.330 "data_offset": 2048, 00:14:36.330 "data_size": 63488 00:14:36.330 } 00:14:36.330 ] 00:14:36.330 }' 00:14:36.330 17:26:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:36.330 17:26:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:36.897 17:26:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:37.157 [2024-07-15 17:26:48.244310] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:37.157 [2024-07-15 17:26:48.244342] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:37.157 [2024-07-15 17:26:48.246979] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:37.157 [2024-07-15 17:26:48.247006] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:37.157 [2024-07-15 17:26:48.247030] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:37.157 [2024-07-15 17:26:48.247036] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21960e0 name raid_bdev1, state offline 00:14:37.157 0 00:14:37.157 17:26:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2789603 00:14:37.157 17:26:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2789603 ']' 00:14:37.157 17:26:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2789603 00:14:37.157 17:26:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:14:37.157 17:26:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:37.157 17:26:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2789603 00:14:37.157 17:26:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:37.157 17:26:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:37.157 17:26:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2789603' 00:14:37.157 killing process with pid 2789603 00:14:37.157 17:26:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2789603 00:14:37.157 [2024-07-15 17:26:48.333172] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:37.157 17:26:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2789603 00:14:37.157 [2024-07-15 17:26:48.344286] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:37.417 17:26:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.OOcGa7s2ro 00:14:37.417 17:26:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:37.417 17:26:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:37.417 17:26:48 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:14:37.417 17:26:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:14:37.417 17:26:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:37.417 17:26:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:37.418 17:26:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:14:37.418 00:14:37.418 real 0m5.446s 00:14:37.418 user 0m9.009s 00:14:37.418 sys 0m0.795s 00:14:37.418 17:26:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:37.418 17:26:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:37.418 ************************************ 00:14:37.418 END TEST raid_write_error_test 00:14:37.418 ************************************ 00:14:37.418 17:26:48 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:37.418 17:26:48 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:14:37.418 17:26:48 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:14:37.418 17:26:48 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:37.418 17:26:48 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:37.418 17:26:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:37.418 ************************************ 00:14:37.418 START TEST raid_state_function_test 00:14:37.418 ************************************ 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 false 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2790621 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2790621' 00:14:37.418 Process raid pid: 2790621 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2790621 /var/tmp/spdk-raid.sock 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2790621 ']' 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:37.418 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:37.418 17:26:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:37.418 [2024-07-15 17:26:48.625551] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
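Unlike the error tests, which drive bdevperf, the state-function test only needs an RPC target, so it launches the bare bdev_svc app and then declares a raid1 volume against base bdevs that do not exist yet. A minimal sketch of that setup, with the repository path shortened to $spdk and the waitforlisten readiness wait elided:

spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
$spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
raid_pid=$!
# With the socket listening, Existed_Raid can be created before any base bdev exists;
# it sits in the "configuring" state until BaseBdev1..3 appear and are claimed.
$spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid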
00:14:37.418 [2024-07-15 17:26:48.625604] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:37.677 [2024-07-15 17:26:48.718895] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:37.677 [2024-07-15 17:26:48.785982] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:37.677 [2024-07-15 17:26:48.833323] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:37.677 [2024-07-15 17:26:48.833345] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:38.245 17:26:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:38.245 17:26:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:14:38.245 17:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:38.504 [2024-07-15 17:26:49.568462] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:38.504 [2024-07-15 17:26:49.568495] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:38.504 [2024-07-15 17:26:49.568501] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:38.504 [2024-07-15 17:26:49.568507] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:38.504 [2024-07-15 17:26:49.568512] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:38.505 [2024-07-15 17:26:49.568518] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:38.505 17:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:38.505 17:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:38.505 17:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:38.505 17:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:38.505 17:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:38.505 17:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:38.505 17:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:38.505 17:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:38.505 17:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:38.505 17:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:38.505 17:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:38.505 17:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:38.505 17:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:14:38.505 "name": "Existed_Raid", 00:14:38.505 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:38.505 "strip_size_kb": 0, 00:14:38.505 "state": "configuring", 00:14:38.505 "raid_level": "raid1", 00:14:38.505 "superblock": false, 00:14:38.505 "num_base_bdevs": 3, 00:14:38.505 "num_base_bdevs_discovered": 0, 00:14:38.505 "num_base_bdevs_operational": 3, 00:14:38.505 "base_bdevs_list": [ 00:14:38.505 { 00:14:38.505 "name": "BaseBdev1", 00:14:38.505 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:38.505 "is_configured": false, 00:14:38.505 "data_offset": 0, 00:14:38.505 "data_size": 0 00:14:38.505 }, 00:14:38.505 { 00:14:38.505 "name": "BaseBdev2", 00:14:38.505 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:38.505 "is_configured": false, 00:14:38.505 "data_offset": 0, 00:14:38.505 "data_size": 0 00:14:38.505 }, 00:14:38.505 { 00:14:38.505 "name": "BaseBdev3", 00:14:38.505 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:38.505 "is_configured": false, 00:14:38.505 "data_offset": 0, 00:14:38.505 "data_size": 0 00:14:38.505 } 00:14:38.505 ] 00:14:38.505 }' 00:14:38.505 17:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:38.505 17:26:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:39.074 17:26:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:39.335 [2024-07-15 17:26:50.478662] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:39.335 [2024-07-15 17:26:50.478683] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24446d0 name Existed_Raid, state configuring 00:14:39.335 17:26:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:39.595 [2024-07-15 17:26:50.671163] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:39.595 [2024-07-15 17:26:50.671183] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:39.595 [2024-07-15 17:26:50.671188] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:39.595 [2024-07-15 17:26:50.671194] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:39.595 [2024-07-15 17:26:50.671198] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:39.595 [2024-07-15 17:26:50.671204] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:39.595 17:26:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:39.595 [2024-07-15 17:26:50.874173] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:39.595 BaseBdev1 00:14:39.595 17:26:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:39.595 17:26:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:39.595 17:26:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:39.595 
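Every verify_raid_bdev_state call in these traces reduces to one RPC plus a jq filter over its output. A minimal sketch of that query for the Existed_Raid volume, on the same socket:

rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
# bdev_raid.sh@126: dump all raid bdevs and keep only the one under test.
rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'
# The helper's locals shown above (expected_state, raid_level, strip_size,
# num_base_bdevs_operational) are then compared against the matching fields of this JSON.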
17:26:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:39.595 17:26:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:39.595 17:26:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:39.595 17:26:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:39.862 17:26:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:40.123 [ 00:14:40.123 { 00:14:40.123 "name": "BaseBdev1", 00:14:40.123 "aliases": [ 00:14:40.123 "ad18d031-7754-4bef-8d27-f973058489e6" 00:14:40.123 ], 00:14:40.123 "product_name": "Malloc disk", 00:14:40.123 "block_size": 512, 00:14:40.123 "num_blocks": 65536, 00:14:40.123 "uuid": "ad18d031-7754-4bef-8d27-f973058489e6", 00:14:40.123 "assigned_rate_limits": { 00:14:40.123 "rw_ios_per_sec": 0, 00:14:40.123 "rw_mbytes_per_sec": 0, 00:14:40.123 "r_mbytes_per_sec": 0, 00:14:40.123 "w_mbytes_per_sec": 0 00:14:40.123 }, 00:14:40.123 "claimed": true, 00:14:40.123 "claim_type": "exclusive_write", 00:14:40.123 "zoned": false, 00:14:40.123 "supported_io_types": { 00:14:40.123 "read": true, 00:14:40.123 "write": true, 00:14:40.123 "unmap": true, 00:14:40.123 "flush": true, 00:14:40.123 "reset": true, 00:14:40.123 "nvme_admin": false, 00:14:40.123 "nvme_io": false, 00:14:40.123 "nvme_io_md": false, 00:14:40.123 "write_zeroes": true, 00:14:40.123 "zcopy": true, 00:14:40.123 "get_zone_info": false, 00:14:40.123 "zone_management": false, 00:14:40.123 "zone_append": false, 00:14:40.123 "compare": false, 00:14:40.123 "compare_and_write": false, 00:14:40.123 "abort": true, 00:14:40.123 "seek_hole": false, 00:14:40.123 "seek_data": false, 00:14:40.123 "copy": true, 00:14:40.123 "nvme_iov_md": false 00:14:40.123 }, 00:14:40.123 "memory_domains": [ 00:14:40.123 { 00:14:40.123 "dma_device_id": "system", 00:14:40.123 "dma_device_type": 1 00:14:40.123 }, 00:14:40.123 { 00:14:40.123 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.123 "dma_device_type": 2 00:14:40.123 } 00:14:40.123 ], 00:14:40.123 "driver_specific": {} 00:14:40.123 } 00:14:40.123 ] 00:14:40.123 17:26:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:40.123 17:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:40.123 17:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:40.123 17:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:40.123 17:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:40.123 17:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:40.123 17:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:40.123 17:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:40.123 17:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:40.123 17:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:14:40.123 17:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:40.123 17:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:40.123 17:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:40.383 17:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:40.383 "name": "Existed_Raid", 00:14:40.383 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:40.383 "strip_size_kb": 0, 00:14:40.383 "state": "configuring", 00:14:40.383 "raid_level": "raid1", 00:14:40.383 "superblock": false, 00:14:40.383 "num_base_bdevs": 3, 00:14:40.383 "num_base_bdevs_discovered": 1, 00:14:40.383 "num_base_bdevs_operational": 3, 00:14:40.383 "base_bdevs_list": [ 00:14:40.383 { 00:14:40.383 "name": "BaseBdev1", 00:14:40.383 "uuid": "ad18d031-7754-4bef-8d27-f973058489e6", 00:14:40.383 "is_configured": true, 00:14:40.383 "data_offset": 0, 00:14:40.383 "data_size": 65536 00:14:40.383 }, 00:14:40.383 { 00:14:40.383 "name": "BaseBdev2", 00:14:40.383 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:40.383 "is_configured": false, 00:14:40.383 "data_offset": 0, 00:14:40.383 "data_size": 0 00:14:40.383 }, 00:14:40.383 { 00:14:40.383 "name": "BaseBdev3", 00:14:40.383 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:40.383 "is_configured": false, 00:14:40.383 "data_offset": 0, 00:14:40.383 "data_size": 0 00:14:40.383 } 00:14:40.383 ] 00:14:40.383 }' 00:14:40.383 17:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:40.383 17:26:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:40.955 17:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:40.955 [2024-07-15 17:26:52.125333] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:40.955 [2024-07-15 17:26:52.125359] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2443fa0 name Existed_Raid, state configuring 00:14:40.955 17:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:41.252 [2024-07-15 17:26:52.285768] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:41.252 [2024-07-15 17:26:52.286913] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:41.252 [2024-07-15 17:26:52.286936] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:41.252 [2024-07-15 17:26:52.286942] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:41.252 [2024-07-15 17:26:52.286952] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:41.252 17:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:41.252 17:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:41.252 17:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:41.252 17:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:41.252 17:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:41.252 17:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:41.252 17:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:41.252 17:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:41.252 17:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:41.252 17:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:41.252 17:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:41.252 17:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:41.252 17:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:41.252 17:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:41.252 17:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:41.252 "name": "Existed_Raid", 00:14:41.252 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:41.252 "strip_size_kb": 0, 00:14:41.252 "state": "configuring", 00:14:41.252 "raid_level": "raid1", 00:14:41.252 "superblock": false, 00:14:41.252 "num_base_bdevs": 3, 00:14:41.252 "num_base_bdevs_discovered": 1, 00:14:41.252 "num_base_bdevs_operational": 3, 00:14:41.252 "base_bdevs_list": [ 00:14:41.252 { 00:14:41.252 "name": "BaseBdev1", 00:14:41.252 "uuid": "ad18d031-7754-4bef-8d27-f973058489e6", 00:14:41.252 "is_configured": true, 00:14:41.252 "data_offset": 0, 00:14:41.252 "data_size": 65536 00:14:41.252 }, 00:14:41.252 { 00:14:41.252 "name": "BaseBdev2", 00:14:41.252 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:41.252 "is_configured": false, 00:14:41.252 "data_offset": 0, 00:14:41.252 "data_size": 0 00:14:41.252 }, 00:14:41.252 { 00:14:41.252 "name": "BaseBdev3", 00:14:41.252 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:41.252 "is_configured": false, 00:14:41.252 "data_offset": 0, 00:14:41.252 "data_size": 0 00:14:41.252 } 00:14:41.252 ] 00:14:41.252 }' 00:14:41.252 17:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:41.252 17:26:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:41.822 17:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:42.092 [2024-07-15 17:26:53.237096] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:42.092 BaseBdev2 00:14:42.092 17:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:42.092 17:26:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:42.092 17:26:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:42.092 17:26:53 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:42.092 17:26:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:42.092 17:26:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:42.092 17:26:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:42.354 17:26:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:42.354 [ 00:14:42.354 { 00:14:42.354 "name": "BaseBdev2", 00:14:42.354 "aliases": [ 00:14:42.354 "5496149b-ffb3-47b0-87b5-2d536bc15059" 00:14:42.354 ], 00:14:42.354 "product_name": "Malloc disk", 00:14:42.354 "block_size": 512, 00:14:42.354 "num_blocks": 65536, 00:14:42.354 "uuid": "5496149b-ffb3-47b0-87b5-2d536bc15059", 00:14:42.354 "assigned_rate_limits": { 00:14:42.354 "rw_ios_per_sec": 0, 00:14:42.354 "rw_mbytes_per_sec": 0, 00:14:42.354 "r_mbytes_per_sec": 0, 00:14:42.354 "w_mbytes_per_sec": 0 00:14:42.354 }, 00:14:42.354 "claimed": true, 00:14:42.354 "claim_type": "exclusive_write", 00:14:42.354 "zoned": false, 00:14:42.354 "supported_io_types": { 00:14:42.354 "read": true, 00:14:42.354 "write": true, 00:14:42.354 "unmap": true, 00:14:42.354 "flush": true, 00:14:42.354 "reset": true, 00:14:42.354 "nvme_admin": false, 00:14:42.354 "nvme_io": false, 00:14:42.354 "nvme_io_md": false, 00:14:42.354 "write_zeroes": true, 00:14:42.354 "zcopy": true, 00:14:42.354 "get_zone_info": false, 00:14:42.354 "zone_management": false, 00:14:42.354 "zone_append": false, 00:14:42.354 "compare": false, 00:14:42.354 "compare_and_write": false, 00:14:42.354 "abort": true, 00:14:42.354 "seek_hole": false, 00:14:42.354 "seek_data": false, 00:14:42.354 "copy": true, 00:14:42.354 "nvme_iov_md": false 00:14:42.354 }, 00:14:42.354 "memory_domains": [ 00:14:42.354 { 00:14:42.354 "dma_device_id": "system", 00:14:42.354 "dma_device_type": 1 00:14:42.354 }, 00:14:42.354 { 00:14:42.354 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:42.354 "dma_device_type": 2 00:14:42.354 } 00:14:42.354 ], 00:14:42.354 "driver_specific": {} 00:14:42.354 } 00:14:42.354 ] 00:14:42.354 17:26:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:42.354 17:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:42.354 17:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:42.354 17:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:42.354 17:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:42.354 17:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:42.354 17:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:42.354 17:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:42.354 17:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:42.354 17:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:42.354 
17:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:42.354 17:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:42.354 17:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:42.354 17:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:42.354 17:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:42.614 17:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:42.614 "name": "Existed_Raid", 00:14:42.614 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:42.614 "strip_size_kb": 0, 00:14:42.614 "state": "configuring", 00:14:42.614 "raid_level": "raid1", 00:14:42.614 "superblock": false, 00:14:42.614 "num_base_bdevs": 3, 00:14:42.614 "num_base_bdevs_discovered": 2, 00:14:42.614 "num_base_bdevs_operational": 3, 00:14:42.614 "base_bdevs_list": [ 00:14:42.614 { 00:14:42.614 "name": "BaseBdev1", 00:14:42.614 "uuid": "ad18d031-7754-4bef-8d27-f973058489e6", 00:14:42.614 "is_configured": true, 00:14:42.614 "data_offset": 0, 00:14:42.614 "data_size": 65536 00:14:42.614 }, 00:14:42.614 { 00:14:42.614 "name": "BaseBdev2", 00:14:42.614 "uuid": "5496149b-ffb3-47b0-87b5-2d536bc15059", 00:14:42.614 "is_configured": true, 00:14:42.614 "data_offset": 0, 00:14:42.614 "data_size": 65536 00:14:42.614 }, 00:14:42.614 { 00:14:42.614 "name": "BaseBdev3", 00:14:42.614 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:42.614 "is_configured": false, 00:14:42.614 "data_offset": 0, 00:14:42.614 "data_size": 0 00:14:42.614 } 00:14:42.614 ] 00:14:42.614 }' 00:14:42.614 17:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:42.614 17:26:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:43.183 17:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:43.442 [2024-07-15 17:26:54.557262] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:43.442 [2024-07-15 17:26:54.557291] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2444e90 00:14:43.442 [2024-07-15 17:26:54.557296] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:14:43.442 [2024-07-15 17:26:54.557444] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2444b60 00:14:43.442 [2024-07-15 17:26:54.557543] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2444e90 00:14:43.442 [2024-07-15 17:26:54.557549] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2444e90 00:14:43.442 [2024-07-15 17:26:54.557670] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:43.442 BaseBdev3 00:14:43.443 17:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:43.443 17:26:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:43.443 17:26:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:43.443 17:26:54 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:43.443 17:26:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:43.443 17:26:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:43.443 17:26:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:43.703 17:26:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:43.703 [ 00:14:43.703 { 00:14:43.703 "name": "BaseBdev3", 00:14:43.703 "aliases": [ 00:14:43.703 "c90e8613-8125-40d3-a357-360f514ee93c" 00:14:43.703 ], 00:14:43.703 "product_name": "Malloc disk", 00:14:43.703 "block_size": 512, 00:14:43.703 "num_blocks": 65536, 00:14:43.703 "uuid": "c90e8613-8125-40d3-a357-360f514ee93c", 00:14:43.703 "assigned_rate_limits": { 00:14:43.703 "rw_ios_per_sec": 0, 00:14:43.703 "rw_mbytes_per_sec": 0, 00:14:43.703 "r_mbytes_per_sec": 0, 00:14:43.703 "w_mbytes_per_sec": 0 00:14:43.703 }, 00:14:43.703 "claimed": true, 00:14:43.703 "claim_type": "exclusive_write", 00:14:43.703 "zoned": false, 00:14:43.703 "supported_io_types": { 00:14:43.703 "read": true, 00:14:43.703 "write": true, 00:14:43.703 "unmap": true, 00:14:43.703 "flush": true, 00:14:43.703 "reset": true, 00:14:43.703 "nvme_admin": false, 00:14:43.703 "nvme_io": false, 00:14:43.703 "nvme_io_md": false, 00:14:43.703 "write_zeroes": true, 00:14:43.703 "zcopy": true, 00:14:43.703 "get_zone_info": false, 00:14:43.703 "zone_management": false, 00:14:43.703 "zone_append": false, 00:14:43.703 "compare": false, 00:14:43.703 "compare_and_write": false, 00:14:43.703 "abort": true, 00:14:43.703 "seek_hole": false, 00:14:43.703 "seek_data": false, 00:14:43.703 "copy": true, 00:14:43.703 "nvme_iov_md": false 00:14:43.703 }, 00:14:43.703 "memory_domains": [ 00:14:43.703 { 00:14:43.703 "dma_device_id": "system", 00:14:43.703 "dma_device_type": 1 00:14:43.703 }, 00:14:43.703 { 00:14:43.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:43.703 "dma_device_type": 2 00:14:43.703 } 00:14:43.703 ], 00:14:43.703 "driver_specific": {} 00:14:43.703 } 00:14:43.703 ] 00:14:43.703 17:26:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:43.703 17:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:43.703 17:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:43.703 17:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:14:43.703 17:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:43.703 17:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:43.703 17:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:43.703 17:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:43.703 17:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:43.703 17:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:43.703 17:26:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:43.703 17:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:43.703 17:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:43.703 17:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.703 17:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:43.965 17:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:43.965 "name": "Existed_Raid", 00:14:43.965 "uuid": "17abf702-c153-49a6-beaf-7a03fe224ce9", 00:14:43.965 "strip_size_kb": 0, 00:14:43.965 "state": "online", 00:14:43.965 "raid_level": "raid1", 00:14:43.965 "superblock": false, 00:14:43.965 "num_base_bdevs": 3, 00:14:43.965 "num_base_bdevs_discovered": 3, 00:14:43.965 "num_base_bdevs_operational": 3, 00:14:43.965 "base_bdevs_list": [ 00:14:43.965 { 00:14:43.965 "name": "BaseBdev1", 00:14:43.965 "uuid": "ad18d031-7754-4bef-8d27-f973058489e6", 00:14:43.965 "is_configured": true, 00:14:43.965 "data_offset": 0, 00:14:43.965 "data_size": 65536 00:14:43.965 }, 00:14:43.965 { 00:14:43.965 "name": "BaseBdev2", 00:14:43.965 "uuid": "5496149b-ffb3-47b0-87b5-2d536bc15059", 00:14:43.965 "is_configured": true, 00:14:43.965 "data_offset": 0, 00:14:43.965 "data_size": 65536 00:14:43.965 }, 00:14:43.965 { 00:14:43.965 "name": "BaseBdev3", 00:14:43.965 "uuid": "c90e8613-8125-40d3-a357-360f514ee93c", 00:14:43.965 "is_configured": true, 00:14:43.965 "data_offset": 0, 00:14:43.965 "data_size": 65536 00:14:43.965 } 00:14:43.965 ] 00:14:43.965 }' 00:14:43.965 17:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:43.965 17:26:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:44.537 17:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:44.537 17:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:44.537 17:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:44.537 17:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:44.537 17:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:44.537 17:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:44.537 17:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:44.537 17:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:44.798 [2024-07-15 17:26:55.876843] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:44.798 17:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:44.798 "name": "Existed_Raid", 00:14:44.798 "aliases": [ 00:14:44.798 "17abf702-c153-49a6-beaf-7a03fe224ce9" 00:14:44.798 ], 00:14:44.798 "product_name": "Raid Volume", 00:14:44.798 "block_size": 512, 00:14:44.798 "num_blocks": 65536, 00:14:44.798 "uuid": "17abf702-c153-49a6-beaf-7a03fe224ce9", 
00:14:44.798 "assigned_rate_limits": { 00:14:44.798 "rw_ios_per_sec": 0, 00:14:44.798 "rw_mbytes_per_sec": 0, 00:14:44.798 "r_mbytes_per_sec": 0, 00:14:44.798 "w_mbytes_per_sec": 0 00:14:44.798 }, 00:14:44.798 "claimed": false, 00:14:44.798 "zoned": false, 00:14:44.798 "supported_io_types": { 00:14:44.798 "read": true, 00:14:44.798 "write": true, 00:14:44.798 "unmap": false, 00:14:44.798 "flush": false, 00:14:44.798 "reset": true, 00:14:44.798 "nvme_admin": false, 00:14:44.798 "nvme_io": false, 00:14:44.798 "nvme_io_md": false, 00:14:44.798 "write_zeroes": true, 00:14:44.798 "zcopy": false, 00:14:44.798 "get_zone_info": false, 00:14:44.798 "zone_management": false, 00:14:44.798 "zone_append": false, 00:14:44.798 "compare": false, 00:14:44.798 "compare_and_write": false, 00:14:44.798 "abort": false, 00:14:44.798 "seek_hole": false, 00:14:44.798 "seek_data": false, 00:14:44.798 "copy": false, 00:14:44.798 "nvme_iov_md": false 00:14:44.798 }, 00:14:44.798 "memory_domains": [ 00:14:44.798 { 00:14:44.798 "dma_device_id": "system", 00:14:44.798 "dma_device_type": 1 00:14:44.798 }, 00:14:44.798 { 00:14:44.798 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:44.798 "dma_device_type": 2 00:14:44.798 }, 00:14:44.798 { 00:14:44.798 "dma_device_id": "system", 00:14:44.798 "dma_device_type": 1 00:14:44.798 }, 00:14:44.798 { 00:14:44.798 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:44.798 "dma_device_type": 2 00:14:44.798 }, 00:14:44.798 { 00:14:44.798 "dma_device_id": "system", 00:14:44.798 "dma_device_type": 1 00:14:44.798 }, 00:14:44.798 { 00:14:44.798 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:44.798 "dma_device_type": 2 00:14:44.798 } 00:14:44.798 ], 00:14:44.798 "driver_specific": { 00:14:44.798 "raid": { 00:14:44.798 "uuid": "17abf702-c153-49a6-beaf-7a03fe224ce9", 00:14:44.798 "strip_size_kb": 0, 00:14:44.798 "state": "online", 00:14:44.798 "raid_level": "raid1", 00:14:44.798 "superblock": false, 00:14:44.798 "num_base_bdevs": 3, 00:14:44.798 "num_base_bdevs_discovered": 3, 00:14:44.798 "num_base_bdevs_operational": 3, 00:14:44.798 "base_bdevs_list": [ 00:14:44.798 { 00:14:44.798 "name": "BaseBdev1", 00:14:44.798 "uuid": "ad18d031-7754-4bef-8d27-f973058489e6", 00:14:44.798 "is_configured": true, 00:14:44.798 "data_offset": 0, 00:14:44.798 "data_size": 65536 00:14:44.798 }, 00:14:44.798 { 00:14:44.798 "name": "BaseBdev2", 00:14:44.798 "uuid": "5496149b-ffb3-47b0-87b5-2d536bc15059", 00:14:44.798 "is_configured": true, 00:14:44.798 "data_offset": 0, 00:14:44.798 "data_size": 65536 00:14:44.798 }, 00:14:44.798 { 00:14:44.798 "name": "BaseBdev3", 00:14:44.798 "uuid": "c90e8613-8125-40d3-a357-360f514ee93c", 00:14:44.798 "is_configured": true, 00:14:44.798 "data_offset": 0, 00:14:44.798 "data_size": 65536 00:14:44.798 } 00:14:44.798 ] 00:14:44.798 } 00:14:44.798 } 00:14:44.798 }' 00:14:44.798 17:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:44.798 17:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:44.798 BaseBdev2 00:14:44.798 BaseBdev3' 00:14:44.798 17:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:44.798 17:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:44.798 17:26:55 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:45.059 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:45.059 "name": "BaseBdev1", 00:14:45.059 "aliases": [ 00:14:45.059 "ad18d031-7754-4bef-8d27-f973058489e6" 00:14:45.059 ], 00:14:45.059 "product_name": "Malloc disk", 00:14:45.059 "block_size": 512, 00:14:45.059 "num_blocks": 65536, 00:14:45.059 "uuid": "ad18d031-7754-4bef-8d27-f973058489e6", 00:14:45.059 "assigned_rate_limits": { 00:14:45.059 "rw_ios_per_sec": 0, 00:14:45.059 "rw_mbytes_per_sec": 0, 00:14:45.059 "r_mbytes_per_sec": 0, 00:14:45.059 "w_mbytes_per_sec": 0 00:14:45.059 }, 00:14:45.059 "claimed": true, 00:14:45.059 "claim_type": "exclusive_write", 00:14:45.059 "zoned": false, 00:14:45.059 "supported_io_types": { 00:14:45.059 "read": true, 00:14:45.059 "write": true, 00:14:45.059 "unmap": true, 00:14:45.059 "flush": true, 00:14:45.059 "reset": true, 00:14:45.059 "nvme_admin": false, 00:14:45.059 "nvme_io": false, 00:14:45.059 "nvme_io_md": false, 00:14:45.059 "write_zeroes": true, 00:14:45.059 "zcopy": true, 00:14:45.059 "get_zone_info": false, 00:14:45.059 "zone_management": false, 00:14:45.059 "zone_append": false, 00:14:45.059 "compare": false, 00:14:45.059 "compare_and_write": false, 00:14:45.059 "abort": true, 00:14:45.059 "seek_hole": false, 00:14:45.059 "seek_data": false, 00:14:45.059 "copy": true, 00:14:45.059 "nvme_iov_md": false 00:14:45.059 }, 00:14:45.059 "memory_domains": [ 00:14:45.059 { 00:14:45.059 "dma_device_id": "system", 00:14:45.059 "dma_device_type": 1 00:14:45.059 }, 00:14:45.059 { 00:14:45.059 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:45.059 "dma_device_type": 2 00:14:45.059 } 00:14:45.059 ], 00:14:45.059 "driver_specific": {} 00:14:45.059 }' 00:14:45.059 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:45.059 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:45.059 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:45.059 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:45.059 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:45.059 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:45.059 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:45.320 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:45.320 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:45.320 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:45.320 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:45.320 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:45.320 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:45.320 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:45.320 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:45.579 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:45.579 "name": "BaseBdev2", 
00:14:45.579 "aliases": [ 00:14:45.579 "5496149b-ffb3-47b0-87b5-2d536bc15059" 00:14:45.579 ], 00:14:45.579 "product_name": "Malloc disk", 00:14:45.579 "block_size": 512, 00:14:45.579 "num_blocks": 65536, 00:14:45.579 "uuid": "5496149b-ffb3-47b0-87b5-2d536bc15059", 00:14:45.579 "assigned_rate_limits": { 00:14:45.580 "rw_ios_per_sec": 0, 00:14:45.580 "rw_mbytes_per_sec": 0, 00:14:45.580 "r_mbytes_per_sec": 0, 00:14:45.580 "w_mbytes_per_sec": 0 00:14:45.580 }, 00:14:45.580 "claimed": true, 00:14:45.580 "claim_type": "exclusive_write", 00:14:45.580 "zoned": false, 00:14:45.580 "supported_io_types": { 00:14:45.580 "read": true, 00:14:45.580 "write": true, 00:14:45.580 "unmap": true, 00:14:45.580 "flush": true, 00:14:45.580 "reset": true, 00:14:45.580 "nvme_admin": false, 00:14:45.580 "nvme_io": false, 00:14:45.580 "nvme_io_md": false, 00:14:45.580 "write_zeroes": true, 00:14:45.580 "zcopy": true, 00:14:45.580 "get_zone_info": false, 00:14:45.580 "zone_management": false, 00:14:45.580 "zone_append": false, 00:14:45.580 "compare": false, 00:14:45.580 "compare_and_write": false, 00:14:45.580 "abort": true, 00:14:45.580 "seek_hole": false, 00:14:45.580 "seek_data": false, 00:14:45.580 "copy": true, 00:14:45.580 "nvme_iov_md": false 00:14:45.580 }, 00:14:45.580 "memory_domains": [ 00:14:45.580 { 00:14:45.580 "dma_device_id": "system", 00:14:45.580 "dma_device_type": 1 00:14:45.580 }, 00:14:45.580 { 00:14:45.580 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:45.580 "dma_device_type": 2 00:14:45.580 } 00:14:45.580 ], 00:14:45.580 "driver_specific": {} 00:14:45.580 }' 00:14:45.580 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:45.580 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:45.580 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:45.580 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:45.580 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:45.839 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:45.839 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:45.839 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:45.839 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:45.839 17:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:45.839 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:45.839 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:45.839 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:45.839 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:45.839 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:46.099 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:46.099 "name": "BaseBdev3", 00:14:46.099 "aliases": [ 00:14:46.099 "c90e8613-8125-40d3-a357-360f514ee93c" 00:14:46.099 ], 00:14:46.099 "product_name": "Malloc disk", 00:14:46.099 "block_size": 512, 
00:14:46.099 "num_blocks": 65536, 00:14:46.099 "uuid": "c90e8613-8125-40d3-a357-360f514ee93c", 00:14:46.099 "assigned_rate_limits": { 00:14:46.099 "rw_ios_per_sec": 0, 00:14:46.099 "rw_mbytes_per_sec": 0, 00:14:46.099 "r_mbytes_per_sec": 0, 00:14:46.099 "w_mbytes_per_sec": 0 00:14:46.099 }, 00:14:46.099 "claimed": true, 00:14:46.099 "claim_type": "exclusive_write", 00:14:46.099 "zoned": false, 00:14:46.099 "supported_io_types": { 00:14:46.099 "read": true, 00:14:46.099 "write": true, 00:14:46.099 "unmap": true, 00:14:46.100 "flush": true, 00:14:46.100 "reset": true, 00:14:46.100 "nvme_admin": false, 00:14:46.100 "nvme_io": false, 00:14:46.100 "nvme_io_md": false, 00:14:46.100 "write_zeroes": true, 00:14:46.100 "zcopy": true, 00:14:46.100 "get_zone_info": false, 00:14:46.100 "zone_management": false, 00:14:46.100 "zone_append": false, 00:14:46.100 "compare": false, 00:14:46.100 "compare_and_write": false, 00:14:46.100 "abort": true, 00:14:46.100 "seek_hole": false, 00:14:46.100 "seek_data": false, 00:14:46.100 "copy": true, 00:14:46.100 "nvme_iov_md": false 00:14:46.100 }, 00:14:46.100 "memory_domains": [ 00:14:46.100 { 00:14:46.100 "dma_device_id": "system", 00:14:46.100 "dma_device_type": 1 00:14:46.100 }, 00:14:46.100 { 00:14:46.100 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:46.100 "dma_device_type": 2 00:14:46.100 } 00:14:46.100 ], 00:14:46.100 "driver_specific": {} 00:14:46.100 }' 00:14:46.100 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:46.100 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:46.100 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:46.100 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:46.100 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:46.361 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:46.361 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:46.361 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:46.361 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:46.361 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:46.361 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:46.361 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:46.361 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:46.622 [2024-07-15 17:26:57.741343] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:46.622 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:46.622 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:14:46.622 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:46.622 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:14:46.622 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:14:46.622 17:26:57 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:14:46.622 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:46.622 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:46.622 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:46.622 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:46.622 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:46.622 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:46.622 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:46.622 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:46.622 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:46.622 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:46.622 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:46.882 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:46.882 "name": "Existed_Raid", 00:14:46.882 "uuid": "17abf702-c153-49a6-beaf-7a03fe224ce9", 00:14:46.882 "strip_size_kb": 0, 00:14:46.882 "state": "online", 00:14:46.882 "raid_level": "raid1", 00:14:46.882 "superblock": false, 00:14:46.882 "num_base_bdevs": 3, 00:14:46.882 "num_base_bdevs_discovered": 2, 00:14:46.882 "num_base_bdevs_operational": 2, 00:14:46.882 "base_bdevs_list": [ 00:14:46.882 { 00:14:46.882 "name": null, 00:14:46.882 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:46.882 "is_configured": false, 00:14:46.882 "data_offset": 0, 00:14:46.882 "data_size": 65536 00:14:46.883 }, 00:14:46.883 { 00:14:46.883 "name": "BaseBdev2", 00:14:46.883 "uuid": "5496149b-ffb3-47b0-87b5-2d536bc15059", 00:14:46.883 "is_configured": true, 00:14:46.883 "data_offset": 0, 00:14:46.883 "data_size": 65536 00:14:46.883 }, 00:14:46.883 { 00:14:46.883 "name": "BaseBdev3", 00:14:46.883 "uuid": "c90e8613-8125-40d3-a357-360f514ee93c", 00:14:46.883 "is_configured": true, 00:14:46.883 "data_offset": 0, 00:14:46.883 "data_size": 65536 00:14:46.883 } 00:14:46.883 ] 00:14:46.883 }' 00:14:46.883 17:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:46.883 17:26:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:47.453 17:26:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:47.453 17:26:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:47.453 17:26:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:47.453 17:26:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:47.453 17:26:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:47.453 17:26:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid 
'!=' Existed_Raid ']' 00:14:47.454 17:26:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:47.714 [2024-07-15 17:26:58.868222] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:47.714 17:26:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:47.714 17:26:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:47.714 17:26:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:47.714 17:26:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:47.974 17:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:47.974 17:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:47.974 17:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:47.974 [2024-07-15 17:26:59.255034] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:47.974 [2024-07-15 17:26:59.255093] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:47.974 [2024-07-15 17:26:59.261127] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:47.974 [2024-07-15 17:26:59.261154] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:47.974 [2024-07-15 17:26:59.261161] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2444e90 name Existed_Raid, state offline 00:14:48.234 17:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:48.234 17:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:48.234 17:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:48.234 17:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:48.234 17:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:48.234 17:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:48.234 17:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:48.234 17:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:48.234 17:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:48.234 17:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:48.494 BaseBdev2 00:14:48.494 17:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:48.494 17:26:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:48.494 17:26:59 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:48.494 17:26:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:48.494 17:26:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:48.494 17:26:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:48.494 17:26:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:48.754 17:26:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:48.754 [ 00:14:48.754 { 00:14:48.754 "name": "BaseBdev2", 00:14:48.754 "aliases": [ 00:14:48.754 "58acae6b-8602-4f6e-aa63-ce08b22be609" 00:14:48.754 ], 00:14:48.755 "product_name": "Malloc disk", 00:14:48.755 "block_size": 512, 00:14:48.755 "num_blocks": 65536, 00:14:48.755 "uuid": "58acae6b-8602-4f6e-aa63-ce08b22be609", 00:14:48.755 "assigned_rate_limits": { 00:14:48.755 "rw_ios_per_sec": 0, 00:14:48.755 "rw_mbytes_per_sec": 0, 00:14:48.755 "r_mbytes_per_sec": 0, 00:14:48.755 "w_mbytes_per_sec": 0 00:14:48.755 }, 00:14:48.755 "claimed": false, 00:14:48.755 "zoned": false, 00:14:48.755 "supported_io_types": { 00:14:48.755 "read": true, 00:14:48.755 "write": true, 00:14:48.755 "unmap": true, 00:14:48.755 "flush": true, 00:14:48.755 "reset": true, 00:14:48.755 "nvme_admin": false, 00:14:48.755 "nvme_io": false, 00:14:48.755 "nvme_io_md": false, 00:14:48.755 "write_zeroes": true, 00:14:48.755 "zcopy": true, 00:14:48.755 "get_zone_info": false, 00:14:48.755 "zone_management": false, 00:14:48.755 "zone_append": false, 00:14:48.755 "compare": false, 00:14:48.755 "compare_and_write": false, 00:14:48.755 "abort": true, 00:14:48.755 "seek_hole": false, 00:14:48.755 "seek_data": false, 00:14:48.755 "copy": true, 00:14:48.755 "nvme_iov_md": false 00:14:48.755 }, 00:14:48.755 "memory_domains": [ 00:14:48.755 { 00:14:48.755 "dma_device_id": "system", 00:14:48.755 "dma_device_type": 1 00:14:48.755 }, 00:14:48.755 { 00:14:48.755 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:48.755 "dma_device_type": 2 00:14:48.755 } 00:14:48.755 ], 00:14:48.755 "driver_specific": {} 00:14:48.755 } 00:14:48.755 ] 00:14:48.755 17:27:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:48.755 17:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:48.755 17:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:48.755 17:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:49.014 BaseBdev3 00:14:49.014 17:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:49.014 17:27:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:49.014 17:27:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:49.014 17:27:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:49.014 17:27:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:49.014 17:27:00 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:49.014 17:27:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:49.275 17:27:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:49.536 [ 00:14:49.536 { 00:14:49.536 "name": "BaseBdev3", 00:14:49.536 "aliases": [ 00:14:49.536 "b9c25bd1-b1cd-4769-88e2-3907653a840b" 00:14:49.536 ], 00:14:49.536 "product_name": "Malloc disk", 00:14:49.536 "block_size": 512, 00:14:49.536 "num_blocks": 65536, 00:14:49.536 "uuid": "b9c25bd1-b1cd-4769-88e2-3907653a840b", 00:14:49.536 "assigned_rate_limits": { 00:14:49.536 "rw_ios_per_sec": 0, 00:14:49.536 "rw_mbytes_per_sec": 0, 00:14:49.536 "r_mbytes_per_sec": 0, 00:14:49.536 "w_mbytes_per_sec": 0 00:14:49.536 }, 00:14:49.536 "claimed": false, 00:14:49.536 "zoned": false, 00:14:49.536 "supported_io_types": { 00:14:49.536 "read": true, 00:14:49.536 "write": true, 00:14:49.536 "unmap": true, 00:14:49.536 "flush": true, 00:14:49.536 "reset": true, 00:14:49.536 "nvme_admin": false, 00:14:49.536 "nvme_io": false, 00:14:49.536 "nvme_io_md": false, 00:14:49.536 "write_zeroes": true, 00:14:49.536 "zcopy": true, 00:14:49.536 "get_zone_info": false, 00:14:49.536 "zone_management": false, 00:14:49.536 "zone_append": false, 00:14:49.536 "compare": false, 00:14:49.536 "compare_and_write": false, 00:14:49.536 "abort": true, 00:14:49.536 "seek_hole": false, 00:14:49.536 "seek_data": false, 00:14:49.536 "copy": true, 00:14:49.536 "nvme_iov_md": false 00:14:49.536 }, 00:14:49.536 "memory_domains": [ 00:14:49.536 { 00:14:49.536 "dma_device_id": "system", 00:14:49.536 "dma_device_type": 1 00:14:49.536 }, 00:14:49.536 { 00:14:49.536 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:49.536 "dma_device_type": 2 00:14:49.536 } 00:14:49.536 ], 00:14:49.536 "driver_specific": {} 00:14:49.536 } 00:14:49.536 ] 00:14:49.536 17:27:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:49.536 17:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:49.536 17:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:49.536 17:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:49.536 [2024-07-15 17:27:00.762706] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:49.536 [2024-07-15 17:27:00.762754] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:49.536 [2024-07-15 17:27:00.762777] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:49.536 [2024-07-15 17:27:00.763836] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:49.536 17:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:49.536 17:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:49.536 17:27:00 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:49.536 17:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:49.536 17:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:49.536 17:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:49.536 17:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:49.536 17:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:49.536 17:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:49.536 17:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:49.536 17:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.536 17:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:49.796 17:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:49.796 "name": "Existed_Raid", 00:14:49.796 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:49.796 "strip_size_kb": 0, 00:14:49.796 "state": "configuring", 00:14:49.796 "raid_level": "raid1", 00:14:49.796 "superblock": false, 00:14:49.796 "num_base_bdevs": 3, 00:14:49.796 "num_base_bdevs_discovered": 2, 00:14:49.796 "num_base_bdevs_operational": 3, 00:14:49.796 "base_bdevs_list": [ 00:14:49.796 { 00:14:49.796 "name": "BaseBdev1", 00:14:49.796 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:49.796 "is_configured": false, 00:14:49.796 "data_offset": 0, 00:14:49.796 "data_size": 0 00:14:49.796 }, 00:14:49.796 { 00:14:49.796 "name": "BaseBdev2", 00:14:49.796 "uuid": "58acae6b-8602-4f6e-aa63-ce08b22be609", 00:14:49.796 "is_configured": true, 00:14:49.796 "data_offset": 0, 00:14:49.796 "data_size": 65536 00:14:49.796 }, 00:14:49.796 { 00:14:49.796 "name": "BaseBdev3", 00:14:49.796 "uuid": "b9c25bd1-b1cd-4769-88e2-3907653a840b", 00:14:49.796 "is_configured": true, 00:14:49.796 "data_offset": 0, 00:14:49.796 "data_size": 65536 00:14:49.796 } 00:14:49.796 ] 00:14:49.796 }' 00:14:49.796 17:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:49.796 17:27:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:50.366 17:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:50.631 [2024-07-15 17:27:01.697063] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:50.631 17:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:50.631 17:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:50.631 17:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:50.631 17:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:50.631 17:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:50.631 17:27:01 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:50.631 17:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:50.631 17:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:50.631 17:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:50.631 17:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:50.631 17:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:50.631 17:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:50.631 17:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:50.631 "name": "Existed_Raid", 00:14:50.631 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:50.631 "strip_size_kb": 0, 00:14:50.631 "state": "configuring", 00:14:50.631 "raid_level": "raid1", 00:14:50.631 "superblock": false, 00:14:50.631 "num_base_bdevs": 3, 00:14:50.631 "num_base_bdevs_discovered": 1, 00:14:50.631 "num_base_bdevs_operational": 3, 00:14:50.631 "base_bdevs_list": [ 00:14:50.631 { 00:14:50.631 "name": "BaseBdev1", 00:14:50.631 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:50.631 "is_configured": false, 00:14:50.631 "data_offset": 0, 00:14:50.631 "data_size": 0 00:14:50.631 }, 00:14:50.631 { 00:14:50.631 "name": null, 00:14:50.631 "uuid": "58acae6b-8602-4f6e-aa63-ce08b22be609", 00:14:50.631 "is_configured": false, 00:14:50.631 "data_offset": 0, 00:14:50.631 "data_size": 65536 00:14:50.631 }, 00:14:50.631 { 00:14:50.631 "name": "BaseBdev3", 00:14:50.631 "uuid": "b9c25bd1-b1cd-4769-88e2-3907653a840b", 00:14:50.631 "is_configured": true, 00:14:50.631 "data_offset": 0, 00:14:50.631 "data_size": 65536 00:14:50.631 } 00:14:50.631 ] 00:14:50.631 }' 00:14:50.631 17:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:50.631 17:27:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:51.203 17:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:51.203 17:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:51.464 17:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:51.464 17:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:51.724 [2024-07-15 17:27:02.816868] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:51.724 BaseBdev1 00:14:51.724 17:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:51.724 17:27:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:51.724 17:27:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:51.724 17:27:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:51.724 17:27:02 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:51.724 17:27:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:51.724 17:27:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:51.724 17:27:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:51.984 [ 00:14:51.984 { 00:14:51.984 "name": "BaseBdev1", 00:14:51.984 "aliases": [ 00:14:51.984 "fcb0b706-8b6d-4c6e-904f-5104eda8d5b8" 00:14:51.984 ], 00:14:51.984 "product_name": "Malloc disk", 00:14:51.984 "block_size": 512, 00:14:51.984 "num_blocks": 65536, 00:14:51.984 "uuid": "fcb0b706-8b6d-4c6e-904f-5104eda8d5b8", 00:14:51.984 "assigned_rate_limits": { 00:14:51.984 "rw_ios_per_sec": 0, 00:14:51.984 "rw_mbytes_per_sec": 0, 00:14:51.984 "r_mbytes_per_sec": 0, 00:14:51.984 "w_mbytes_per_sec": 0 00:14:51.984 }, 00:14:51.984 "claimed": true, 00:14:51.984 "claim_type": "exclusive_write", 00:14:51.984 "zoned": false, 00:14:51.984 "supported_io_types": { 00:14:51.984 "read": true, 00:14:51.984 "write": true, 00:14:51.984 "unmap": true, 00:14:51.984 "flush": true, 00:14:51.984 "reset": true, 00:14:51.984 "nvme_admin": false, 00:14:51.984 "nvme_io": false, 00:14:51.984 "nvme_io_md": false, 00:14:51.984 "write_zeroes": true, 00:14:51.984 "zcopy": true, 00:14:51.984 "get_zone_info": false, 00:14:51.984 "zone_management": false, 00:14:51.984 "zone_append": false, 00:14:51.984 "compare": false, 00:14:51.984 "compare_and_write": false, 00:14:51.984 "abort": true, 00:14:51.984 "seek_hole": false, 00:14:51.984 "seek_data": false, 00:14:51.984 "copy": true, 00:14:51.984 "nvme_iov_md": false 00:14:51.984 }, 00:14:51.984 "memory_domains": [ 00:14:51.984 { 00:14:51.984 "dma_device_id": "system", 00:14:51.984 "dma_device_type": 1 00:14:51.984 }, 00:14:51.984 { 00:14:51.984 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.984 "dma_device_type": 2 00:14:51.984 } 00:14:51.984 ], 00:14:51.984 "driver_specific": {} 00:14:51.984 } 00:14:51.984 ] 00:14:51.984 17:27:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:51.984 17:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:51.984 17:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:51.984 17:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:51.984 17:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:51.985 17:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:51.985 17:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:51.985 17:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:51.985 17:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:51.985 17:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:51.985 17:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:51.985 17:27:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:51.985 17:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:52.244 17:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:52.244 "name": "Existed_Raid", 00:14:52.244 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:52.244 "strip_size_kb": 0, 00:14:52.244 "state": "configuring", 00:14:52.244 "raid_level": "raid1", 00:14:52.244 "superblock": false, 00:14:52.244 "num_base_bdevs": 3, 00:14:52.244 "num_base_bdevs_discovered": 2, 00:14:52.244 "num_base_bdevs_operational": 3, 00:14:52.244 "base_bdevs_list": [ 00:14:52.244 { 00:14:52.244 "name": "BaseBdev1", 00:14:52.244 "uuid": "fcb0b706-8b6d-4c6e-904f-5104eda8d5b8", 00:14:52.244 "is_configured": true, 00:14:52.244 "data_offset": 0, 00:14:52.244 "data_size": 65536 00:14:52.244 }, 00:14:52.244 { 00:14:52.244 "name": null, 00:14:52.244 "uuid": "58acae6b-8602-4f6e-aa63-ce08b22be609", 00:14:52.244 "is_configured": false, 00:14:52.244 "data_offset": 0, 00:14:52.244 "data_size": 65536 00:14:52.244 }, 00:14:52.244 { 00:14:52.244 "name": "BaseBdev3", 00:14:52.244 "uuid": "b9c25bd1-b1cd-4769-88e2-3907653a840b", 00:14:52.244 "is_configured": true, 00:14:52.244 "data_offset": 0, 00:14:52.244 "data_size": 65536 00:14:52.244 } 00:14:52.244 ] 00:14:52.244 }' 00:14:52.244 17:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:52.244 17:27:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:52.817 17:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:52.817 17:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:52.817 17:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:52.817 17:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:53.076 [2024-07-15 17:27:04.288613] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:53.076 17:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:53.076 17:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:53.076 17:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:53.076 17:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:53.076 17:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:53.076 17:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:53.076 17:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:53.076 17:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:53.076 17:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:53.076 
17:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:53.076 17:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.076 17:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:53.337 17:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:53.337 "name": "Existed_Raid", 00:14:53.337 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:53.337 "strip_size_kb": 0, 00:14:53.337 "state": "configuring", 00:14:53.337 "raid_level": "raid1", 00:14:53.337 "superblock": false, 00:14:53.337 "num_base_bdevs": 3, 00:14:53.337 "num_base_bdevs_discovered": 1, 00:14:53.337 "num_base_bdevs_operational": 3, 00:14:53.337 "base_bdevs_list": [ 00:14:53.337 { 00:14:53.337 "name": "BaseBdev1", 00:14:53.337 "uuid": "fcb0b706-8b6d-4c6e-904f-5104eda8d5b8", 00:14:53.337 "is_configured": true, 00:14:53.337 "data_offset": 0, 00:14:53.337 "data_size": 65536 00:14:53.337 }, 00:14:53.337 { 00:14:53.337 "name": null, 00:14:53.337 "uuid": "58acae6b-8602-4f6e-aa63-ce08b22be609", 00:14:53.337 "is_configured": false, 00:14:53.337 "data_offset": 0, 00:14:53.337 "data_size": 65536 00:14:53.337 }, 00:14:53.337 { 00:14:53.337 "name": null, 00:14:53.337 "uuid": "b9c25bd1-b1cd-4769-88e2-3907653a840b", 00:14:53.337 "is_configured": false, 00:14:53.337 "data_offset": 0, 00:14:53.337 "data_size": 65536 00:14:53.337 } 00:14:53.337 ] 00:14:53.337 }' 00:14:53.337 17:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:53.337 17:27:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:53.907 17:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.907 17:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:54.167 17:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:54.167 17:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:54.167 [2024-07-15 17:27:05.407456] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:54.167 17:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:54.167 17:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:54.167 17:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:54.167 17:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:54.167 17:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:54.167 17:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:54.167 17:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:54.167 17:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
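The verify_raid_bdev_state checks traced above boil down to one RPC call plus a jq filter. Below is a minimal sketch of how the same query could be issued by hand, assuming the SPDK target is still listening on the socket used throughout this run; the rpc.py path, socket path, RPC method, and jq selector are copied from the trace itself, while the shell variable names and the individual field probes are illustrative only.

#!/usr/bin/env bash
# Minimal sketch (not part of bdev_raid.sh): query the "Existed_Raid" bdev the same
# way the verify_raid_bdev_state steps in this trace do. Paths and socket are taken
# from the log; variable names ($rpc, $sock, $info) are illustrative.
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

# Dump all raid bdevs and keep only the entry named "Existed_Raid".
info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')

# Probe the fields the test asserts on (state, raid level, discovered base bdevs).
echo "$info" | jq -r .state                       # "configuring" or "online"
echo "$info" | jq -r .raid_level                  # "raid1" in this test
echo "$info" | jq -r .num_base_bdevs_discovered   # 1, 2 or 3 depending on the step

The same pattern covers every state assertion in this section: the test only varies which field of the selected JSON object it compares (state, num_base_bdevs_discovered, base_bdevs_list[*].is_configured) against the expected values passed to verify_raid_bdev_state.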
00:14:54.167 17:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:54.167 17:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:54.167 17:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:54.167 17:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:54.456 17:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:54.456 "name": "Existed_Raid", 00:14:54.456 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:54.456 "strip_size_kb": 0, 00:14:54.456 "state": "configuring", 00:14:54.456 "raid_level": "raid1", 00:14:54.456 "superblock": false, 00:14:54.456 "num_base_bdevs": 3, 00:14:54.456 "num_base_bdevs_discovered": 2, 00:14:54.456 "num_base_bdevs_operational": 3, 00:14:54.456 "base_bdevs_list": [ 00:14:54.456 { 00:14:54.456 "name": "BaseBdev1", 00:14:54.456 "uuid": "fcb0b706-8b6d-4c6e-904f-5104eda8d5b8", 00:14:54.456 "is_configured": true, 00:14:54.456 "data_offset": 0, 00:14:54.456 "data_size": 65536 00:14:54.456 }, 00:14:54.456 { 00:14:54.456 "name": null, 00:14:54.456 "uuid": "58acae6b-8602-4f6e-aa63-ce08b22be609", 00:14:54.456 "is_configured": false, 00:14:54.456 "data_offset": 0, 00:14:54.456 "data_size": 65536 00:14:54.456 }, 00:14:54.456 { 00:14:54.456 "name": "BaseBdev3", 00:14:54.456 "uuid": "b9c25bd1-b1cd-4769-88e2-3907653a840b", 00:14:54.456 "is_configured": true, 00:14:54.456 "data_offset": 0, 00:14:54.456 "data_size": 65536 00:14:54.456 } 00:14:54.456 ] 00:14:54.456 }' 00:14:54.456 17:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:54.456 17:27:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:55.026 17:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:55.026 17:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:55.286 17:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:55.286 17:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:55.286 [2024-07-15 17:27:06.514264] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:55.286 17:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:55.286 17:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:55.286 17:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:55.286 17:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:55.286 17:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:55.286 17:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:55.286 17:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:55.286 
17:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:55.286 17:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:55.286 17:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:55.286 17:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:55.286 17:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:55.546 17:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:55.546 "name": "Existed_Raid", 00:14:55.546 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:55.546 "strip_size_kb": 0, 00:14:55.546 "state": "configuring", 00:14:55.546 "raid_level": "raid1", 00:14:55.546 "superblock": false, 00:14:55.546 "num_base_bdevs": 3, 00:14:55.546 "num_base_bdevs_discovered": 1, 00:14:55.546 "num_base_bdevs_operational": 3, 00:14:55.546 "base_bdevs_list": [ 00:14:55.546 { 00:14:55.546 "name": null, 00:14:55.546 "uuid": "fcb0b706-8b6d-4c6e-904f-5104eda8d5b8", 00:14:55.546 "is_configured": false, 00:14:55.546 "data_offset": 0, 00:14:55.546 "data_size": 65536 00:14:55.546 }, 00:14:55.546 { 00:14:55.546 "name": null, 00:14:55.546 "uuid": "58acae6b-8602-4f6e-aa63-ce08b22be609", 00:14:55.546 "is_configured": false, 00:14:55.546 "data_offset": 0, 00:14:55.546 "data_size": 65536 00:14:55.546 }, 00:14:55.546 { 00:14:55.546 "name": "BaseBdev3", 00:14:55.546 "uuid": "b9c25bd1-b1cd-4769-88e2-3907653a840b", 00:14:55.546 "is_configured": true, 00:14:55.546 "data_offset": 0, 00:14:55.546 "data_size": 65536 00:14:55.546 } 00:14:55.546 ] 00:14:55.546 }' 00:14:55.546 17:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:55.546 17:27:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:56.113 17:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:56.113 17:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:56.416 17:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:56.416 17:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:56.416 [2024-07-15 17:27:07.711100] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:56.675 17:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:56.675 17:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:56.675 17:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:56.675 17:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:56.675 17:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:56.675 17:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:14:56.675 17:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:56.675 17:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:56.675 17:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:56.675 17:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:56.675 17:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:56.675 17:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:56.675 17:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:56.675 "name": "Existed_Raid", 00:14:56.675 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:56.675 "strip_size_kb": 0, 00:14:56.675 "state": "configuring", 00:14:56.675 "raid_level": "raid1", 00:14:56.675 "superblock": false, 00:14:56.675 "num_base_bdevs": 3, 00:14:56.675 "num_base_bdevs_discovered": 2, 00:14:56.675 "num_base_bdevs_operational": 3, 00:14:56.675 "base_bdevs_list": [ 00:14:56.675 { 00:14:56.675 "name": null, 00:14:56.675 "uuid": "fcb0b706-8b6d-4c6e-904f-5104eda8d5b8", 00:14:56.675 "is_configured": false, 00:14:56.675 "data_offset": 0, 00:14:56.675 "data_size": 65536 00:14:56.675 }, 00:14:56.675 { 00:14:56.675 "name": "BaseBdev2", 00:14:56.675 "uuid": "58acae6b-8602-4f6e-aa63-ce08b22be609", 00:14:56.675 "is_configured": true, 00:14:56.675 "data_offset": 0, 00:14:56.675 "data_size": 65536 00:14:56.675 }, 00:14:56.675 { 00:14:56.675 "name": "BaseBdev3", 00:14:56.675 "uuid": "b9c25bd1-b1cd-4769-88e2-3907653a840b", 00:14:56.675 "is_configured": true, 00:14:56.675 "data_offset": 0, 00:14:56.675 "data_size": 65536 00:14:56.675 } 00:14:56.675 ] 00:14:56.675 }' 00:14:56.675 17:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:56.675 17:27:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:57.243 17:27:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.243 17:27:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:57.516 17:27:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:57.516 17:27:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.516 17:27:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:57.807 17:27:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u fcb0b706-8b6d-4c6e-904f-5104eda8d5b8 00:14:57.807 [2024-07-15 17:27:09.071545] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:57.807 [2024-07-15 17:27:09.071573] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2446630 00:14:57.807 [2024-07-15 17:27:09.071578] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:14:57.807 [2024-07-15 17:27:09.071738] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24446a0 00:14:57.807 [2024-07-15 17:27:09.071836] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2446630 00:14:57.807 [2024-07-15 17:27:09.071842] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2446630 00:14:57.807 [2024-07-15 17:27:09.071964] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:57.807 NewBaseBdev 00:14:57.807 17:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:57.807 17:27:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:57.807 17:27:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:57.807 17:27:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:57.807 17:27:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:57.807 17:27:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:57.807 17:27:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:58.067 17:27:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:58.327 [ 00:14:58.327 { 00:14:58.327 "name": "NewBaseBdev", 00:14:58.327 "aliases": [ 00:14:58.327 "fcb0b706-8b6d-4c6e-904f-5104eda8d5b8" 00:14:58.327 ], 00:14:58.327 "product_name": "Malloc disk", 00:14:58.327 "block_size": 512, 00:14:58.327 "num_blocks": 65536, 00:14:58.327 "uuid": "fcb0b706-8b6d-4c6e-904f-5104eda8d5b8", 00:14:58.327 "assigned_rate_limits": { 00:14:58.327 "rw_ios_per_sec": 0, 00:14:58.327 "rw_mbytes_per_sec": 0, 00:14:58.327 "r_mbytes_per_sec": 0, 00:14:58.327 "w_mbytes_per_sec": 0 00:14:58.327 }, 00:14:58.327 "claimed": true, 00:14:58.327 "claim_type": "exclusive_write", 00:14:58.327 "zoned": false, 00:14:58.327 "supported_io_types": { 00:14:58.327 "read": true, 00:14:58.327 "write": true, 00:14:58.327 "unmap": true, 00:14:58.327 "flush": true, 00:14:58.327 "reset": true, 00:14:58.327 "nvme_admin": false, 00:14:58.327 "nvme_io": false, 00:14:58.327 "nvme_io_md": false, 00:14:58.327 "write_zeroes": true, 00:14:58.327 "zcopy": true, 00:14:58.327 "get_zone_info": false, 00:14:58.327 "zone_management": false, 00:14:58.327 "zone_append": false, 00:14:58.327 "compare": false, 00:14:58.327 "compare_and_write": false, 00:14:58.327 "abort": true, 00:14:58.327 "seek_hole": false, 00:14:58.327 "seek_data": false, 00:14:58.327 "copy": true, 00:14:58.327 "nvme_iov_md": false 00:14:58.327 }, 00:14:58.327 "memory_domains": [ 00:14:58.327 { 00:14:58.327 "dma_device_id": "system", 00:14:58.327 "dma_device_type": 1 00:14:58.327 }, 00:14:58.327 { 00:14:58.327 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.327 "dma_device_type": 2 00:14:58.327 } 00:14:58.327 ], 00:14:58.327 "driver_specific": {} 00:14:58.327 } 00:14:58.327 ] 00:14:58.327 17:27:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:58.327 17:27:09 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:14:58.327 17:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:58.327 17:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:58.327 17:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:58.327 17:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:58.327 17:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:58.327 17:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:58.327 17:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:58.327 17:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:58.327 17:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:58.327 17:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:58.327 17:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:58.587 17:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:58.587 "name": "Existed_Raid", 00:14:58.587 "uuid": "6f2df23e-3583-4fd1-abae-5f5b7195e87a", 00:14:58.587 "strip_size_kb": 0, 00:14:58.587 "state": "online", 00:14:58.587 "raid_level": "raid1", 00:14:58.587 "superblock": false, 00:14:58.587 "num_base_bdevs": 3, 00:14:58.587 "num_base_bdevs_discovered": 3, 00:14:58.587 "num_base_bdevs_operational": 3, 00:14:58.587 "base_bdevs_list": [ 00:14:58.587 { 00:14:58.587 "name": "NewBaseBdev", 00:14:58.587 "uuid": "fcb0b706-8b6d-4c6e-904f-5104eda8d5b8", 00:14:58.587 "is_configured": true, 00:14:58.587 "data_offset": 0, 00:14:58.587 "data_size": 65536 00:14:58.587 }, 00:14:58.587 { 00:14:58.587 "name": "BaseBdev2", 00:14:58.587 "uuid": "58acae6b-8602-4f6e-aa63-ce08b22be609", 00:14:58.587 "is_configured": true, 00:14:58.587 "data_offset": 0, 00:14:58.587 "data_size": 65536 00:14:58.587 }, 00:14:58.587 { 00:14:58.587 "name": "BaseBdev3", 00:14:58.587 "uuid": "b9c25bd1-b1cd-4769-88e2-3907653a840b", 00:14:58.587 "is_configured": true, 00:14:58.587 "data_offset": 0, 00:14:58.587 "data_size": 65536 00:14:58.587 } 00:14:58.587 ] 00:14:58.587 }' 00:14:58.587 17:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:58.587 17:27:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:59.158 17:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:59.158 17:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:59.158 17:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:59.158 17:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:59.158 17:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:59.158 17:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:59.158 17:27:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:59.158 17:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:59.158 [2024-07-15 17:27:10.427232] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:59.158 17:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:59.158 "name": "Existed_Raid", 00:14:59.158 "aliases": [ 00:14:59.158 "6f2df23e-3583-4fd1-abae-5f5b7195e87a" 00:14:59.158 ], 00:14:59.158 "product_name": "Raid Volume", 00:14:59.158 "block_size": 512, 00:14:59.158 "num_blocks": 65536, 00:14:59.158 "uuid": "6f2df23e-3583-4fd1-abae-5f5b7195e87a", 00:14:59.158 "assigned_rate_limits": { 00:14:59.158 "rw_ios_per_sec": 0, 00:14:59.158 "rw_mbytes_per_sec": 0, 00:14:59.158 "r_mbytes_per_sec": 0, 00:14:59.158 "w_mbytes_per_sec": 0 00:14:59.158 }, 00:14:59.158 "claimed": false, 00:14:59.158 "zoned": false, 00:14:59.158 "supported_io_types": { 00:14:59.158 "read": true, 00:14:59.158 "write": true, 00:14:59.158 "unmap": false, 00:14:59.158 "flush": false, 00:14:59.158 "reset": true, 00:14:59.158 "nvme_admin": false, 00:14:59.158 "nvme_io": false, 00:14:59.158 "nvme_io_md": false, 00:14:59.158 "write_zeroes": true, 00:14:59.158 "zcopy": false, 00:14:59.158 "get_zone_info": false, 00:14:59.158 "zone_management": false, 00:14:59.158 "zone_append": false, 00:14:59.158 "compare": false, 00:14:59.158 "compare_and_write": false, 00:14:59.158 "abort": false, 00:14:59.158 "seek_hole": false, 00:14:59.158 "seek_data": false, 00:14:59.158 "copy": false, 00:14:59.158 "nvme_iov_md": false 00:14:59.158 }, 00:14:59.158 "memory_domains": [ 00:14:59.158 { 00:14:59.158 "dma_device_id": "system", 00:14:59.158 "dma_device_type": 1 00:14:59.158 }, 00:14:59.158 { 00:14:59.158 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:59.158 "dma_device_type": 2 00:14:59.158 }, 00:14:59.158 { 00:14:59.158 "dma_device_id": "system", 00:14:59.158 "dma_device_type": 1 00:14:59.158 }, 00:14:59.158 { 00:14:59.158 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:59.158 "dma_device_type": 2 00:14:59.158 }, 00:14:59.158 { 00:14:59.158 "dma_device_id": "system", 00:14:59.158 "dma_device_type": 1 00:14:59.158 }, 00:14:59.158 { 00:14:59.158 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:59.158 "dma_device_type": 2 00:14:59.158 } 00:14:59.158 ], 00:14:59.158 "driver_specific": { 00:14:59.158 "raid": { 00:14:59.158 "uuid": "6f2df23e-3583-4fd1-abae-5f5b7195e87a", 00:14:59.158 "strip_size_kb": 0, 00:14:59.158 "state": "online", 00:14:59.158 "raid_level": "raid1", 00:14:59.158 "superblock": false, 00:14:59.158 "num_base_bdevs": 3, 00:14:59.158 "num_base_bdevs_discovered": 3, 00:14:59.158 "num_base_bdevs_operational": 3, 00:14:59.158 "base_bdevs_list": [ 00:14:59.158 { 00:14:59.158 "name": "NewBaseBdev", 00:14:59.158 "uuid": "fcb0b706-8b6d-4c6e-904f-5104eda8d5b8", 00:14:59.158 "is_configured": true, 00:14:59.158 "data_offset": 0, 00:14:59.158 "data_size": 65536 00:14:59.158 }, 00:14:59.158 { 00:14:59.158 "name": "BaseBdev2", 00:14:59.158 "uuid": "58acae6b-8602-4f6e-aa63-ce08b22be609", 00:14:59.158 "is_configured": true, 00:14:59.158 "data_offset": 0, 00:14:59.159 "data_size": 65536 00:14:59.159 }, 00:14:59.159 { 00:14:59.159 "name": "BaseBdev3", 00:14:59.159 "uuid": "b9c25bd1-b1cd-4769-88e2-3907653a840b", 00:14:59.159 "is_configured": true, 00:14:59.159 "data_offset": 0, 00:14:59.159 "data_size": 
65536 00:14:59.159 } 00:14:59.159 ] 00:14:59.159 } 00:14:59.159 } 00:14:59.159 }' 00:14:59.159 17:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:59.419 17:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:59.419 BaseBdev2 00:14:59.419 BaseBdev3' 00:14:59.419 17:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:59.419 17:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:59.419 17:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:59.419 17:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:59.419 "name": "NewBaseBdev", 00:14:59.419 "aliases": [ 00:14:59.419 "fcb0b706-8b6d-4c6e-904f-5104eda8d5b8" 00:14:59.419 ], 00:14:59.419 "product_name": "Malloc disk", 00:14:59.419 "block_size": 512, 00:14:59.419 "num_blocks": 65536, 00:14:59.419 "uuid": "fcb0b706-8b6d-4c6e-904f-5104eda8d5b8", 00:14:59.419 "assigned_rate_limits": { 00:14:59.419 "rw_ios_per_sec": 0, 00:14:59.419 "rw_mbytes_per_sec": 0, 00:14:59.419 "r_mbytes_per_sec": 0, 00:14:59.419 "w_mbytes_per_sec": 0 00:14:59.419 }, 00:14:59.419 "claimed": true, 00:14:59.419 "claim_type": "exclusive_write", 00:14:59.419 "zoned": false, 00:14:59.419 "supported_io_types": { 00:14:59.419 "read": true, 00:14:59.419 "write": true, 00:14:59.419 "unmap": true, 00:14:59.419 "flush": true, 00:14:59.419 "reset": true, 00:14:59.419 "nvme_admin": false, 00:14:59.419 "nvme_io": false, 00:14:59.419 "nvme_io_md": false, 00:14:59.419 "write_zeroes": true, 00:14:59.419 "zcopy": true, 00:14:59.419 "get_zone_info": false, 00:14:59.419 "zone_management": false, 00:14:59.419 "zone_append": false, 00:14:59.419 "compare": false, 00:14:59.419 "compare_and_write": false, 00:14:59.419 "abort": true, 00:14:59.419 "seek_hole": false, 00:14:59.419 "seek_data": false, 00:14:59.419 "copy": true, 00:14:59.419 "nvme_iov_md": false 00:14:59.419 }, 00:14:59.419 "memory_domains": [ 00:14:59.419 { 00:14:59.419 "dma_device_id": "system", 00:14:59.419 "dma_device_type": 1 00:14:59.419 }, 00:14:59.419 { 00:14:59.419 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:59.419 "dma_device_type": 2 00:14:59.419 } 00:14:59.419 ], 00:14:59.419 "driver_specific": {} 00:14:59.419 }' 00:14:59.419 17:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:59.679 17:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:59.679 17:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:59.679 17:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:59.679 17:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:59.679 17:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:59.679 17:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:59.679 17:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:59.679 17:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:59.679 17:27:10 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:59.940 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:59.940 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:59.940 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:59.940 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:59.940 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:00.199 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:00.199 "name": "BaseBdev2", 00:15:00.199 "aliases": [ 00:15:00.199 "58acae6b-8602-4f6e-aa63-ce08b22be609" 00:15:00.199 ], 00:15:00.199 "product_name": "Malloc disk", 00:15:00.199 "block_size": 512, 00:15:00.199 "num_blocks": 65536, 00:15:00.199 "uuid": "58acae6b-8602-4f6e-aa63-ce08b22be609", 00:15:00.199 "assigned_rate_limits": { 00:15:00.199 "rw_ios_per_sec": 0, 00:15:00.199 "rw_mbytes_per_sec": 0, 00:15:00.199 "r_mbytes_per_sec": 0, 00:15:00.199 "w_mbytes_per_sec": 0 00:15:00.199 }, 00:15:00.199 "claimed": true, 00:15:00.199 "claim_type": "exclusive_write", 00:15:00.199 "zoned": false, 00:15:00.199 "supported_io_types": { 00:15:00.199 "read": true, 00:15:00.199 "write": true, 00:15:00.199 "unmap": true, 00:15:00.199 "flush": true, 00:15:00.199 "reset": true, 00:15:00.199 "nvme_admin": false, 00:15:00.199 "nvme_io": false, 00:15:00.199 "nvme_io_md": false, 00:15:00.199 "write_zeroes": true, 00:15:00.199 "zcopy": true, 00:15:00.199 "get_zone_info": false, 00:15:00.199 "zone_management": false, 00:15:00.199 "zone_append": false, 00:15:00.199 "compare": false, 00:15:00.199 "compare_and_write": false, 00:15:00.199 "abort": true, 00:15:00.199 "seek_hole": false, 00:15:00.199 "seek_data": false, 00:15:00.199 "copy": true, 00:15:00.199 "nvme_iov_md": false 00:15:00.199 }, 00:15:00.199 "memory_domains": [ 00:15:00.199 { 00:15:00.199 "dma_device_id": "system", 00:15:00.199 "dma_device_type": 1 00:15:00.199 }, 00:15:00.199 { 00:15:00.199 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:00.199 "dma_device_type": 2 00:15:00.199 } 00:15:00.199 ], 00:15:00.199 "driver_specific": {} 00:15:00.199 }' 00:15:00.199 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:00.199 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:00.199 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:00.199 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:00.199 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:00.200 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:00.200 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:00.459 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:00.459 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:00.459 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:00.459 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:00.459 17:27:11 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:00.459 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:00.459 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:00.459 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:00.719 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:00.719 "name": "BaseBdev3", 00:15:00.719 "aliases": [ 00:15:00.719 "b9c25bd1-b1cd-4769-88e2-3907653a840b" 00:15:00.719 ], 00:15:00.719 "product_name": "Malloc disk", 00:15:00.719 "block_size": 512, 00:15:00.719 "num_blocks": 65536, 00:15:00.719 "uuid": "b9c25bd1-b1cd-4769-88e2-3907653a840b", 00:15:00.719 "assigned_rate_limits": { 00:15:00.719 "rw_ios_per_sec": 0, 00:15:00.719 "rw_mbytes_per_sec": 0, 00:15:00.719 "r_mbytes_per_sec": 0, 00:15:00.719 "w_mbytes_per_sec": 0 00:15:00.719 }, 00:15:00.719 "claimed": true, 00:15:00.719 "claim_type": "exclusive_write", 00:15:00.719 "zoned": false, 00:15:00.719 "supported_io_types": { 00:15:00.719 "read": true, 00:15:00.719 "write": true, 00:15:00.719 "unmap": true, 00:15:00.719 "flush": true, 00:15:00.719 "reset": true, 00:15:00.719 "nvme_admin": false, 00:15:00.719 "nvme_io": false, 00:15:00.719 "nvme_io_md": false, 00:15:00.719 "write_zeroes": true, 00:15:00.719 "zcopy": true, 00:15:00.719 "get_zone_info": false, 00:15:00.719 "zone_management": false, 00:15:00.719 "zone_append": false, 00:15:00.719 "compare": false, 00:15:00.719 "compare_and_write": false, 00:15:00.719 "abort": true, 00:15:00.719 "seek_hole": false, 00:15:00.719 "seek_data": false, 00:15:00.719 "copy": true, 00:15:00.719 "nvme_iov_md": false 00:15:00.719 }, 00:15:00.719 "memory_domains": [ 00:15:00.719 { 00:15:00.719 "dma_device_id": "system", 00:15:00.719 "dma_device_type": 1 00:15:00.719 }, 00:15:00.719 { 00:15:00.719 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:00.719 "dma_device_type": 2 00:15:00.719 } 00:15:00.719 ], 00:15:00.719 "driver_specific": {} 00:15:00.719 }' 00:15:00.719 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:00.719 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:00.719 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:00.719 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:00.720 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:00.720 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:00.720 17:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:00.979 17:27:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:00.979 17:27:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:00.979 17:27:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:00.979 17:27:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:00.979 17:27:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:00.979 17:27:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:01.239 [2024-07-15 17:27:12.307773] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:01.239 [2024-07-15 17:27:12.307794] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:01.239 [2024-07-15 17:27:12.307832] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:01.239 [2024-07-15 17:27:12.308034] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:01.239 [2024-07-15 17:27:12.308041] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2446630 name Existed_Raid, state offline 00:15:01.239 17:27:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2790621 00:15:01.239 17:27:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2790621 ']' 00:15:01.239 17:27:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2790621 00:15:01.239 17:27:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:15:01.239 17:27:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:01.239 17:27:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2790621 00:15:01.239 17:27:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:01.239 17:27:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:01.239 17:27:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2790621' 00:15:01.240 killing process with pid 2790621 00:15:01.240 17:27:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2790621 00:15:01.240 [2024-07-15 17:27:12.380729] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:01.240 17:27:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2790621 00:15:01.240 [2024-07-15 17:27:12.395657] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:01.240 17:27:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:15:01.240 00:15:01.240 real 0m23.959s 00:15:01.240 user 0m44.963s 00:15:01.240 sys 0m3.505s 00:15:01.240 17:27:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:01.240 17:27:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:01.240 ************************************ 00:15:01.240 END TEST raid_state_function_test 00:15:01.240 ************************************ 00:15:01.500 17:27:12 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:01.500 17:27:12 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:15:01.500 17:27:12 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:01.500 17:27:12 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:01.500 17:27:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:01.500 ************************************ 00:15:01.500 START TEST raid_state_function_test_sb 00:15:01.500 ************************************ 00:15:01.500 17:27:12 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 true 00:15:01.500 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:15:01.500 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:01.500 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:15:01.500 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:01.500 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:01.500 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:01.500 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:01.500 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:01.500 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:01.500 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:01.500 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:01.500 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:01.500 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:01.500 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:01.500 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:01.500 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:01.500 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:01.500 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:01.501 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:01.501 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:01.501 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:01.501 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:15:01.501 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:15:01.501 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:15:01.501 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:15:01.501 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2795872 00:15:01.501 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2795872' 00:15:01.501 Process raid pid: 2795872 00:15:01.501 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2795872 /var/tmp/spdk-raid.sock 00:15:01.501 17:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:01.501 17:27:12 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@829 -- # '[' -z 2795872 ']' 00:15:01.501 17:27:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:01.501 17:27:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:01.501 17:27:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:01.501 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:01.501 17:27:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:01.501 17:27:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:01.501 [2024-07-15 17:27:12.657605] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:15:01.501 [2024-07-15 17:27:12.657662] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:01.501 [2024-07-15 17:27:12.751561] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:01.761 [2024-07-15 17:27:12.828212] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:01.761 [2024-07-15 17:27:12.875044] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:01.761 [2024-07-15 17:27:12.875068] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:02.329 17:27:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:02.329 17:27:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:15:02.329 17:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:02.589 [2024-07-15 17:27:13.682894] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:02.589 [2024-07-15 17:27:13.682926] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:02.589 [2024-07-15 17:27:13.682932] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:02.589 [2024-07-15 17:27:13.682938] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:02.589 [2024-07-15 17:27:13.682942] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:02.589 [2024-07-15 17:27:13.682948] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:02.589 17:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:02.589 17:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:02.589 17:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:02.589 17:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:02.589 17:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
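Editor's note: the _sb trace above reduces to a short RPC sequence: launch bdev_svc on a private socket, register a RAID1 volume (with superblock, hence -s) whose base bdevs do not exist yet, and confirm that the volume sits in the "configuring" state. The sketch below replays that sequence outside the harness; the socket path, RPC names and bdev names are taken directly from the log, while the SPDK_DIR variable and the polling loop standing in for waitforlisten are illustrative assumptions.

SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk          # assumed checkout location
RPC="$SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Start a bare bdev application on the private RPC socket used by the raid tests.
"$SPDK_DIR/test/app/bdev_svc/bdev_svc" -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &

# Crude stand-in for the harness's waitforlisten: wait until the socket exists.
until [ -S /var/tmp/spdk-raid.sock ]; do sleep 0.1; done

# Create a RAID1 volume with a superblock (-s) over base bdevs that are not there yet;
# the volume is registered but stays in the "configuring" state, as the DEBUG lines show.
$RPC bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid

# The state field that verify_raid_bdev_state asserts on.
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'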
00:15:02.589 17:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:02.589 17:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:02.589 17:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:02.589 17:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:02.589 17:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:02.589 17:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:02.589 17:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:02.848 17:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:02.848 "name": "Existed_Raid", 00:15:02.848 "uuid": "d204cef8-3d02-4363-886f-1364c7ce62a7", 00:15:02.848 "strip_size_kb": 0, 00:15:02.848 "state": "configuring", 00:15:02.848 "raid_level": "raid1", 00:15:02.848 "superblock": true, 00:15:02.848 "num_base_bdevs": 3, 00:15:02.848 "num_base_bdevs_discovered": 0, 00:15:02.848 "num_base_bdevs_operational": 3, 00:15:02.848 "base_bdevs_list": [ 00:15:02.848 { 00:15:02.848 "name": "BaseBdev1", 00:15:02.848 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:02.848 "is_configured": false, 00:15:02.848 "data_offset": 0, 00:15:02.848 "data_size": 0 00:15:02.848 }, 00:15:02.848 { 00:15:02.848 "name": "BaseBdev2", 00:15:02.848 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:02.848 "is_configured": false, 00:15:02.848 "data_offset": 0, 00:15:02.848 "data_size": 0 00:15:02.848 }, 00:15:02.848 { 00:15:02.848 "name": "BaseBdev3", 00:15:02.848 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:02.848 "is_configured": false, 00:15:02.848 "data_offset": 0, 00:15:02.848 "data_size": 0 00:15:02.848 } 00:15:02.849 ] 00:15:02.849 }' 00:15:02.849 17:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:02.849 17:27:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:03.417 17:27:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:03.417 [2024-07-15 17:27:14.597092] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:03.417 [2024-07-15 17:27:14.597109] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e6e6d0 name Existed_Raid, state configuring 00:15:03.417 17:27:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:03.677 [2024-07-15 17:27:14.785589] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:03.677 [2024-07-15 17:27:14.785608] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:03.677 [2024-07-15 17:27:14.785613] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:03.677 [2024-07-15 17:27:14.785618] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev 
BaseBdev2 doesn't exist now 00:15:03.677 [2024-07-15 17:27:14.785623] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:03.677 [2024-07-15 17:27:14.785628] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:03.677 17:27:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:03.939 [2024-07-15 17:27:14.984793] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:03.939 BaseBdev1 00:15:03.939 17:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:03.939 17:27:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:03.939 17:27:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:03.939 17:27:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:03.939 17:27:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:03.939 17:27:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:03.939 17:27:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:03.939 17:27:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:04.199 [ 00:15:04.199 { 00:15:04.199 "name": "BaseBdev1", 00:15:04.199 "aliases": [ 00:15:04.200 "f8340761-2738-4072-a0b1-97935157eaff" 00:15:04.200 ], 00:15:04.200 "product_name": "Malloc disk", 00:15:04.200 "block_size": 512, 00:15:04.200 "num_blocks": 65536, 00:15:04.200 "uuid": "f8340761-2738-4072-a0b1-97935157eaff", 00:15:04.200 "assigned_rate_limits": { 00:15:04.200 "rw_ios_per_sec": 0, 00:15:04.200 "rw_mbytes_per_sec": 0, 00:15:04.200 "r_mbytes_per_sec": 0, 00:15:04.200 "w_mbytes_per_sec": 0 00:15:04.200 }, 00:15:04.200 "claimed": true, 00:15:04.200 "claim_type": "exclusive_write", 00:15:04.200 "zoned": false, 00:15:04.200 "supported_io_types": { 00:15:04.200 "read": true, 00:15:04.200 "write": true, 00:15:04.200 "unmap": true, 00:15:04.200 "flush": true, 00:15:04.200 "reset": true, 00:15:04.200 "nvme_admin": false, 00:15:04.200 "nvme_io": false, 00:15:04.200 "nvme_io_md": false, 00:15:04.200 "write_zeroes": true, 00:15:04.200 "zcopy": true, 00:15:04.200 "get_zone_info": false, 00:15:04.200 "zone_management": false, 00:15:04.200 "zone_append": false, 00:15:04.200 "compare": false, 00:15:04.200 "compare_and_write": false, 00:15:04.200 "abort": true, 00:15:04.200 "seek_hole": false, 00:15:04.200 "seek_data": false, 00:15:04.200 "copy": true, 00:15:04.200 "nvme_iov_md": false 00:15:04.200 }, 00:15:04.200 "memory_domains": [ 00:15:04.200 { 00:15:04.200 "dma_device_id": "system", 00:15:04.200 "dma_device_type": 1 00:15:04.200 }, 00:15:04.200 { 00:15:04.200 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:04.200 "dma_device_type": 2 00:15:04.200 } 00:15:04.200 ], 00:15:04.200 "driver_specific": {} 00:15:04.200 } 00:15:04.200 ] 00:15:04.200 17:27:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 
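Editor's note: bringing a missing base bdev online is equally mechanical. Create a malloc disk whose name matches one of the configured base bdevs and the raid module claims it as soon as it appears (the raid_bdev_configure_base_bdev DEBUG line above fires during bdev_malloc_create). The fragment below mirrors that step plus the waitforbdev pattern; it assumes the bdev_svc instance and RPC socket from the previous sketch are still running, and the sizes, names and timeout value are copied from the trace.

RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# A 32 MB malloc disk with 512-byte blocks, named after the first missing base bdev;
# the waiting raid claims it immediately.
$RPC bdev_malloc_create 32 512 -b BaseBdev1

# Roughly what waitforbdev does: wait for pending examine callbacks to finish, then
# look the bdev up with the same 2000 timeout the harness passes, so a missing bdev
# fails the call instead of hanging.
$RPC bdev_wait_for_examine
$RPC bdev_get_bdevs -b BaseBdev1 -t 2000 > /dev/null

# The raid should now report 1 of 3 base bdevs discovered while still "configuring".
$RPC bdev_raid_get_bdevs all \
  | jq -r '.[] | select(.name == "Existed_Raid") | "\(.state): \(.num_base_bdevs_discovered)/\(.num_base_bdevs)"'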
00:15:04.200 17:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:04.200 17:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:04.200 17:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:04.200 17:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:04.200 17:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:04.200 17:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:04.200 17:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:04.200 17:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:04.200 17:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:04.200 17:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:04.200 17:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:04.200 17:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:04.461 17:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:04.461 "name": "Existed_Raid", 00:15:04.461 "uuid": "95624bc2-8386-49d8-a7fc-f07d7339b329", 00:15:04.461 "strip_size_kb": 0, 00:15:04.461 "state": "configuring", 00:15:04.461 "raid_level": "raid1", 00:15:04.461 "superblock": true, 00:15:04.461 "num_base_bdevs": 3, 00:15:04.461 "num_base_bdevs_discovered": 1, 00:15:04.461 "num_base_bdevs_operational": 3, 00:15:04.461 "base_bdevs_list": [ 00:15:04.461 { 00:15:04.461 "name": "BaseBdev1", 00:15:04.461 "uuid": "f8340761-2738-4072-a0b1-97935157eaff", 00:15:04.461 "is_configured": true, 00:15:04.461 "data_offset": 2048, 00:15:04.461 "data_size": 63488 00:15:04.461 }, 00:15:04.461 { 00:15:04.461 "name": "BaseBdev2", 00:15:04.461 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:04.461 "is_configured": false, 00:15:04.461 "data_offset": 0, 00:15:04.461 "data_size": 0 00:15:04.461 }, 00:15:04.461 { 00:15:04.461 "name": "BaseBdev3", 00:15:04.461 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:04.461 "is_configured": false, 00:15:04.461 "data_offset": 0, 00:15:04.461 "data_size": 0 00:15:04.461 } 00:15:04.461 ] 00:15:04.461 }' 00:15:04.461 17:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:04.461 17:27:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:05.032 17:27:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:05.032 [2024-07-15 17:27:16.284075] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:05.032 [2024-07-15 17:27:16.284104] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e6dfa0 name Existed_Raid, state configuring 00:15:05.032 17:27:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:05.293 [2024-07-15 17:27:16.480599] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:05.293 [2024-07-15 17:27:16.481751] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:05.293 [2024-07-15 17:27:16.481776] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:05.293 [2024-07-15 17:27:16.481782] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:05.293 [2024-07-15 17:27:16.481788] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:05.293 17:27:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:05.293 17:27:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:05.293 17:27:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:05.293 17:27:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:05.293 17:27:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:05.293 17:27:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:05.293 17:27:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:05.293 17:27:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:05.293 17:27:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:05.293 17:27:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:05.293 17:27:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:05.293 17:27:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:05.293 17:27:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:05.293 17:27:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:05.553 17:27:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:05.553 "name": "Existed_Raid", 00:15:05.553 "uuid": "4596b9a1-2e94-47a6-ac54-071eb78445ab", 00:15:05.553 "strip_size_kb": 0, 00:15:05.553 "state": "configuring", 00:15:05.553 "raid_level": "raid1", 00:15:05.553 "superblock": true, 00:15:05.553 "num_base_bdevs": 3, 00:15:05.553 "num_base_bdevs_discovered": 1, 00:15:05.553 "num_base_bdevs_operational": 3, 00:15:05.553 "base_bdevs_list": [ 00:15:05.553 { 00:15:05.553 "name": "BaseBdev1", 00:15:05.553 "uuid": "f8340761-2738-4072-a0b1-97935157eaff", 00:15:05.553 "is_configured": true, 00:15:05.553 "data_offset": 2048, 00:15:05.553 "data_size": 63488 00:15:05.553 }, 00:15:05.553 { 00:15:05.553 "name": "BaseBdev2", 00:15:05.553 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:05.553 "is_configured": false, 00:15:05.553 "data_offset": 0, 00:15:05.553 "data_size": 0 00:15:05.553 }, 00:15:05.553 { 00:15:05.553 "name": 
"BaseBdev3", 00:15:05.553 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:05.553 "is_configured": false, 00:15:05.553 "data_offset": 0, 00:15:05.553 "data_size": 0 00:15:05.553 } 00:15:05.553 ] 00:15:05.553 }' 00:15:05.553 17:27:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:05.553 17:27:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:06.122 17:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:06.382 [2024-07-15 17:27:17.435898] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:06.382 BaseBdev2 00:15:06.382 17:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:06.382 17:27:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:06.382 17:27:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:06.382 17:27:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:06.382 17:27:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:06.382 17:27:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:06.382 17:27:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:06.382 17:27:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:06.642 [ 00:15:06.642 { 00:15:06.642 "name": "BaseBdev2", 00:15:06.642 "aliases": [ 00:15:06.642 "84191000-e884-480c-a177-ca3c55d5a7fd" 00:15:06.642 ], 00:15:06.642 "product_name": "Malloc disk", 00:15:06.642 "block_size": 512, 00:15:06.642 "num_blocks": 65536, 00:15:06.642 "uuid": "84191000-e884-480c-a177-ca3c55d5a7fd", 00:15:06.642 "assigned_rate_limits": { 00:15:06.642 "rw_ios_per_sec": 0, 00:15:06.642 "rw_mbytes_per_sec": 0, 00:15:06.642 "r_mbytes_per_sec": 0, 00:15:06.642 "w_mbytes_per_sec": 0 00:15:06.642 }, 00:15:06.642 "claimed": true, 00:15:06.642 "claim_type": "exclusive_write", 00:15:06.642 "zoned": false, 00:15:06.642 "supported_io_types": { 00:15:06.642 "read": true, 00:15:06.642 "write": true, 00:15:06.642 "unmap": true, 00:15:06.642 "flush": true, 00:15:06.642 "reset": true, 00:15:06.642 "nvme_admin": false, 00:15:06.642 "nvme_io": false, 00:15:06.642 "nvme_io_md": false, 00:15:06.642 "write_zeroes": true, 00:15:06.642 "zcopy": true, 00:15:06.642 "get_zone_info": false, 00:15:06.642 "zone_management": false, 00:15:06.642 "zone_append": false, 00:15:06.642 "compare": false, 00:15:06.642 "compare_and_write": false, 00:15:06.642 "abort": true, 00:15:06.642 "seek_hole": false, 00:15:06.642 "seek_data": false, 00:15:06.642 "copy": true, 00:15:06.642 "nvme_iov_md": false 00:15:06.642 }, 00:15:06.642 "memory_domains": [ 00:15:06.642 { 00:15:06.642 "dma_device_id": "system", 00:15:06.642 "dma_device_type": 1 00:15:06.642 }, 00:15:06.642 { 00:15:06.642 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:06.642 "dma_device_type": 2 00:15:06.642 } 00:15:06.642 ], 00:15:06.642 "driver_specific": {} 
00:15:06.642 } 00:15:06.642 ] 00:15:06.642 17:27:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:06.642 17:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:06.642 17:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:06.642 17:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:06.642 17:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:06.642 17:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:06.642 17:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:06.642 17:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:06.642 17:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:06.642 17:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:06.642 17:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:06.642 17:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:06.642 17:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:06.642 17:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:06.642 17:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:06.902 17:27:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:06.902 "name": "Existed_Raid", 00:15:06.902 "uuid": "4596b9a1-2e94-47a6-ac54-071eb78445ab", 00:15:06.902 "strip_size_kb": 0, 00:15:06.902 "state": "configuring", 00:15:06.902 "raid_level": "raid1", 00:15:06.902 "superblock": true, 00:15:06.902 "num_base_bdevs": 3, 00:15:06.902 "num_base_bdevs_discovered": 2, 00:15:06.902 "num_base_bdevs_operational": 3, 00:15:06.902 "base_bdevs_list": [ 00:15:06.902 { 00:15:06.902 "name": "BaseBdev1", 00:15:06.902 "uuid": "f8340761-2738-4072-a0b1-97935157eaff", 00:15:06.902 "is_configured": true, 00:15:06.902 "data_offset": 2048, 00:15:06.902 "data_size": 63488 00:15:06.902 }, 00:15:06.902 { 00:15:06.902 "name": "BaseBdev2", 00:15:06.902 "uuid": "84191000-e884-480c-a177-ca3c55d5a7fd", 00:15:06.902 "is_configured": true, 00:15:06.902 "data_offset": 2048, 00:15:06.902 "data_size": 63488 00:15:06.902 }, 00:15:06.902 { 00:15:06.902 "name": "BaseBdev3", 00:15:06.902 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:06.902 "is_configured": false, 00:15:06.902 "data_offset": 0, 00:15:06.902 "data_size": 0 00:15:06.902 } 00:15:06.902 ] 00:15:06.902 }' 00:15:06.902 17:27:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:06.902 17:27:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:07.472 17:27:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:07.732 [2024-07-15 
17:27:18.796208] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:07.732 [2024-07-15 17:27:18.796331] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e6ee90 00:15:07.732 [2024-07-15 17:27:18.796339] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:07.732 [2024-07-15 17:27:18.796476] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e6eb60 00:15:07.732 [2024-07-15 17:27:18.796573] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e6ee90 00:15:07.732 [2024-07-15 17:27:18.796579] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1e6ee90 00:15:07.732 [2024-07-15 17:27:18.796646] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:07.732 BaseBdev3 00:15:07.732 17:27:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:07.732 17:27:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:07.732 17:27:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:07.732 17:27:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:07.732 17:27:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:07.732 17:27:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:07.732 17:27:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:07.732 17:27:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:07.992 [ 00:15:07.992 { 00:15:07.992 "name": "BaseBdev3", 00:15:07.992 "aliases": [ 00:15:07.992 "9c4a5f12-c376-401f-be2d-d2b81630ca4a" 00:15:07.992 ], 00:15:07.992 "product_name": "Malloc disk", 00:15:07.992 "block_size": 512, 00:15:07.992 "num_blocks": 65536, 00:15:07.992 "uuid": "9c4a5f12-c376-401f-be2d-d2b81630ca4a", 00:15:07.992 "assigned_rate_limits": { 00:15:07.992 "rw_ios_per_sec": 0, 00:15:07.992 "rw_mbytes_per_sec": 0, 00:15:07.992 "r_mbytes_per_sec": 0, 00:15:07.992 "w_mbytes_per_sec": 0 00:15:07.992 }, 00:15:07.992 "claimed": true, 00:15:07.992 "claim_type": "exclusive_write", 00:15:07.992 "zoned": false, 00:15:07.992 "supported_io_types": { 00:15:07.992 "read": true, 00:15:07.992 "write": true, 00:15:07.992 "unmap": true, 00:15:07.992 "flush": true, 00:15:07.992 "reset": true, 00:15:07.992 "nvme_admin": false, 00:15:07.992 "nvme_io": false, 00:15:07.992 "nvme_io_md": false, 00:15:07.992 "write_zeroes": true, 00:15:07.992 "zcopy": true, 00:15:07.992 "get_zone_info": false, 00:15:07.992 "zone_management": false, 00:15:07.992 "zone_append": false, 00:15:07.992 "compare": false, 00:15:07.992 "compare_and_write": false, 00:15:07.992 "abort": true, 00:15:07.992 "seek_hole": false, 00:15:07.992 "seek_data": false, 00:15:07.992 "copy": true, 00:15:07.992 "nvme_iov_md": false 00:15:07.992 }, 00:15:07.992 "memory_domains": [ 00:15:07.992 { 00:15:07.992 "dma_device_id": "system", 00:15:07.992 "dma_device_type": 1 00:15:07.992 }, 00:15:07.992 { 00:15:07.992 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:07.992 
"dma_device_type": 2 00:15:07.992 } 00:15:07.992 ], 00:15:07.992 "driver_specific": {} 00:15:07.992 } 00:15:07.992 ] 00:15:07.992 17:27:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:07.992 17:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:07.992 17:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:07.992 17:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:15:07.992 17:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:07.992 17:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:07.992 17:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:07.992 17:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:07.992 17:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:07.992 17:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:07.992 17:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:07.992 17:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:07.992 17:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:07.992 17:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:07.992 17:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:08.252 17:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:08.252 "name": "Existed_Raid", 00:15:08.252 "uuid": "4596b9a1-2e94-47a6-ac54-071eb78445ab", 00:15:08.252 "strip_size_kb": 0, 00:15:08.252 "state": "online", 00:15:08.252 "raid_level": "raid1", 00:15:08.252 "superblock": true, 00:15:08.252 "num_base_bdevs": 3, 00:15:08.252 "num_base_bdevs_discovered": 3, 00:15:08.252 "num_base_bdevs_operational": 3, 00:15:08.252 "base_bdevs_list": [ 00:15:08.252 { 00:15:08.252 "name": "BaseBdev1", 00:15:08.252 "uuid": "f8340761-2738-4072-a0b1-97935157eaff", 00:15:08.252 "is_configured": true, 00:15:08.252 "data_offset": 2048, 00:15:08.252 "data_size": 63488 00:15:08.252 }, 00:15:08.252 { 00:15:08.252 "name": "BaseBdev2", 00:15:08.252 "uuid": "84191000-e884-480c-a177-ca3c55d5a7fd", 00:15:08.252 "is_configured": true, 00:15:08.252 "data_offset": 2048, 00:15:08.252 "data_size": 63488 00:15:08.252 }, 00:15:08.252 { 00:15:08.252 "name": "BaseBdev3", 00:15:08.252 "uuid": "9c4a5f12-c376-401f-be2d-d2b81630ca4a", 00:15:08.252 "is_configured": true, 00:15:08.252 "data_offset": 2048, 00:15:08.252 "data_size": 63488 00:15:08.252 } 00:15:08.252 ] 00:15:08.252 }' 00:15:08.252 17:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:08.252 17:27:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:08.822 17:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:08.822 17:27:19 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:08.822 17:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:08.822 17:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:08.822 17:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:08.822 17:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:08.822 17:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:08.822 17:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:08.822 [2024-07-15 17:27:20.103755] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:09.082 17:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:09.082 "name": "Existed_Raid", 00:15:09.082 "aliases": [ 00:15:09.082 "4596b9a1-2e94-47a6-ac54-071eb78445ab" 00:15:09.082 ], 00:15:09.082 "product_name": "Raid Volume", 00:15:09.082 "block_size": 512, 00:15:09.082 "num_blocks": 63488, 00:15:09.082 "uuid": "4596b9a1-2e94-47a6-ac54-071eb78445ab", 00:15:09.082 "assigned_rate_limits": { 00:15:09.082 "rw_ios_per_sec": 0, 00:15:09.082 "rw_mbytes_per_sec": 0, 00:15:09.082 "r_mbytes_per_sec": 0, 00:15:09.082 "w_mbytes_per_sec": 0 00:15:09.082 }, 00:15:09.082 "claimed": false, 00:15:09.082 "zoned": false, 00:15:09.082 "supported_io_types": { 00:15:09.082 "read": true, 00:15:09.082 "write": true, 00:15:09.082 "unmap": false, 00:15:09.082 "flush": false, 00:15:09.082 "reset": true, 00:15:09.082 "nvme_admin": false, 00:15:09.082 "nvme_io": false, 00:15:09.082 "nvme_io_md": false, 00:15:09.082 "write_zeroes": true, 00:15:09.082 "zcopy": false, 00:15:09.082 "get_zone_info": false, 00:15:09.082 "zone_management": false, 00:15:09.082 "zone_append": false, 00:15:09.082 "compare": false, 00:15:09.082 "compare_and_write": false, 00:15:09.082 "abort": false, 00:15:09.082 "seek_hole": false, 00:15:09.082 "seek_data": false, 00:15:09.082 "copy": false, 00:15:09.082 "nvme_iov_md": false 00:15:09.082 }, 00:15:09.082 "memory_domains": [ 00:15:09.082 { 00:15:09.082 "dma_device_id": "system", 00:15:09.082 "dma_device_type": 1 00:15:09.082 }, 00:15:09.082 { 00:15:09.082 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:09.082 "dma_device_type": 2 00:15:09.082 }, 00:15:09.082 { 00:15:09.082 "dma_device_id": "system", 00:15:09.082 "dma_device_type": 1 00:15:09.082 }, 00:15:09.082 { 00:15:09.082 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:09.082 "dma_device_type": 2 00:15:09.082 }, 00:15:09.082 { 00:15:09.082 "dma_device_id": "system", 00:15:09.082 "dma_device_type": 1 00:15:09.082 }, 00:15:09.082 { 00:15:09.082 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:09.082 "dma_device_type": 2 00:15:09.082 } 00:15:09.082 ], 00:15:09.082 "driver_specific": { 00:15:09.082 "raid": { 00:15:09.082 "uuid": "4596b9a1-2e94-47a6-ac54-071eb78445ab", 00:15:09.082 "strip_size_kb": 0, 00:15:09.082 "state": "online", 00:15:09.082 "raid_level": "raid1", 00:15:09.082 "superblock": true, 00:15:09.082 "num_base_bdevs": 3, 00:15:09.082 "num_base_bdevs_discovered": 3, 00:15:09.082 "num_base_bdevs_operational": 3, 00:15:09.082 "base_bdevs_list": [ 00:15:09.082 { 00:15:09.082 "name": "BaseBdev1", 00:15:09.082 "uuid": 
"f8340761-2738-4072-a0b1-97935157eaff", 00:15:09.082 "is_configured": true, 00:15:09.082 "data_offset": 2048, 00:15:09.082 "data_size": 63488 00:15:09.082 }, 00:15:09.082 { 00:15:09.082 "name": "BaseBdev2", 00:15:09.082 "uuid": "84191000-e884-480c-a177-ca3c55d5a7fd", 00:15:09.082 "is_configured": true, 00:15:09.082 "data_offset": 2048, 00:15:09.082 "data_size": 63488 00:15:09.082 }, 00:15:09.082 { 00:15:09.082 "name": "BaseBdev3", 00:15:09.082 "uuid": "9c4a5f12-c376-401f-be2d-d2b81630ca4a", 00:15:09.082 "is_configured": true, 00:15:09.082 "data_offset": 2048, 00:15:09.082 "data_size": 63488 00:15:09.082 } 00:15:09.082 ] 00:15:09.082 } 00:15:09.082 } 00:15:09.082 }' 00:15:09.082 17:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:09.082 17:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:09.082 BaseBdev2 00:15:09.082 BaseBdev3' 00:15:09.082 17:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:09.082 17:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:09.082 17:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:09.082 17:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:09.082 "name": "BaseBdev1", 00:15:09.082 "aliases": [ 00:15:09.082 "f8340761-2738-4072-a0b1-97935157eaff" 00:15:09.082 ], 00:15:09.082 "product_name": "Malloc disk", 00:15:09.082 "block_size": 512, 00:15:09.082 "num_blocks": 65536, 00:15:09.082 "uuid": "f8340761-2738-4072-a0b1-97935157eaff", 00:15:09.082 "assigned_rate_limits": { 00:15:09.082 "rw_ios_per_sec": 0, 00:15:09.082 "rw_mbytes_per_sec": 0, 00:15:09.082 "r_mbytes_per_sec": 0, 00:15:09.082 "w_mbytes_per_sec": 0 00:15:09.082 }, 00:15:09.082 "claimed": true, 00:15:09.082 "claim_type": "exclusive_write", 00:15:09.083 "zoned": false, 00:15:09.083 "supported_io_types": { 00:15:09.083 "read": true, 00:15:09.083 "write": true, 00:15:09.083 "unmap": true, 00:15:09.083 "flush": true, 00:15:09.083 "reset": true, 00:15:09.083 "nvme_admin": false, 00:15:09.083 "nvme_io": false, 00:15:09.083 "nvme_io_md": false, 00:15:09.083 "write_zeroes": true, 00:15:09.083 "zcopy": true, 00:15:09.083 "get_zone_info": false, 00:15:09.083 "zone_management": false, 00:15:09.083 "zone_append": false, 00:15:09.083 "compare": false, 00:15:09.083 "compare_and_write": false, 00:15:09.083 "abort": true, 00:15:09.083 "seek_hole": false, 00:15:09.083 "seek_data": false, 00:15:09.083 "copy": true, 00:15:09.083 "nvme_iov_md": false 00:15:09.083 }, 00:15:09.083 "memory_domains": [ 00:15:09.083 { 00:15:09.083 "dma_device_id": "system", 00:15:09.083 "dma_device_type": 1 00:15:09.083 }, 00:15:09.083 { 00:15:09.083 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:09.083 "dma_device_type": 2 00:15:09.083 } 00:15:09.083 ], 00:15:09.083 "driver_specific": {} 00:15:09.083 }' 00:15:09.083 17:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:09.342 17:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:09.342 17:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:09.342 17:27:20 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:09.342 17:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:09.342 17:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:09.342 17:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:09.602 17:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:09.602 17:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:09.602 17:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:09.602 17:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:09.602 17:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:09.602 17:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:09.602 17:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:09.602 17:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:09.862 17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:09.862 "name": "BaseBdev2", 00:15:09.862 "aliases": [ 00:15:09.862 "84191000-e884-480c-a177-ca3c55d5a7fd" 00:15:09.862 ], 00:15:09.862 "product_name": "Malloc disk", 00:15:09.862 "block_size": 512, 00:15:09.862 "num_blocks": 65536, 00:15:09.862 "uuid": "84191000-e884-480c-a177-ca3c55d5a7fd", 00:15:09.862 "assigned_rate_limits": { 00:15:09.862 "rw_ios_per_sec": 0, 00:15:09.862 "rw_mbytes_per_sec": 0, 00:15:09.862 "r_mbytes_per_sec": 0, 00:15:09.862 "w_mbytes_per_sec": 0 00:15:09.862 }, 00:15:09.862 "claimed": true, 00:15:09.862 "claim_type": "exclusive_write", 00:15:09.862 "zoned": false, 00:15:09.862 "supported_io_types": { 00:15:09.862 "read": true, 00:15:09.862 "write": true, 00:15:09.862 "unmap": true, 00:15:09.862 "flush": true, 00:15:09.862 "reset": true, 00:15:09.862 "nvme_admin": false, 00:15:09.862 "nvme_io": false, 00:15:09.862 "nvme_io_md": false, 00:15:09.862 "write_zeroes": true, 00:15:09.862 "zcopy": true, 00:15:09.862 "get_zone_info": false, 00:15:09.862 "zone_management": false, 00:15:09.862 "zone_append": false, 00:15:09.862 "compare": false, 00:15:09.862 "compare_and_write": false, 00:15:09.862 "abort": true, 00:15:09.862 "seek_hole": false, 00:15:09.862 "seek_data": false, 00:15:09.862 "copy": true, 00:15:09.862 "nvme_iov_md": false 00:15:09.862 }, 00:15:09.862 "memory_domains": [ 00:15:09.862 { 00:15:09.862 "dma_device_id": "system", 00:15:09.862 "dma_device_type": 1 00:15:09.862 }, 00:15:09.862 { 00:15:09.862 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:09.862 "dma_device_type": 2 00:15:09.862 } 00:15:09.862 ], 00:15:09.862 "driver_specific": {} 00:15:09.862 }' 00:15:09.862 17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:09.862 17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:09.862 17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:09.862 17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:10.123 17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:15:10.123 17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:10.123 17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:10.123 17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:10.123 17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:10.123 17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:10.123 17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:10.384 17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:10.384 17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:10.384 17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:10.384 17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:10.644 17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:10.644 "name": "BaseBdev3", 00:15:10.644 "aliases": [ 00:15:10.644 "9c4a5f12-c376-401f-be2d-d2b81630ca4a" 00:15:10.644 ], 00:15:10.644 "product_name": "Malloc disk", 00:15:10.644 "block_size": 512, 00:15:10.644 "num_blocks": 65536, 00:15:10.644 "uuid": "9c4a5f12-c376-401f-be2d-d2b81630ca4a", 00:15:10.644 "assigned_rate_limits": { 00:15:10.644 "rw_ios_per_sec": 0, 00:15:10.644 "rw_mbytes_per_sec": 0, 00:15:10.644 "r_mbytes_per_sec": 0, 00:15:10.644 "w_mbytes_per_sec": 0 00:15:10.644 }, 00:15:10.644 "claimed": true, 00:15:10.644 "claim_type": "exclusive_write", 00:15:10.644 "zoned": false, 00:15:10.644 "supported_io_types": { 00:15:10.644 "read": true, 00:15:10.644 "write": true, 00:15:10.644 "unmap": true, 00:15:10.644 "flush": true, 00:15:10.644 "reset": true, 00:15:10.644 "nvme_admin": false, 00:15:10.644 "nvme_io": false, 00:15:10.644 "nvme_io_md": false, 00:15:10.644 "write_zeroes": true, 00:15:10.644 "zcopy": true, 00:15:10.644 "get_zone_info": false, 00:15:10.644 "zone_management": false, 00:15:10.644 "zone_append": false, 00:15:10.644 "compare": false, 00:15:10.644 "compare_and_write": false, 00:15:10.644 "abort": true, 00:15:10.644 "seek_hole": false, 00:15:10.644 "seek_data": false, 00:15:10.644 "copy": true, 00:15:10.644 "nvme_iov_md": false 00:15:10.644 }, 00:15:10.644 "memory_domains": [ 00:15:10.644 { 00:15:10.644 "dma_device_id": "system", 00:15:10.644 "dma_device_type": 1 00:15:10.644 }, 00:15:10.644 { 00:15:10.644 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:10.644 "dma_device_type": 2 00:15:10.644 } 00:15:10.644 ], 00:15:10.644 "driver_specific": {} 00:15:10.644 }' 00:15:10.644 17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:10.644 17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:10.644 17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:10.644 17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:10.644 17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:10.644 17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:10.644 
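(A minimal sketch of the per-base-bdev property check being traced here, assuming only what the log itself shows: the SPDK RPC socket at /var/tmp/spdk-raid.sock, the rpc.py path from this workspace, and a Malloc base bdev named BaseBdev3. The shell variable names rpc_py and info are illustrative, not part of the test script.)

rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# bdev_get_bdevs -b returns a one-element JSON array; jq '.[]' unwraps the bdev object.
info=$($rpc_py bdev_get_bdevs -b BaseBdev3 | jq '.[]')
# The assertions mirror the [[ 512 == 512 ]] / [[ null == null ]] xtrace lines above:
[[ $(echo "$info" | jq .block_size) == 512 ]]       # base bdevs use 512-byte blocks
[[ $(echo "$info" | jq .md_size) == null ]]         # no separate metadata area
[[ $(echo "$info" | jq .md_interleave) == null ]]   # hence no interleaved metadata
[[ $(echo "$info" | jq .dif_type) == null ]]        # and no DIF protection type
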
17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:10.644 17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:10.904 17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:10.904 17:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:10.904 17:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:10.904 17:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:10.904 17:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:11.164 [2024-07-15 17:27:22.236953] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:11.164 17:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:11.164 17:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:15:11.164 17:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:11.164 17:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:15:11.164 17:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:15:11.164 17:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:15:11.164 17:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:11.164 17:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:11.164 17:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:11.164 17:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:11.164 17:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:11.164 17:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:11.164 17:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:11.164 17:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:11.164 17:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:11.164 17:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:11.164 17:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:11.164 17:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:11.164 "name": "Existed_Raid", 00:15:11.164 "uuid": "4596b9a1-2e94-47a6-ac54-071eb78445ab", 00:15:11.164 "strip_size_kb": 0, 00:15:11.164 "state": "online", 00:15:11.164 "raid_level": "raid1", 00:15:11.164 "superblock": true, 00:15:11.164 "num_base_bdevs": 3, 00:15:11.164 "num_base_bdevs_discovered": 2, 00:15:11.164 "num_base_bdevs_operational": 2, 00:15:11.164 "base_bdevs_list": [ 00:15:11.164 { 00:15:11.164 "name": null, 00:15:11.164 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:15:11.164 "is_configured": false, 00:15:11.164 "data_offset": 2048, 00:15:11.164 "data_size": 63488 00:15:11.164 }, 00:15:11.164 { 00:15:11.164 "name": "BaseBdev2", 00:15:11.164 "uuid": "84191000-e884-480c-a177-ca3c55d5a7fd", 00:15:11.164 "is_configured": true, 00:15:11.164 "data_offset": 2048, 00:15:11.164 "data_size": 63488 00:15:11.164 }, 00:15:11.164 { 00:15:11.164 "name": "BaseBdev3", 00:15:11.164 "uuid": "9c4a5f12-c376-401f-be2d-d2b81630ca4a", 00:15:11.164 "is_configured": true, 00:15:11.164 "data_offset": 2048, 00:15:11.164 "data_size": 63488 00:15:11.164 } 00:15:11.164 ] 00:15:11.164 }' 00:15:11.164 17:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:11.164 17:27:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:11.732 17:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:11.732 17:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:11.732 17:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:11.732 17:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:11.991 17:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:11.991 17:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:11.991 17:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:12.250 [2024-07-15 17:27:23.379863] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:12.250 17:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:12.250 17:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:12.250 17:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:12.250 17:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:12.509 17:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:12.509 17:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:12.509 17:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:12.509 [2024-07-15 17:27:23.770751] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:12.509 [2024-07-15 17:27:23.770813] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:12.509 [2024-07-15 17:27:23.776798] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:12.509 [2024-07-15 17:27:23.776825] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:12.509 [2024-07-15 17:27:23.776831] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e6ee90 
name Existed_Raid, state offline 00:15:12.509 17:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:12.509 17:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:12.509 17:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:12.509 17:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:12.769 17:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:12.769 17:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:12.769 17:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:12.769 17:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:12.769 17:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:12.769 17:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:13.028 BaseBdev2 00:15:13.028 17:27:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:13.028 17:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:13.028 17:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:13.028 17:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:13.028 17:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:13.028 17:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:13.028 17:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:13.288 17:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:13.288 [ 00:15:13.288 { 00:15:13.288 "name": "BaseBdev2", 00:15:13.288 "aliases": [ 00:15:13.288 "072c3b18-db47-4f86-ad1d-065c2856e00f" 00:15:13.288 ], 00:15:13.288 "product_name": "Malloc disk", 00:15:13.288 "block_size": 512, 00:15:13.288 "num_blocks": 65536, 00:15:13.288 "uuid": "072c3b18-db47-4f86-ad1d-065c2856e00f", 00:15:13.288 "assigned_rate_limits": { 00:15:13.288 "rw_ios_per_sec": 0, 00:15:13.288 "rw_mbytes_per_sec": 0, 00:15:13.288 "r_mbytes_per_sec": 0, 00:15:13.288 "w_mbytes_per_sec": 0 00:15:13.288 }, 00:15:13.288 "claimed": false, 00:15:13.288 "zoned": false, 00:15:13.288 "supported_io_types": { 00:15:13.288 "read": true, 00:15:13.288 "write": true, 00:15:13.288 "unmap": true, 00:15:13.288 "flush": true, 00:15:13.288 "reset": true, 00:15:13.288 "nvme_admin": false, 00:15:13.288 "nvme_io": false, 00:15:13.288 "nvme_io_md": false, 00:15:13.288 "write_zeroes": true, 00:15:13.288 "zcopy": true, 00:15:13.288 "get_zone_info": false, 00:15:13.288 "zone_management": false, 00:15:13.288 "zone_append": false, 00:15:13.288 "compare": false, 00:15:13.288 
"compare_and_write": false, 00:15:13.288 "abort": true, 00:15:13.288 "seek_hole": false, 00:15:13.288 "seek_data": false, 00:15:13.288 "copy": true, 00:15:13.288 "nvme_iov_md": false 00:15:13.288 }, 00:15:13.288 "memory_domains": [ 00:15:13.288 { 00:15:13.288 "dma_device_id": "system", 00:15:13.288 "dma_device_type": 1 00:15:13.288 }, 00:15:13.288 { 00:15:13.288 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:13.288 "dma_device_type": 2 00:15:13.288 } 00:15:13.288 ], 00:15:13.288 "driver_specific": {} 00:15:13.288 } 00:15:13.288 ] 00:15:13.288 17:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:13.288 17:27:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:13.288 17:27:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:13.288 17:27:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:13.548 BaseBdev3 00:15:13.548 17:27:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:13.548 17:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:13.548 17:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:13.548 17:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:13.548 17:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:13.548 17:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:13.548 17:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:13.835 17:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:13.835 [ 00:15:13.835 { 00:15:13.835 "name": "BaseBdev3", 00:15:13.835 "aliases": [ 00:15:13.835 "66604fbc-01e8-465c-8f31-863a6db6acd0" 00:15:13.835 ], 00:15:13.835 "product_name": "Malloc disk", 00:15:13.835 "block_size": 512, 00:15:13.835 "num_blocks": 65536, 00:15:13.835 "uuid": "66604fbc-01e8-465c-8f31-863a6db6acd0", 00:15:13.835 "assigned_rate_limits": { 00:15:13.835 "rw_ios_per_sec": 0, 00:15:13.835 "rw_mbytes_per_sec": 0, 00:15:13.835 "r_mbytes_per_sec": 0, 00:15:13.835 "w_mbytes_per_sec": 0 00:15:13.835 }, 00:15:13.835 "claimed": false, 00:15:13.835 "zoned": false, 00:15:13.835 "supported_io_types": { 00:15:13.835 "read": true, 00:15:13.835 "write": true, 00:15:13.835 "unmap": true, 00:15:13.835 "flush": true, 00:15:13.835 "reset": true, 00:15:13.835 "nvme_admin": false, 00:15:13.835 "nvme_io": false, 00:15:13.835 "nvme_io_md": false, 00:15:13.835 "write_zeroes": true, 00:15:13.835 "zcopy": true, 00:15:13.835 "get_zone_info": false, 00:15:13.835 "zone_management": false, 00:15:13.835 "zone_append": false, 00:15:13.835 "compare": false, 00:15:13.835 "compare_and_write": false, 00:15:13.835 "abort": true, 00:15:13.835 "seek_hole": false, 00:15:13.835 "seek_data": false, 00:15:13.835 "copy": true, 00:15:13.835 "nvme_iov_md": false 00:15:13.835 }, 00:15:13.835 "memory_domains": [ 00:15:13.835 { 
00:15:13.835 "dma_device_id": "system", 00:15:13.835 "dma_device_type": 1 00:15:13.835 }, 00:15:13.835 { 00:15:13.835 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:13.835 "dma_device_type": 2 00:15:13.835 } 00:15:13.835 ], 00:15:13.835 "driver_specific": {} 00:15:13.835 } 00:15:13.835 ] 00:15:13.835 17:27:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:13.835 17:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:13.835 17:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:13.835 17:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:14.099 [2024-07-15 17:27:25.274421] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:14.099 [2024-07-15 17:27:25.274449] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:14.099 [2024-07-15 17:27:25.274461] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:14.099 [2024-07-15 17:27:25.275498] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:14.099 17:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:14.099 17:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:14.099 17:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:14.099 17:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:14.099 17:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:14.099 17:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:14.099 17:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:14.099 17:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:14.099 17:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:14.099 17:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:14.099 17:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:14.099 17:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:14.360 17:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:14.360 "name": "Existed_Raid", 00:15:14.360 "uuid": "6ecee5d6-3a92-4f74-a435-2e08f7028b07", 00:15:14.360 "strip_size_kb": 0, 00:15:14.360 "state": "configuring", 00:15:14.360 "raid_level": "raid1", 00:15:14.360 "superblock": true, 00:15:14.360 "num_base_bdevs": 3, 00:15:14.360 "num_base_bdevs_discovered": 2, 00:15:14.360 "num_base_bdevs_operational": 3, 00:15:14.360 "base_bdevs_list": [ 00:15:14.360 { 00:15:14.360 "name": "BaseBdev1", 00:15:14.360 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:14.360 "is_configured": 
false, 00:15:14.360 "data_offset": 0, 00:15:14.360 "data_size": 0 00:15:14.360 }, 00:15:14.360 { 00:15:14.360 "name": "BaseBdev2", 00:15:14.360 "uuid": "072c3b18-db47-4f86-ad1d-065c2856e00f", 00:15:14.360 "is_configured": true, 00:15:14.360 "data_offset": 2048, 00:15:14.360 "data_size": 63488 00:15:14.360 }, 00:15:14.360 { 00:15:14.360 "name": "BaseBdev3", 00:15:14.360 "uuid": "66604fbc-01e8-465c-8f31-863a6db6acd0", 00:15:14.360 "is_configured": true, 00:15:14.360 "data_offset": 2048, 00:15:14.360 "data_size": 63488 00:15:14.360 } 00:15:14.360 ] 00:15:14.360 }' 00:15:14.360 17:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:14.360 17:27:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:14.930 17:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:14.930 [2024-07-15 17:27:26.156639] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:14.930 17:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:14.930 17:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:14.930 17:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:14.930 17:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:14.930 17:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:14.930 17:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:14.930 17:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:14.930 17:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:14.930 17:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:14.930 17:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:14.930 17:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:14.930 17:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:15.190 17:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:15.190 "name": "Existed_Raid", 00:15:15.190 "uuid": "6ecee5d6-3a92-4f74-a435-2e08f7028b07", 00:15:15.190 "strip_size_kb": 0, 00:15:15.190 "state": "configuring", 00:15:15.190 "raid_level": "raid1", 00:15:15.190 "superblock": true, 00:15:15.190 "num_base_bdevs": 3, 00:15:15.190 "num_base_bdevs_discovered": 1, 00:15:15.190 "num_base_bdevs_operational": 3, 00:15:15.190 "base_bdevs_list": [ 00:15:15.190 { 00:15:15.190 "name": "BaseBdev1", 00:15:15.190 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:15.190 "is_configured": false, 00:15:15.190 "data_offset": 0, 00:15:15.190 "data_size": 0 00:15:15.190 }, 00:15:15.190 { 00:15:15.190 "name": null, 00:15:15.190 "uuid": "072c3b18-db47-4f86-ad1d-065c2856e00f", 00:15:15.190 "is_configured": false, 00:15:15.190 "data_offset": 2048, 00:15:15.190 "data_size": 
63488 00:15:15.190 }, 00:15:15.190 { 00:15:15.190 "name": "BaseBdev3", 00:15:15.190 "uuid": "66604fbc-01e8-465c-8f31-863a6db6acd0", 00:15:15.190 "is_configured": true, 00:15:15.190 "data_offset": 2048, 00:15:15.190 "data_size": 63488 00:15:15.190 } 00:15:15.190 ] 00:15:15.190 }' 00:15:15.190 17:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:15.190 17:27:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:15.759 17:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:15.759 17:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:16.019 17:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:16.019 17:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:16.019 [2024-07-15 17:27:27.316696] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:16.280 BaseBdev1 00:15:16.280 17:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:16.280 17:27:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:16.280 17:27:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:16.280 17:27:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:16.280 17:27:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:16.280 17:27:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:16.280 17:27:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:16.280 17:27:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:16.540 [ 00:15:16.540 { 00:15:16.540 "name": "BaseBdev1", 00:15:16.540 "aliases": [ 00:15:16.540 "79ea6019-28a0-4974-82c8-3fe3d2cfae02" 00:15:16.540 ], 00:15:16.540 "product_name": "Malloc disk", 00:15:16.540 "block_size": 512, 00:15:16.540 "num_blocks": 65536, 00:15:16.540 "uuid": "79ea6019-28a0-4974-82c8-3fe3d2cfae02", 00:15:16.540 "assigned_rate_limits": { 00:15:16.540 "rw_ios_per_sec": 0, 00:15:16.540 "rw_mbytes_per_sec": 0, 00:15:16.540 "r_mbytes_per_sec": 0, 00:15:16.540 "w_mbytes_per_sec": 0 00:15:16.540 }, 00:15:16.540 "claimed": true, 00:15:16.540 "claim_type": "exclusive_write", 00:15:16.540 "zoned": false, 00:15:16.540 "supported_io_types": { 00:15:16.540 "read": true, 00:15:16.540 "write": true, 00:15:16.540 "unmap": true, 00:15:16.540 "flush": true, 00:15:16.540 "reset": true, 00:15:16.540 "nvme_admin": false, 00:15:16.540 "nvme_io": false, 00:15:16.540 "nvme_io_md": false, 00:15:16.540 "write_zeroes": true, 00:15:16.540 "zcopy": true, 00:15:16.540 "get_zone_info": false, 00:15:16.540 "zone_management": false, 00:15:16.540 "zone_append": false, 00:15:16.540 "compare": false, 00:15:16.540 
"compare_and_write": false, 00:15:16.540 "abort": true, 00:15:16.540 "seek_hole": false, 00:15:16.540 "seek_data": false, 00:15:16.540 "copy": true, 00:15:16.540 "nvme_iov_md": false 00:15:16.540 }, 00:15:16.540 "memory_domains": [ 00:15:16.540 { 00:15:16.540 "dma_device_id": "system", 00:15:16.540 "dma_device_type": 1 00:15:16.540 }, 00:15:16.540 { 00:15:16.540 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:16.540 "dma_device_type": 2 00:15:16.540 } 00:15:16.540 ], 00:15:16.540 "driver_specific": {} 00:15:16.540 } 00:15:16.540 ] 00:15:16.540 17:27:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:16.540 17:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:16.540 17:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:16.540 17:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:16.540 17:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:16.540 17:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:16.540 17:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:16.540 17:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:16.540 17:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:16.540 17:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:16.540 17:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:16.540 17:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:16.540 17:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:16.801 17:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:16.801 "name": "Existed_Raid", 00:15:16.801 "uuid": "6ecee5d6-3a92-4f74-a435-2e08f7028b07", 00:15:16.801 "strip_size_kb": 0, 00:15:16.801 "state": "configuring", 00:15:16.801 "raid_level": "raid1", 00:15:16.801 "superblock": true, 00:15:16.801 "num_base_bdevs": 3, 00:15:16.801 "num_base_bdevs_discovered": 2, 00:15:16.801 "num_base_bdevs_operational": 3, 00:15:16.801 "base_bdevs_list": [ 00:15:16.801 { 00:15:16.801 "name": "BaseBdev1", 00:15:16.801 "uuid": "79ea6019-28a0-4974-82c8-3fe3d2cfae02", 00:15:16.801 "is_configured": true, 00:15:16.801 "data_offset": 2048, 00:15:16.801 "data_size": 63488 00:15:16.801 }, 00:15:16.801 { 00:15:16.801 "name": null, 00:15:16.801 "uuid": "072c3b18-db47-4f86-ad1d-065c2856e00f", 00:15:16.801 "is_configured": false, 00:15:16.801 "data_offset": 2048, 00:15:16.801 "data_size": 63488 00:15:16.801 }, 00:15:16.801 { 00:15:16.801 "name": "BaseBdev3", 00:15:16.801 "uuid": "66604fbc-01e8-465c-8f31-863a6db6acd0", 00:15:16.801 "is_configured": true, 00:15:16.801 "data_offset": 2048, 00:15:16.801 "data_size": 63488 00:15:16.801 } 00:15:16.801 ] 00:15:16.801 }' 00:15:16.801 17:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:16.801 17:27:27 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:15:17.371 17:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:17.371 17:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:17.371 17:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:17.371 17:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:17.632 [2024-07-15 17:27:28.844585] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:17.632 17:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:17.632 17:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:17.632 17:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:17.632 17:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:17.632 17:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:17.632 17:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:17.632 17:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:17.632 17:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:17.632 17:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:17.632 17:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:17.632 17:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:17.632 17:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:17.892 17:27:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:17.892 "name": "Existed_Raid", 00:15:17.892 "uuid": "6ecee5d6-3a92-4f74-a435-2e08f7028b07", 00:15:17.892 "strip_size_kb": 0, 00:15:17.892 "state": "configuring", 00:15:17.892 "raid_level": "raid1", 00:15:17.892 "superblock": true, 00:15:17.892 "num_base_bdevs": 3, 00:15:17.892 "num_base_bdevs_discovered": 1, 00:15:17.892 "num_base_bdevs_operational": 3, 00:15:17.892 "base_bdevs_list": [ 00:15:17.892 { 00:15:17.892 "name": "BaseBdev1", 00:15:17.892 "uuid": "79ea6019-28a0-4974-82c8-3fe3d2cfae02", 00:15:17.892 "is_configured": true, 00:15:17.892 "data_offset": 2048, 00:15:17.892 "data_size": 63488 00:15:17.892 }, 00:15:17.892 { 00:15:17.892 "name": null, 00:15:17.892 "uuid": "072c3b18-db47-4f86-ad1d-065c2856e00f", 00:15:17.892 "is_configured": false, 00:15:17.892 "data_offset": 2048, 00:15:17.892 "data_size": 63488 00:15:17.892 }, 00:15:17.892 { 00:15:17.892 "name": null, 00:15:17.892 "uuid": "66604fbc-01e8-465c-8f31-863a6db6acd0", 00:15:17.892 "is_configured": false, 00:15:17.892 "data_offset": 2048, 00:15:17.892 "data_size": 63488 00:15:17.892 } 00:15:17.892 ] 00:15:17.892 }' 
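(A minimal sketch of the remove/re-add cycle traced around this point, using only RPCs that appear verbatim in this log: bdev_raid_remove_base_bdev, bdev_raid_add_base_bdev and bdev_raid_get_bdevs. The helper name discovered_count is illustrative; the expected counts follow the Existed_Raid state dumps above and below.)

rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
discovered_count() {
    # How many base bdevs the Existed_Raid volume currently has attached.
    $rpc_py bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").num_base_bdevs_discovered'
}
$rpc_py bdev_raid_remove_base_bdev BaseBdev3             # drop a member; the 3-way raid1 stays "configuring"
[[ $(discovered_count) -eq 1 ]]                          # only BaseBdev1 remains configured at this point
$rpc_py bdev_raid_add_base_bdev Existed_Raid BaseBdev3   # re-attach the base bdev to its slot
[[ $(discovered_count) -eq 2 ]]                          # BaseBdev1 and BaseBdev3 discovered again
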
00:15:17.892 17:27:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:17.892 17:27:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:18.464 17:27:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:18.464 17:27:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:18.724 17:27:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:18.724 17:27:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:18.724 [2024-07-15 17:27:30.007651] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:18.984 17:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:18.984 17:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:18.984 17:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:18.984 17:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:18.984 17:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:18.984 17:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:18.984 17:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:18.985 17:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:18.985 17:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:18.985 17:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:18.985 17:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:18.985 17:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:18.985 17:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:18.985 "name": "Existed_Raid", 00:15:18.985 "uuid": "6ecee5d6-3a92-4f74-a435-2e08f7028b07", 00:15:18.985 "strip_size_kb": 0, 00:15:18.985 "state": "configuring", 00:15:18.985 "raid_level": "raid1", 00:15:18.985 "superblock": true, 00:15:18.985 "num_base_bdevs": 3, 00:15:18.985 "num_base_bdevs_discovered": 2, 00:15:18.985 "num_base_bdevs_operational": 3, 00:15:18.985 "base_bdevs_list": [ 00:15:18.985 { 00:15:18.985 "name": "BaseBdev1", 00:15:18.985 "uuid": "79ea6019-28a0-4974-82c8-3fe3d2cfae02", 00:15:18.985 "is_configured": true, 00:15:18.985 "data_offset": 2048, 00:15:18.985 "data_size": 63488 00:15:18.985 }, 00:15:18.985 { 00:15:18.985 "name": null, 00:15:18.985 "uuid": "072c3b18-db47-4f86-ad1d-065c2856e00f", 00:15:18.985 "is_configured": false, 00:15:18.985 "data_offset": 2048, 00:15:18.985 "data_size": 63488 00:15:18.985 }, 00:15:18.985 { 00:15:18.985 "name": "BaseBdev3", 
00:15:18.985 "uuid": "66604fbc-01e8-465c-8f31-863a6db6acd0", 00:15:18.985 "is_configured": true, 00:15:18.985 "data_offset": 2048, 00:15:18.985 "data_size": 63488 00:15:18.985 } 00:15:18.985 ] 00:15:18.985 }' 00:15:18.985 17:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:18.985 17:27:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:19.925 17:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:19.925 17:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:20.184 17:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:20.184 17:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:20.184 [2024-07-15 17:27:31.479397] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:20.445 17:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:20.445 17:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:20.445 17:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:20.445 17:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:20.445 17:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:20.445 17:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:20.445 17:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:20.445 17:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:20.445 17:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:20.445 17:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:20.445 17:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:20.445 17:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:20.445 17:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:20.445 "name": "Existed_Raid", 00:15:20.445 "uuid": "6ecee5d6-3a92-4f74-a435-2e08f7028b07", 00:15:20.445 "strip_size_kb": 0, 00:15:20.445 "state": "configuring", 00:15:20.445 "raid_level": "raid1", 00:15:20.445 "superblock": true, 00:15:20.445 "num_base_bdevs": 3, 00:15:20.445 "num_base_bdevs_discovered": 1, 00:15:20.445 "num_base_bdevs_operational": 3, 00:15:20.445 "base_bdevs_list": [ 00:15:20.445 { 00:15:20.445 "name": null, 00:15:20.445 "uuid": "79ea6019-28a0-4974-82c8-3fe3d2cfae02", 00:15:20.445 "is_configured": false, 00:15:20.445 "data_offset": 2048, 00:15:20.445 "data_size": 63488 00:15:20.445 }, 00:15:20.445 { 00:15:20.445 "name": null, 00:15:20.445 "uuid": "072c3b18-db47-4f86-ad1d-065c2856e00f", 00:15:20.445 
"is_configured": false, 00:15:20.445 "data_offset": 2048, 00:15:20.445 "data_size": 63488 00:15:20.445 }, 00:15:20.445 { 00:15:20.445 "name": "BaseBdev3", 00:15:20.445 "uuid": "66604fbc-01e8-465c-8f31-863a6db6acd0", 00:15:20.445 "is_configured": true, 00:15:20.445 "data_offset": 2048, 00:15:20.445 "data_size": 63488 00:15:20.445 } 00:15:20.445 ] 00:15:20.445 }' 00:15:20.445 17:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:20.445 17:27:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:21.016 17:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:21.016 17:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:21.275 17:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:21.275 17:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:21.537 [2024-07-15 17:27:32.647927] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:21.537 17:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:21.537 17:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:21.537 17:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:21.537 17:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:21.537 17:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:21.537 17:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:21.537 17:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:21.537 17:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:21.537 17:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:21.537 17:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:21.537 17:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:21.537 17:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:21.797 17:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:21.797 "name": "Existed_Raid", 00:15:21.797 "uuid": "6ecee5d6-3a92-4f74-a435-2e08f7028b07", 00:15:21.797 "strip_size_kb": 0, 00:15:21.797 "state": "configuring", 00:15:21.797 "raid_level": "raid1", 00:15:21.797 "superblock": true, 00:15:21.797 "num_base_bdevs": 3, 00:15:21.797 "num_base_bdevs_discovered": 2, 00:15:21.797 "num_base_bdevs_operational": 3, 00:15:21.797 "base_bdevs_list": [ 00:15:21.797 { 00:15:21.797 "name": null, 00:15:21.797 "uuid": "79ea6019-28a0-4974-82c8-3fe3d2cfae02", 00:15:21.797 "is_configured": false, 
00:15:21.797 "data_offset": 2048, 00:15:21.797 "data_size": 63488 00:15:21.797 }, 00:15:21.797 { 00:15:21.797 "name": "BaseBdev2", 00:15:21.797 "uuid": "072c3b18-db47-4f86-ad1d-065c2856e00f", 00:15:21.797 "is_configured": true, 00:15:21.797 "data_offset": 2048, 00:15:21.797 "data_size": 63488 00:15:21.797 }, 00:15:21.797 { 00:15:21.797 "name": "BaseBdev3", 00:15:21.797 "uuid": "66604fbc-01e8-465c-8f31-863a6db6acd0", 00:15:21.797 "is_configured": true, 00:15:21.797 "data_offset": 2048, 00:15:21.797 "data_size": 63488 00:15:21.797 } 00:15:21.797 ] 00:15:21.797 }' 00:15:21.797 17:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:21.797 17:27:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:22.367 17:27:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:22.367 17:27:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:22.367 17:27:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:22.367 17:27:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:22.367 17:27:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:22.628 17:27:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 79ea6019-28a0-4974-82c8-3fe3d2cfae02 00:15:22.888 [2024-07-15 17:27:33.976311] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:22.888 [2024-07-15 17:27:33.976423] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2012890 00:15:22.888 [2024-07-15 17:27:33.976431] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:22.888 [2024-07-15 17:27:33.976569] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e6e6a0 00:15:22.888 [2024-07-15 17:27:33.976660] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2012890 00:15:22.888 [2024-07-15 17:27:33.976665] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2012890 00:15:22.888 [2024-07-15 17:27:33.976742] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:22.888 NewBaseBdev 00:15:22.888 17:27:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:22.888 17:27:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:15:22.888 17:27:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:22.888 17:27:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:22.888 17:27:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:22.888 17:27:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:22.888 17:27:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:22.888 17:27:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:23.148 [ 00:15:23.148 { 00:15:23.148 "name": "NewBaseBdev", 00:15:23.148 "aliases": [ 00:15:23.148 "79ea6019-28a0-4974-82c8-3fe3d2cfae02" 00:15:23.148 ], 00:15:23.148 "product_name": "Malloc disk", 00:15:23.148 "block_size": 512, 00:15:23.148 "num_blocks": 65536, 00:15:23.148 "uuid": "79ea6019-28a0-4974-82c8-3fe3d2cfae02", 00:15:23.148 "assigned_rate_limits": { 00:15:23.148 "rw_ios_per_sec": 0, 00:15:23.148 "rw_mbytes_per_sec": 0, 00:15:23.148 "r_mbytes_per_sec": 0, 00:15:23.148 "w_mbytes_per_sec": 0 00:15:23.148 }, 00:15:23.148 "claimed": true, 00:15:23.148 "claim_type": "exclusive_write", 00:15:23.148 "zoned": false, 00:15:23.148 "supported_io_types": { 00:15:23.148 "read": true, 00:15:23.148 "write": true, 00:15:23.148 "unmap": true, 00:15:23.148 "flush": true, 00:15:23.148 "reset": true, 00:15:23.148 "nvme_admin": false, 00:15:23.148 "nvme_io": false, 00:15:23.148 "nvme_io_md": false, 00:15:23.148 "write_zeroes": true, 00:15:23.148 "zcopy": true, 00:15:23.148 "get_zone_info": false, 00:15:23.148 "zone_management": false, 00:15:23.148 "zone_append": false, 00:15:23.148 "compare": false, 00:15:23.148 "compare_and_write": false, 00:15:23.148 "abort": true, 00:15:23.148 "seek_hole": false, 00:15:23.148 "seek_data": false, 00:15:23.148 "copy": true, 00:15:23.148 "nvme_iov_md": false 00:15:23.148 }, 00:15:23.148 "memory_domains": [ 00:15:23.148 { 00:15:23.148 "dma_device_id": "system", 00:15:23.148 "dma_device_type": 1 00:15:23.148 }, 00:15:23.148 { 00:15:23.148 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:23.148 "dma_device_type": 2 00:15:23.148 } 00:15:23.148 ], 00:15:23.148 "driver_specific": {} 00:15:23.148 } 00:15:23.148 ] 00:15:23.148 17:27:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:23.148 17:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:15:23.148 17:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:23.148 17:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:23.149 17:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:23.149 17:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:23.149 17:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:23.149 17:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:23.149 17:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:23.149 17:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:23.149 17:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:23.149 17:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:23.149 17:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:23.409 17:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:23.409 "name": "Existed_Raid", 00:15:23.409 "uuid": "6ecee5d6-3a92-4f74-a435-2e08f7028b07", 00:15:23.409 "strip_size_kb": 0, 00:15:23.409 "state": "online", 00:15:23.409 "raid_level": "raid1", 00:15:23.409 "superblock": true, 00:15:23.409 "num_base_bdevs": 3, 00:15:23.409 "num_base_bdevs_discovered": 3, 00:15:23.409 "num_base_bdevs_operational": 3, 00:15:23.409 "base_bdevs_list": [ 00:15:23.409 { 00:15:23.409 "name": "NewBaseBdev", 00:15:23.409 "uuid": "79ea6019-28a0-4974-82c8-3fe3d2cfae02", 00:15:23.409 "is_configured": true, 00:15:23.409 "data_offset": 2048, 00:15:23.409 "data_size": 63488 00:15:23.409 }, 00:15:23.409 { 00:15:23.409 "name": "BaseBdev2", 00:15:23.409 "uuid": "072c3b18-db47-4f86-ad1d-065c2856e00f", 00:15:23.409 "is_configured": true, 00:15:23.409 "data_offset": 2048, 00:15:23.409 "data_size": 63488 00:15:23.409 }, 00:15:23.409 { 00:15:23.409 "name": "BaseBdev3", 00:15:23.409 "uuid": "66604fbc-01e8-465c-8f31-863a6db6acd0", 00:15:23.409 "is_configured": true, 00:15:23.409 "data_offset": 2048, 00:15:23.409 "data_size": 63488 00:15:23.409 } 00:15:23.409 ] 00:15:23.409 }' 00:15:23.409 17:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:23.409 17:27:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:24.348 17:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:24.348 17:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:24.348 17:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:24.348 17:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:24.348 17:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:24.348 17:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:24.348 17:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:24.348 17:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:24.348 [2024-07-15 17:27:35.628760] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:24.608 17:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:24.608 "name": "Existed_Raid", 00:15:24.608 "aliases": [ 00:15:24.608 "6ecee5d6-3a92-4f74-a435-2e08f7028b07" 00:15:24.608 ], 00:15:24.608 "product_name": "Raid Volume", 00:15:24.608 "block_size": 512, 00:15:24.608 "num_blocks": 63488, 00:15:24.608 "uuid": "6ecee5d6-3a92-4f74-a435-2e08f7028b07", 00:15:24.608 "assigned_rate_limits": { 00:15:24.608 "rw_ios_per_sec": 0, 00:15:24.608 "rw_mbytes_per_sec": 0, 00:15:24.608 "r_mbytes_per_sec": 0, 00:15:24.608 "w_mbytes_per_sec": 0 00:15:24.608 }, 00:15:24.608 "claimed": false, 00:15:24.608 "zoned": false, 00:15:24.608 "supported_io_types": { 00:15:24.608 "read": true, 00:15:24.608 "write": true, 00:15:24.608 "unmap": false, 00:15:24.608 "flush": false, 00:15:24.608 "reset": true, 00:15:24.608 "nvme_admin": false, 00:15:24.608 
"nvme_io": false, 00:15:24.608 "nvme_io_md": false, 00:15:24.608 "write_zeroes": true, 00:15:24.608 "zcopy": false, 00:15:24.608 "get_zone_info": false, 00:15:24.608 "zone_management": false, 00:15:24.608 "zone_append": false, 00:15:24.608 "compare": false, 00:15:24.608 "compare_and_write": false, 00:15:24.608 "abort": false, 00:15:24.608 "seek_hole": false, 00:15:24.608 "seek_data": false, 00:15:24.608 "copy": false, 00:15:24.608 "nvme_iov_md": false 00:15:24.608 }, 00:15:24.608 "memory_domains": [ 00:15:24.608 { 00:15:24.608 "dma_device_id": "system", 00:15:24.608 "dma_device_type": 1 00:15:24.608 }, 00:15:24.608 { 00:15:24.608 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:24.608 "dma_device_type": 2 00:15:24.608 }, 00:15:24.608 { 00:15:24.608 "dma_device_id": "system", 00:15:24.608 "dma_device_type": 1 00:15:24.608 }, 00:15:24.608 { 00:15:24.608 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:24.608 "dma_device_type": 2 00:15:24.608 }, 00:15:24.608 { 00:15:24.608 "dma_device_id": "system", 00:15:24.608 "dma_device_type": 1 00:15:24.608 }, 00:15:24.608 { 00:15:24.608 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:24.608 "dma_device_type": 2 00:15:24.608 } 00:15:24.608 ], 00:15:24.608 "driver_specific": { 00:15:24.608 "raid": { 00:15:24.608 "uuid": "6ecee5d6-3a92-4f74-a435-2e08f7028b07", 00:15:24.608 "strip_size_kb": 0, 00:15:24.608 "state": "online", 00:15:24.608 "raid_level": "raid1", 00:15:24.608 "superblock": true, 00:15:24.608 "num_base_bdevs": 3, 00:15:24.608 "num_base_bdevs_discovered": 3, 00:15:24.608 "num_base_bdevs_operational": 3, 00:15:24.608 "base_bdevs_list": [ 00:15:24.608 { 00:15:24.608 "name": "NewBaseBdev", 00:15:24.608 "uuid": "79ea6019-28a0-4974-82c8-3fe3d2cfae02", 00:15:24.608 "is_configured": true, 00:15:24.608 "data_offset": 2048, 00:15:24.608 "data_size": 63488 00:15:24.608 }, 00:15:24.608 { 00:15:24.608 "name": "BaseBdev2", 00:15:24.608 "uuid": "072c3b18-db47-4f86-ad1d-065c2856e00f", 00:15:24.608 "is_configured": true, 00:15:24.608 "data_offset": 2048, 00:15:24.608 "data_size": 63488 00:15:24.608 }, 00:15:24.608 { 00:15:24.608 "name": "BaseBdev3", 00:15:24.608 "uuid": "66604fbc-01e8-465c-8f31-863a6db6acd0", 00:15:24.608 "is_configured": true, 00:15:24.608 "data_offset": 2048, 00:15:24.608 "data_size": 63488 00:15:24.608 } 00:15:24.608 ] 00:15:24.608 } 00:15:24.608 } 00:15:24.608 }' 00:15:24.608 17:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:24.608 17:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:24.608 BaseBdev2 00:15:24.608 BaseBdev3' 00:15:24.608 17:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:24.608 17:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:24.608 17:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:24.608 17:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:24.608 "name": "NewBaseBdev", 00:15:24.608 "aliases": [ 00:15:24.608 "79ea6019-28a0-4974-82c8-3fe3d2cfae02" 00:15:24.608 ], 00:15:24.608 "product_name": "Malloc disk", 00:15:24.608 "block_size": 512, 00:15:24.608 "num_blocks": 65536, 00:15:24.608 "uuid": "79ea6019-28a0-4974-82c8-3fe3d2cfae02", 00:15:24.608 
"assigned_rate_limits": { 00:15:24.608 "rw_ios_per_sec": 0, 00:15:24.608 "rw_mbytes_per_sec": 0, 00:15:24.608 "r_mbytes_per_sec": 0, 00:15:24.608 "w_mbytes_per_sec": 0 00:15:24.608 }, 00:15:24.608 "claimed": true, 00:15:24.608 "claim_type": "exclusive_write", 00:15:24.608 "zoned": false, 00:15:24.608 "supported_io_types": { 00:15:24.608 "read": true, 00:15:24.608 "write": true, 00:15:24.608 "unmap": true, 00:15:24.608 "flush": true, 00:15:24.608 "reset": true, 00:15:24.608 "nvme_admin": false, 00:15:24.608 "nvme_io": false, 00:15:24.608 "nvme_io_md": false, 00:15:24.608 "write_zeroes": true, 00:15:24.608 "zcopy": true, 00:15:24.608 "get_zone_info": false, 00:15:24.608 "zone_management": false, 00:15:24.608 "zone_append": false, 00:15:24.608 "compare": false, 00:15:24.608 "compare_and_write": false, 00:15:24.608 "abort": true, 00:15:24.608 "seek_hole": false, 00:15:24.608 "seek_data": false, 00:15:24.608 "copy": true, 00:15:24.608 "nvme_iov_md": false 00:15:24.608 }, 00:15:24.608 "memory_domains": [ 00:15:24.608 { 00:15:24.608 "dma_device_id": "system", 00:15:24.608 "dma_device_type": 1 00:15:24.608 }, 00:15:24.608 { 00:15:24.608 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:24.608 "dma_device_type": 2 00:15:24.608 } 00:15:24.608 ], 00:15:24.608 "driver_specific": {} 00:15:24.608 }' 00:15:24.608 17:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:24.869 17:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:24.869 17:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:24.869 17:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:24.869 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:24.869 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:24.869 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:24.869 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:24.869 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:24.869 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:25.129 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:25.129 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:25.129 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:25.129 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:25.129 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:25.404 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:25.404 "name": "BaseBdev2", 00:15:25.404 "aliases": [ 00:15:25.404 "072c3b18-db47-4f86-ad1d-065c2856e00f" 00:15:25.404 ], 00:15:25.404 "product_name": "Malloc disk", 00:15:25.404 "block_size": 512, 00:15:25.404 "num_blocks": 65536, 00:15:25.404 "uuid": "072c3b18-db47-4f86-ad1d-065c2856e00f", 00:15:25.404 "assigned_rate_limits": { 00:15:25.404 "rw_ios_per_sec": 0, 00:15:25.404 "rw_mbytes_per_sec": 0, 00:15:25.404 "r_mbytes_per_sec": 0, 
00:15:25.404 "w_mbytes_per_sec": 0 00:15:25.404 }, 00:15:25.404 "claimed": true, 00:15:25.404 "claim_type": "exclusive_write", 00:15:25.404 "zoned": false, 00:15:25.404 "supported_io_types": { 00:15:25.404 "read": true, 00:15:25.404 "write": true, 00:15:25.404 "unmap": true, 00:15:25.404 "flush": true, 00:15:25.404 "reset": true, 00:15:25.404 "nvme_admin": false, 00:15:25.404 "nvme_io": false, 00:15:25.404 "nvme_io_md": false, 00:15:25.404 "write_zeroes": true, 00:15:25.404 "zcopy": true, 00:15:25.404 "get_zone_info": false, 00:15:25.404 "zone_management": false, 00:15:25.404 "zone_append": false, 00:15:25.404 "compare": false, 00:15:25.404 "compare_and_write": false, 00:15:25.404 "abort": true, 00:15:25.404 "seek_hole": false, 00:15:25.404 "seek_data": false, 00:15:25.404 "copy": true, 00:15:25.404 "nvme_iov_md": false 00:15:25.404 }, 00:15:25.404 "memory_domains": [ 00:15:25.404 { 00:15:25.404 "dma_device_id": "system", 00:15:25.404 "dma_device_type": 1 00:15:25.404 }, 00:15:25.404 { 00:15:25.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:25.404 "dma_device_type": 2 00:15:25.404 } 00:15:25.404 ], 00:15:25.404 "driver_specific": {} 00:15:25.404 }' 00:15:25.404 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:25.404 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:25.404 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:25.404 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:25.404 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:25.665 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:25.665 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:25.665 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:25.665 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:25.665 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:25.665 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:25.665 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:25.665 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:25.665 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:25.665 17:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:25.925 17:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:25.925 "name": "BaseBdev3", 00:15:25.925 "aliases": [ 00:15:25.925 "66604fbc-01e8-465c-8f31-863a6db6acd0" 00:15:25.925 ], 00:15:25.925 "product_name": "Malloc disk", 00:15:25.925 "block_size": 512, 00:15:25.925 "num_blocks": 65536, 00:15:25.925 "uuid": "66604fbc-01e8-465c-8f31-863a6db6acd0", 00:15:25.925 "assigned_rate_limits": { 00:15:25.925 "rw_ios_per_sec": 0, 00:15:25.925 "rw_mbytes_per_sec": 0, 00:15:25.925 "r_mbytes_per_sec": 0, 00:15:25.925 "w_mbytes_per_sec": 0 00:15:25.925 }, 00:15:25.925 "claimed": true, 00:15:25.925 "claim_type": "exclusive_write", 
00:15:25.925 "zoned": false, 00:15:25.925 "supported_io_types": { 00:15:25.925 "read": true, 00:15:25.925 "write": true, 00:15:25.925 "unmap": true, 00:15:25.925 "flush": true, 00:15:25.925 "reset": true, 00:15:25.925 "nvme_admin": false, 00:15:25.925 "nvme_io": false, 00:15:25.925 "nvme_io_md": false, 00:15:25.925 "write_zeroes": true, 00:15:25.925 "zcopy": true, 00:15:25.925 "get_zone_info": false, 00:15:25.925 "zone_management": false, 00:15:25.925 "zone_append": false, 00:15:25.925 "compare": false, 00:15:25.925 "compare_and_write": false, 00:15:25.925 "abort": true, 00:15:25.925 "seek_hole": false, 00:15:25.925 "seek_data": false, 00:15:25.925 "copy": true, 00:15:25.925 "nvme_iov_md": false 00:15:25.925 }, 00:15:25.925 "memory_domains": [ 00:15:25.925 { 00:15:25.925 "dma_device_id": "system", 00:15:25.925 "dma_device_type": 1 00:15:25.925 }, 00:15:25.925 { 00:15:25.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:25.925 "dma_device_type": 2 00:15:25.925 } 00:15:25.925 ], 00:15:25.925 "driver_specific": {} 00:15:25.925 }' 00:15:25.925 17:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:25.925 17:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:26.185 17:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:26.185 17:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:26.185 17:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:26.185 17:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:26.185 17:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:26.185 17:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:26.445 17:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:26.445 17:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:26.445 17:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:26.445 17:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:26.445 17:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:27.015 [2024-07-15 17:27:38.171043] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:27.015 [2024-07-15 17:27:38.171065] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:27.015 [2024-07-15 17:27:38.171109] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:27.015 [2024-07-15 17:27:38.171313] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:27.015 [2024-07-15 17:27:38.171320] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2012890 name Existed_Raid, state offline 00:15:27.015 17:27:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2795872 00:15:27.015 17:27:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2795872 ']' 00:15:27.015 17:27:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2795872 00:15:27.015 
17:27:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:15:27.015 17:27:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:27.015 17:27:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2795872 00:15:27.015 17:27:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:27.015 17:27:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:27.015 17:27:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2795872' 00:15:27.015 killing process with pid 2795872 00:15:27.015 17:27:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2795872 00:15:27.015 [2024-07-15 17:27:38.255044] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:27.015 17:27:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2795872 00:15:27.015 [2024-07-15 17:27:38.269838] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:27.276 17:27:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:15:27.276 00:15:27.276 real 0m25.795s 00:15:27.276 user 0m48.516s 00:15:27.276 sys 0m3.664s 00:15:27.276 17:27:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:27.276 17:27:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:27.276 ************************************ 00:15:27.276 END TEST raid_state_function_test_sb 00:15:27.276 ************************************ 00:15:27.276 17:27:38 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:27.276 17:27:38 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:15:27.276 17:27:38 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:15:27.276 17:27:38 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:27.276 17:27:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:27.276 ************************************ 00:15:27.276 START TEST raid_superblock_test 00:15:27.276 ************************************ 00:15:27.276 17:27:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 3 00:15:27.276 17:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:15:27.276 17:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:15:27.276 17:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:15:27.276 17:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:15:27.276 17:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:15:27.276 17:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:15:27.276 17:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:15:27.276 17:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:15:27.276 17:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:15:27.276 17:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:15:27.276 
17:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:15:27.276 17:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:15:27.276 17:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:15:27.276 17:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:15:27.276 17:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:15:27.276 17:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2800888 00:15:27.276 17:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2800888 /var/tmp/spdk-raid.sock 00:15:27.276 17:27:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2800888 ']' 00:15:27.276 17:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:15:27.276 17:27:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:27.276 17:27:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:27.276 17:27:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:27.276 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:27.276 17:27:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:27.276 17:27:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:27.276 [2024-07-15 17:27:38.522036] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:15:27.277 [2024-07-15 17:27:38.522099] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2800888 ] 00:15:27.536 [2024-07-15 17:27:38.617900] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:27.536 [2024-07-15 17:27:38.685360] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:27.536 [2024-07-15 17:27:38.729378] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:27.536 [2024-07-15 17:27:38.729402] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:28.105 17:27:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:28.105 17:27:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:15:28.105 17:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:15:28.105 17:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:28.105 17:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:15:28.105 17:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:15:28.105 17:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:15:28.105 17:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:28.105 17:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:28.105 17:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:28.105 17:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:15:28.364 malloc1 00:15:28.364 17:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:28.624 [2024-07-15 17:27:39.711490] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:28.624 [2024-07-15 17:27:39.711527] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:28.624 [2024-07-15 17:27:39.711539] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18fba20 00:15:28.624 [2024-07-15 17:27:39.711546] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:28.624 [2024-07-15 17:27:39.712852] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:28.624 [2024-07-15 17:27:39.712870] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:28.624 pt1 00:15:28.624 17:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:28.624 17:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:28.624 17:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:15:28.624 17:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:15:28.624 17:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:15:28.624 17:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:28.624 17:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:28.624 17:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:28.624 17:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:15:28.624 malloc2 00:15:28.624 17:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:28.884 [2024-07-15 17:27:40.086747] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:28.884 [2024-07-15 17:27:40.086784] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:28.884 [2024-07-15 17:27:40.086797] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18fc040 00:15:28.884 [2024-07-15 17:27:40.086803] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:28.884 [2024-07-15 17:27:40.088053] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:28.884 [2024-07-15 17:27:40.088072] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:28.884 pt2 00:15:28.884 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:28.884 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:28.884 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:15:28.884 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:15:28.884 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:15:28.884 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:28.884 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:28.884 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:28.884 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:15:29.143 malloc3 00:15:29.143 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:29.403 [2024-07-15 17:27:40.473511] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:29.403 [2024-07-15 17:27:40.473540] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:29.403 [2024-07-15 17:27:40.473549] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18fc540 00:15:29.403 [2024-07-15 17:27:40.473555] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:29.403 [2024-07-15 17:27:40.474762] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:29.403 [2024-07-15 17:27:40.474780] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:29.403 pt3 00:15:29.403 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:29.403 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:29.403 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:15:29.403 [2024-07-15 17:27:40.666006] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:29.403 [2024-07-15 17:27:40.667023] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:29.403 [2024-07-15 17:27:40.667064] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:29.403 [2024-07-15 17:27:40.667179] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1aa8a90 00:15:29.403 [2024-07-15 17:27:40.667186] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:29.403 [2024-07-15 17:27:40.667335] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1aa4c50 00:15:29.403 [2024-07-15 17:27:40.667446] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1aa8a90 00:15:29.403 [2024-07-15 17:27:40.667452] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1aa8a90 00:15:29.403 [2024-07-15 17:27:40.667520] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:29.403 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:15:29.403 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:29.403 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:29.403 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:29.403 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:29.403 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:29.403 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:29.403 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:29.403 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:29.403 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:29.403 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.403 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:29.663 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:29.663 "name": "raid_bdev1", 00:15:29.663 "uuid": "b309ecdc-f6aa-4ebd-a653-6d0a756f025d", 00:15:29.663 "strip_size_kb": 0, 00:15:29.663 "state": "online", 00:15:29.663 "raid_level": "raid1", 00:15:29.663 "superblock": true, 00:15:29.663 "num_base_bdevs": 3, 00:15:29.663 
"num_base_bdevs_discovered": 3, 00:15:29.663 "num_base_bdevs_operational": 3, 00:15:29.663 "base_bdevs_list": [ 00:15:29.663 { 00:15:29.663 "name": "pt1", 00:15:29.663 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:29.663 "is_configured": true, 00:15:29.663 "data_offset": 2048, 00:15:29.663 "data_size": 63488 00:15:29.663 }, 00:15:29.663 { 00:15:29.663 "name": "pt2", 00:15:29.663 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:29.663 "is_configured": true, 00:15:29.663 "data_offset": 2048, 00:15:29.663 "data_size": 63488 00:15:29.663 }, 00:15:29.663 { 00:15:29.663 "name": "pt3", 00:15:29.663 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:29.663 "is_configured": true, 00:15:29.663 "data_offset": 2048, 00:15:29.663 "data_size": 63488 00:15:29.663 } 00:15:29.663 ] 00:15:29.663 }' 00:15:29.663 17:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:29.663 17:27:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:30.232 17:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:15:30.232 17:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:30.232 17:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:30.232 17:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:30.232 17:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:30.232 17:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:30.232 17:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:30.232 17:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:30.510 [2024-07-15 17:27:41.572495] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:30.510 17:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:30.510 "name": "raid_bdev1", 00:15:30.510 "aliases": [ 00:15:30.510 "b309ecdc-f6aa-4ebd-a653-6d0a756f025d" 00:15:30.510 ], 00:15:30.510 "product_name": "Raid Volume", 00:15:30.510 "block_size": 512, 00:15:30.510 "num_blocks": 63488, 00:15:30.510 "uuid": "b309ecdc-f6aa-4ebd-a653-6d0a756f025d", 00:15:30.510 "assigned_rate_limits": { 00:15:30.510 "rw_ios_per_sec": 0, 00:15:30.510 "rw_mbytes_per_sec": 0, 00:15:30.510 "r_mbytes_per_sec": 0, 00:15:30.510 "w_mbytes_per_sec": 0 00:15:30.510 }, 00:15:30.510 "claimed": false, 00:15:30.510 "zoned": false, 00:15:30.510 "supported_io_types": { 00:15:30.510 "read": true, 00:15:30.510 "write": true, 00:15:30.510 "unmap": false, 00:15:30.510 "flush": false, 00:15:30.510 "reset": true, 00:15:30.510 "nvme_admin": false, 00:15:30.510 "nvme_io": false, 00:15:30.510 "nvme_io_md": false, 00:15:30.510 "write_zeroes": true, 00:15:30.510 "zcopy": false, 00:15:30.510 "get_zone_info": false, 00:15:30.510 "zone_management": false, 00:15:30.510 "zone_append": false, 00:15:30.510 "compare": false, 00:15:30.510 "compare_and_write": false, 00:15:30.510 "abort": false, 00:15:30.510 "seek_hole": false, 00:15:30.510 "seek_data": false, 00:15:30.510 "copy": false, 00:15:30.510 "nvme_iov_md": false 00:15:30.510 }, 00:15:30.510 "memory_domains": [ 00:15:30.510 { 00:15:30.510 "dma_device_id": "system", 00:15:30.510 "dma_device_type": 1 00:15:30.510 }, 
00:15:30.511 { 00:15:30.511 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:30.511 "dma_device_type": 2 00:15:30.511 }, 00:15:30.511 { 00:15:30.511 "dma_device_id": "system", 00:15:30.511 "dma_device_type": 1 00:15:30.511 }, 00:15:30.511 { 00:15:30.511 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:30.511 "dma_device_type": 2 00:15:30.511 }, 00:15:30.511 { 00:15:30.511 "dma_device_id": "system", 00:15:30.511 "dma_device_type": 1 00:15:30.511 }, 00:15:30.511 { 00:15:30.511 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:30.511 "dma_device_type": 2 00:15:30.511 } 00:15:30.511 ], 00:15:30.511 "driver_specific": { 00:15:30.511 "raid": { 00:15:30.511 "uuid": "b309ecdc-f6aa-4ebd-a653-6d0a756f025d", 00:15:30.511 "strip_size_kb": 0, 00:15:30.511 "state": "online", 00:15:30.511 "raid_level": "raid1", 00:15:30.511 "superblock": true, 00:15:30.511 "num_base_bdevs": 3, 00:15:30.511 "num_base_bdevs_discovered": 3, 00:15:30.511 "num_base_bdevs_operational": 3, 00:15:30.511 "base_bdevs_list": [ 00:15:30.511 { 00:15:30.511 "name": "pt1", 00:15:30.511 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:30.511 "is_configured": true, 00:15:30.511 "data_offset": 2048, 00:15:30.511 "data_size": 63488 00:15:30.511 }, 00:15:30.511 { 00:15:30.511 "name": "pt2", 00:15:30.511 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:30.511 "is_configured": true, 00:15:30.511 "data_offset": 2048, 00:15:30.511 "data_size": 63488 00:15:30.511 }, 00:15:30.511 { 00:15:30.511 "name": "pt3", 00:15:30.511 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:30.511 "is_configured": true, 00:15:30.511 "data_offset": 2048, 00:15:30.511 "data_size": 63488 00:15:30.511 } 00:15:30.511 ] 00:15:30.511 } 00:15:30.511 } 00:15:30.511 }' 00:15:30.511 17:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:30.511 17:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:30.511 pt2 00:15:30.511 pt3' 00:15:30.511 17:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:30.511 17:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:30.511 17:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:30.795 17:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:30.795 "name": "pt1", 00:15:30.795 "aliases": [ 00:15:30.795 "00000000-0000-0000-0000-000000000001" 00:15:30.795 ], 00:15:30.795 "product_name": "passthru", 00:15:30.795 "block_size": 512, 00:15:30.795 "num_blocks": 65536, 00:15:30.795 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:30.795 "assigned_rate_limits": { 00:15:30.795 "rw_ios_per_sec": 0, 00:15:30.795 "rw_mbytes_per_sec": 0, 00:15:30.795 "r_mbytes_per_sec": 0, 00:15:30.795 "w_mbytes_per_sec": 0 00:15:30.795 }, 00:15:30.795 "claimed": true, 00:15:30.795 "claim_type": "exclusive_write", 00:15:30.795 "zoned": false, 00:15:30.795 "supported_io_types": { 00:15:30.795 "read": true, 00:15:30.795 "write": true, 00:15:30.795 "unmap": true, 00:15:30.795 "flush": true, 00:15:30.795 "reset": true, 00:15:30.795 "nvme_admin": false, 00:15:30.795 "nvme_io": false, 00:15:30.795 "nvme_io_md": false, 00:15:30.795 "write_zeroes": true, 00:15:30.795 "zcopy": true, 00:15:30.795 "get_zone_info": false, 00:15:30.795 "zone_management": false, 00:15:30.795 
"zone_append": false, 00:15:30.795 "compare": false, 00:15:30.795 "compare_and_write": false, 00:15:30.795 "abort": true, 00:15:30.795 "seek_hole": false, 00:15:30.795 "seek_data": false, 00:15:30.795 "copy": true, 00:15:30.795 "nvme_iov_md": false 00:15:30.795 }, 00:15:30.795 "memory_domains": [ 00:15:30.795 { 00:15:30.795 "dma_device_id": "system", 00:15:30.795 "dma_device_type": 1 00:15:30.795 }, 00:15:30.795 { 00:15:30.795 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:30.795 "dma_device_type": 2 00:15:30.795 } 00:15:30.795 ], 00:15:30.795 "driver_specific": { 00:15:30.795 "passthru": { 00:15:30.795 "name": "pt1", 00:15:30.795 "base_bdev_name": "malloc1" 00:15:30.795 } 00:15:30.795 } 00:15:30.795 }' 00:15:30.795 17:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:30.795 17:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:30.795 17:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:30.795 17:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:30.795 17:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:30.795 17:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:30.795 17:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:30.795 17:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:31.054 17:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:31.054 17:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:31.054 17:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:31.054 17:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:31.054 17:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:31.054 17:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:31.054 17:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:31.312 17:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:31.312 "name": "pt2", 00:15:31.312 "aliases": [ 00:15:31.312 "00000000-0000-0000-0000-000000000002" 00:15:31.312 ], 00:15:31.312 "product_name": "passthru", 00:15:31.312 "block_size": 512, 00:15:31.312 "num_blocks": 65536, 00:15:31.312 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:31.312 "assigned_rate_limits": { 00:15:31.312 "rw_ios_per_sec": 0, 00:15:31.312 "rw_mbytes_per_sec": 0, 00:15:31.312 "r_mbytes_per_sec": 0, 00:15:31.312 "w_mbytes_per_sec": 0 00:15:31.312 }, 00:15:31.312 "claimed": true, 00:15:31.312 "claim_type": "exclusive_write", 00:15:31.312 "zoned": false, 00:15:31.312 "supported_io_types": { 00:15:31.312 "read": true, 00:15:31.312 "write": true, 00:15:31.312 "unmap": true, 00:15:31.312 "flush": true, 00:15:31.312 "reset": true, 00:15:31.312 "nvme_admin": false, 00:15:31.312 "nvme_io": false, 00:15:31.312 "nvme_io_md": false, 00:15:31.312 "write_zeroes": true, 00:15:31.312 "zcopy": true, 00:15:31.312 "get_zone_info": false, 00:15:31.312 "zone_management": false, 00:15:31.312 "zone_append": false, 00:15:31.312 "compare": false, 00:15:31.312 "compare_and_write": false, 00:15:31.312 "abort": true, 00:15:31.312 
"seek_hole": false, 00:15:31.312 "seek_data": false, 00:15:31.312 "copy": true, 00:15:31.312 "nvme_iov_md": false 00:15:31.312 }, 00:15:31.312 "memory_domains": [ 00:15:31.312 { 00:15:31.312 "dma_device_id": "system", 00:15:31.312 "dma_device_type": 1 00:15:31.312 }, 00:15:31.312 { 00:15:31.312 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.312 "dma_device_type": 2 00:15:31.312 } 00:15:31.312 ], 00:15:31.312 "driver_specific": { 00:15:31.312 "passthru": { 00:15:31.312 "name": "pt2", 00:15:31.312 "base_bdev_name": "malloc2" 00:15:31.312 } 00:15:31.312 } 00:15:31.312 }' 00:15:31.312 17:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:31.312 17:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:31.312 17:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:31.312 17:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:31.312 17:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:31.312 17:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:31.312 17:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:31.312 17:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:31.571 17:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:31.571 17:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:31.571 17:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:31.571 17:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:31.571 17:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:31.571 17:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:31.571 17:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:32.137 17:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:32.137 "name": "pt3", 00:15:32.137 "aliases": [ 00:15:32.137 "00000000-0000-0000-0000-000000000003" 00:15:32.137 ], 00:15:32.137 "product_name": "passthru", 00:15:32.137 "block_size": 512, 00:15:32.137 "num_blocks": 65536, 00:15:32.137 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:32.137 "assigned_rate_limits": { 00:15:32.137 "rw_ios_per_sec": 0, 00:15:32.137 "rw_mbytes_per_sec": 0, 00:15:32.137 "r_mbytes_per_sec": 0, 00:15:32.137 "w_mbytes_per_sec": 0 00:15:32.137 }, 00:15:32.137 "claimed": true, 00:15:32.138 "claim_type": "exclusive_write", 00:15:32.138 "zoned": false, 00:15:32.138 "supported_io_types": { 00:15:32.138 "read": true, 00:15:32.138 "write": true, 00:15:32.138 "unmap": true, 00:15:32.138 "flush": true, 00:15:32.138 "reset": true, 00:15:32.138 "nvme_admin": false, 00:15:32.138 "nvme_io": false, 00:15:32.138 "nvme_io_md": false, 00:15:32.138 "write_zeroes": true, 00:15:32.138 "zcopy": true, 00:15:32.138 "get_zone_info": false, 00:15:32.138 "zone_management": false, 00:15:32.138 "zone_append": false, 00:15:32.138 "compare": false, 00:15:32.138 "compare_and_write": false, 00:15:32.138 "abort": true, 00:15:32.138 "seek_hole": false, 00:15:32.138 "seek_data": false, 00:15:32.138 "copy": true, 00:15:32.138 "nvme_iov_md": false 00:15:32.138 }, 
00:15:32.138 "memory_domains": [ 00:15:32.138 { 00:15:32.138 "dma_device_id": "system", 00:15:32.138 "dma_device_type": 1 00:15:32.138 }, 00:15:32.138 { 00:15:32.138 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:32.138 "dma_device_type": 2 00:15:32.138 } 00:15:32.138 ], 00:15:32.138 "driver_specific": { 00:15:32.138 "passthru": { 00:15:32.138 "name": "pt3", 00:15:32.138 "base_bdev_name": "malloc3" 00:15:32.138 } 00:15:32.138 } 00:15:32.138 }' 00:15:32.138 17:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:32.138 17:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:32.138 17:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:32.138 17:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.396 17:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.396 17:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:32.396 17:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:32.396 17:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:32.396 17:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:32.396 17:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:32.396 17:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:32.396 17:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:32.396 17:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:32.396 17:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:15:32.655 [2024-07-15 17:27:43.834227] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:32.655 17:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=b309ecdc-f6aa-4ebd-a653-6d0a756f025d 00:15:32.655 17:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z b309ecdc-f6aa-4ebd-a653-6d0a756f025d ']' 00:15:32.655 17:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:32.913 [2024-07-15 17:27:44.030500] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:32.913 [2024-07-15 17:27:44.030512] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:32.913 [2024-07-15 17:27:44.030545] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:32.913 [2024-07-15 17:27:44.030597] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:32.913 [2024-07-15 17:27:44.030603] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1aa8a90 name raid_bdev1, state offline 00:15:32.913 17:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:15:32.913 17:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:33.173 17:27:44 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@441 -- # raid_bdev= 00:15:33.173 17:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:15:33.173 17:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:33.173 17:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:33.173 17:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:33.173 17:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:33.432 17:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:33.432 17:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:33.692 17:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:15:33.692 17:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:15:33.952 17:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:15:33.952 17:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:33.952 17:27:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:15:33.952 17:27:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:33.952 17:27:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:33.952 17:27:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:33.952 17:27:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:33.952 17:27:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:33.952 17:27:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:33.952 17:27:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:33.952 17:27:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:33.952 17:27:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:15:33.952 17:27:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:33.952 
[2024-07-15 17:27:45.209435] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:15:33.952 [2024-07-15 17:27:45.210494] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:15:33.952 [2024-07-15 17:27:45.210527] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:15:33.952 [2024-07-15 17:27:45.210560] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:15:33.952 [2024-07-15 17:27:45.210585] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:15:33.952 [2024-07-15 17:27:45.210599] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:15:33.952 [2024-07-15 17:27:45.210609] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:33.952 [2024-07-15 17:27:45.210615] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1aa4bf0 name raid_bdev1, state configuring 00:15:33.952 request: 00:15:33.952 { 00:15:33.952 "name": "raid_bdev1", 00:15:33.952 "raid_level": "raid1", 00:15:33.952 "base_bdevs": [ 00:15:33.952 "malloc1", 00:15:33.952 "malloc2", 00:15:33.952 "malloc3" 00:15:33.952 ], 00:15:33.952 "superblock": false, 00:15:33.952 "method": "bdev_raid_create", 00:15:33.952 "req_id": 1 00:15:33.952 } 00:15:33.952 Got JSON-RPC error response 00:15:33.952 response: 00:15:33.952 { 00:15:33.952 "code": -17, 00:15:33.952 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:15:33.952 } 00:15:33.952 17:27:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:15:33.952 17:27:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:15:33.952 17:27:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:15:33.952 17:27:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:15:33.952 17:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:33.952 17:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:15:34.211 17:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:15:34.211 17:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:15:34.211 17:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:34.472 [2024-07-15 17:27:45.622432] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:34.472 [2024-07-15 17:27:45.622456] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:34.472 [2024-07-15 17:27:45.622465] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18fce00 00:15:34.472 [2024-07-15 17:27:45.622472] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:34.472 [2024-07-15 17:27:45.623721] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:34.472 [2024-07-15 17:27:45.623739] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:34.472 
[2024-07-15 17:27:45.623785] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:15:34.472 [2024-07-15 17:27:45.623803] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:34.472 pt1 00:15:34.472 17:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:15:34.472 17:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:34.472 17:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:34.472 17:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:34.472 17:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:34.472 17:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:34.472 17:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:34.472 17:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:34.472 17:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:34.472 17:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:34.472 17:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.472 17:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:34.732 17:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:34.732 "name": "raid_bdev1", 00:15:34.732 "uuid": "b309ecdc-f6aa-4ebd-a653-6d0a756f025d", 00:15:34.732 "strip_size_kb": 0, 00:15:34.732 "state": "configuring", 00:15:34.732 "raid_level": "raid1", 00:15:34.732 "superblock": true, 00:15:34.732 "num_base_bdevs": 3, 00:15:34.732 "num_base_bdevs_discovered": 1, 00:15:34.732 "num_base_bdevs_operational": 3, 00:15:34.732 "base_bdevs_list": [ 00:15:34.732 { 00:15:34.732 "name": "pt1", 00:15:34.732 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:34.732 "is_configured": true, 00:15:34.732 "data_offset": 2048, 00:15:34.732 "data_size": 63488 00:15:34.732 }, 00:15:34.732 { 00:15:34.732 "name": null, 00:15:34.732 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:34.732 "is_configured": false, 00:15:34.732 "data_offset": 2048, 00:15:34.732 "data_size": 63488 00:15:34.732 }, 00:15:34.732 { 00:15:34.732 "name": null, 00:15:34.732 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:34.732 "is_configured": false, 00:15:34.732 "data_offset": 2048, 00:15:34.732 "data_size": 63488 00:15:34.732 } 00:15:34.732 ] 00:15:34.732 }' 00:15:34.732 17:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:34.732 17:27:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:35.302 17:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:15:35.302 17:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:35.302 [2024-07-15 17:27:46.584866] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:35.302 
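A minimal sketch of the RPC sequence the trace above is exercising (not the literal bdev_raid.sh commands), assuming the same RPC socket /var/tmp/spdk-raid.sock and passthru UUIDs used in this run; the raid1 volume assembles itself from the superblocks found on the passthru bdevs as each one is registered:

  # register passthru bdevs over the malloc bdevs; each carries a raid superblock,
  # so the raid module examines it and claims the bdev for raid_bdev1
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
      bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
      bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
  # inspect how many base bdevs have been discovered so far
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
      bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
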
[2024-07-15 17:27:46.584893] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:35.302 [2024-07-15 17:27:46.584904] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18fbc50 00:15:35.302 [2024-07-15 17:27:46.584910] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:35.302 [2024-07-15 17:27:46.585153] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:35.302 [2024-07-15 17:27:46.585162] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:35.302 [2024-07-15 17:27:46.585197] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:35.302 [2024-07-15 17:27:46.585208] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:35.302 pt2 00:15:35.562 17:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:35.562 [2024-07-15 17:27:46.769339] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:15:35.562 17:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:15:35.562 17:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:35.562 17:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:35.562 17:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:35.562 17:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:35.562 17:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:35.562 17:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:35.562 17:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:35.562 17:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:35.562 17:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:35.562 17:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:35.562 17:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:35.821 17:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:35.821 "name": "raid_bdev1", 00:15:35.821 "uuid": "b309ecdc-f6aa-4ebd-a653-6d0a756f025d", 00:15:35.821 "strip_size_kb": 0, 00:15:35.821 "state": "configuring", 00:15:35.821 "raid_level": "raid1", 00:15:35.821 "superblock": true, 00:15:35.821 "num_base_bdevs": 3, 00:15:35.821 "num_base_bdevs_discovered": 1, 00:15:35.821 "num_base_bdevs_operational": 3, 00:15:35.821 "base_bdevs_list": [ 00:15:35.821 { 00:15:35.821 "name": "pt1", 00:15:35.821 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:35.821 "is_configured": true, 00:15:35.821 "data_offset": 2048, 00:15:35.821 "data_size": 63488 00:15:35.821 }, 00:15:35.821 { 00:15:35.821 "name": null, 00:15:35.821 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:35.821 "is_configured": false, 00:15:35.821 "data_offset": 2048, 00:15:35.821 "data_size": 63488 00:15:35.821 }, 00:15:35.821 { 00:15:35.821 
"name": null, 00:15:35.821 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:35.821 "is_configured": false, 00:15:35.821 "data_offset": 2048, 00:15:35.821 "data_size": 63488 00:15:35.821 } 00:15:35.821 ] 00:15:35.821 }' 00:15:35.821 17:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:35.821 17:27:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:36.390 17:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:15:36.390 17:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:36.390 17:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:36.650 [2024-07-15 17:27:47.719745] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:36.650 [2024-07-15 17:27:47.719771] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:36.650 [2024-07-15 17:27:47.719782] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1aaa880 00:15:36.650 [2024-07-15 17:27:47.719788] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:36.650 [2024-07-15 17:27:47.720036] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:36.650 [2024-07-15 17:27:47.720046] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:36.650 [2024-07-15 17:27:47.720085] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:36.650 [2024-07-15 17:27:47.720096] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:36.650 pt2 00:15:36.650 17:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:36.650 17:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:36.650 17:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:36.650 [2024-07-15 17:27:47.904205] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:36.650 [2024-07-15 17:27:47.904221] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:36.650 [2024-07-15 17:27:47.904229] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1aa6920 00:15:36.650 [2024-07-15 17:27:47.904235] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:36.650 [2024-07-15 17:27:47.904446] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:36.650 [2024-07-15 17:27:47.904455] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:36.650 [2024-07-15 17:27:47.904487] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:15:36.650 [2024-07-15 17:27:47.904497] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:36.650 [2024-07-15 17:27:47.904574] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1aa6100 00:15:36.650 [2024-07-15 17:27:47.904585] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:36.650 [2024-07-15 
17:27:47.904727] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1aad620 00:15:36.650 [2024-07-15 17:27:47.904831] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1aa6100 00:15:36.650 [2024-07-15 17:27:47.904836] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1aa6100 00:15:36.650 [2024-07-15 17:27:47.904906] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:36.650 pt3 00:15:36.650 17:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:36.650 17:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:36.650 17:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:15:36.650 17:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:36.650 17:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:36.650 17:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:36.650 17:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:36.650 17:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:36.650 17:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:36.650 17:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:36.650 17:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:36.650 17:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:36.651 17:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:36.651 17:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:36.911 17:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:36.911 "name": "raid_bdev1", 00:15:36.911 "uuid": "b309ecdc-f6aa-4ebd-a653-6d0a756f025d", 00:15:36.911 "strip_size_kb": 0, 00:15:36.911 "state": "online", 00:15:36.911 "raid_level": "raid1", 00:15:36.911 "superblock": true, 00:15:36.911 "num_base_bdevs": 3, 00:15:36.911 "num_base_bdevs_discovered": 3, 00:15:36.911 "num_base_bdevs_operational": 3, 00:15:36.911 "base_bdevs_list": [ 00:15:36.911 { 00:15:36.911 "name": "pt1", 00:15:36.911 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:36.911 "is_configured": true, 00:15:36.911 "data_offset": 2048, 00:15:36.911 "data_size": 63488 00:15:36.911 }, 00:15:36.911 { 00:15:36.911 "name": "pt2", 00:15:36.911 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:36.911 "is_configured": true, 00:15:36.911 "data_offset": 2048, 00:15:36.911 "data_size": 63488 00:15:36.911 }, 00:15:36.911 { 00:15:36.911 "name": "pt3", 00:15:36.911 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:36.911 "is_configured": true, 00:15:36.911 "data_offset": 2048, 00:15:36.911 "data_size": 63488 00:15:36.911 } 00:15:36.911 ] 00:15:36.911 }' 00:15:36.911 17:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:36.911 17:27:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:37.852 17:27:48 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:15:37.852 17:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:37.852 17:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:37.852 17:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:37.852 17:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:37.852 17:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:37.852 17:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:37.852 17:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:38.112 [2024-07-15 17:27:49.372201] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:38.112 17:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:38.112 "name": "raid_bdev1", 00:15:38.112 "aliases": [ 00:15:38.112 "b309ecdc-f6aa-4ebd-a653-6d0a756f025d" 00:15:38.112 ], 00:15:38.112 "product_name": "Raid Volume", 00:15:38.112 "block_size": 512, 00:15:38.112 "num_blocks": 63488, 00:15:38.112 "uuid": "b309ecdc-f6aa-4ebd-a653-6d0a756f025d", 00:15:38.112 "assigned_rate_limits": { 00:15:38.112 "rw_ios_per_sec": 0, 00:15:38.112 "rw_mbytes_per_sec": 0, 00:15:38.112 "r_mbytes_per_sec": 0, 00:15:38.112 "w_mbytes_per_sec": 0 00:15:38.112 }, 00:15:38.112 "claimed": false, 00:15:38.112 "zoned": false, 00:15:38.112 "supported_io_types": { 00:15:38.112 "read": true, 00:15:38.112 "write": true, 00:15:38.112 "unmap": false, 00:15:38.112 "flush": false, 00:15:38.112 "reset": true, 00:15:38.112 "nvme_admin": false, 00:15:38.112 "nvme_io": false, 00:15:38.112 "nvme_io_md": false, 00:15:38.112 "write_zeroes": true, 00:15:38.112 "zcopy": false, 00:15:38.112 "get_zone_info": false, 00:15:38.112 "zone_management": false, 00:15:38.112 "zone_append": false, 00:15:38.112 "compare": false, 00:15:38.112 "compare_and_write": false, 00:15:38.112 "abort": false, 00:15:38.112 "seek_hole": false, 00:15:38.112 "seek_data": false, 00:15:38.112 "copy": false, 00:15:38.112 "nvme_iov_md": false 00:15:38.112 }, 00:15:38.112 "memory_domains": [ 00:15:38.112 { 00:15:38.112 "dma_device_id": "system", 00:15:38.112 "dma_device_type": 1 00:15:38.112 }, 00:15:38.112 { 00:15:38.112 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.112 "dma_device_type": 2 00:15:38.112 }, 00:15:38.112 { 00:15:38.112 "dma_device_id": "system", 00:15:38.112 "dma_device_type": 1 00:15:38.112 }, 00:15:38.112 { 00:15:38.112 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.112 "dma_device_type": 2 00:15:38.112 }, 00:15:38.112 { 00:15:38.112 "dma_device_id": "system", 00:15:38.112 "dma_device_type": 1 00:15:38.112 }, 00:15:38.112 { 00:15:38.112 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.112 "dma_device_type": 2 00:15:38.112 } 00:15:38.112 ], 00:15:38.112 "driver_specific": { 00:15:38.112 "raid": { 00:15:38.112 "uuid": "b309ecdc-f6aa-4ebd-a653-6d0a756f025d", 00:15:38.112 "strip_size_kb": 0, 00:15:38.112 "state": "online", 00:15:38.112 "raid_level": "raid1", 00:15:38.112 "superblock": true, 00:15:38.112 "num_base_bdevs": 3, 00:15:38.112 "num_base_bdevs_discovered": 3, 00:15:38.112 "num_base_bdevs_operational": 3, 00:15:38.112 "base_bdevs_list": [ 00:15:38.112 { 00:15:38.112 
"name": "pt1", 00:15:38.112 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:38.112 "is_configured": true, 00:15:38.112 "data_offset": 2048, 00:15:38.112 "data_size": 63488 00:15:38.112 }, 00:15:38.112 { 00:15:38.112 "name": "pt2", 00:15:38.112 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:38.112 "is_configured": true, 00:15:38.112 "data_offset": 2048, 00:15:38.112 "data_size": 63488 00:15:38.112 }, 00:15:38.112 { 00:15:38.112 "name": "pt3", 00:15:38.112 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:38.112 "is_configured": true, 00:15:38.112 "data_offset": 2048, 00:15:38.112 "data_size": 63488 00:15:38.112 } 00:15:38.112 ] 00:15:38.112 } 00:15:38.112 } 00:15:38.112 }' 00:15:38.112 17:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:38.372 17:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:38.372 pt2 00:15:38.372 pt3' 00:15:38.372 17:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:38.372 17:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:38.372 17:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:38.941 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:38.941 "name": "pt1", 00:15:38.941 "aliases": [ 00:15:38.942 "00000000-0000-0000-0000-000000000001" 00:15:38.942 ], 00:15:38.942 "product_name": "passthru", 00:15:38.942 "block_size": 512, 00:15:38.942 "num_blocks": 65536, 00:15:38.942 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:38.942 "assigned_rate_limits": { 00:15:38.942 "rw_ios_per_sec": 0, 00:15:38.942 "rw_mbytes_per_sec": 0, 00:15:38.942 "r_mbytes_per_sec": 0, 00:15:38.942 "w_mbytes_per_sec": 0 00:15:38.942 }, 00:15:38.942 "claimed": true, 00:15:38.942 "claim_type": "exclusive_write", 00:15:38.942 "zoned": false, 00:15:38.942 "supported_io_types": { 00:15:38.942 "read": true, 00:15:38.942 "write": true, 00:15:38.942 "unmap": true, 00:15:38.942 "flush": true, 00:15:38.942 "reset": true, 00:15:38.942 "nvme_admin": false, 00:15:38.942 "nvme_io": false, 00:15:38.942 "nvme_io_md": false, 00:15:38.942 "write_zeroes": true, 00:15:38.942 "zcopy": true, 00:15:38.942 "get_zone_info": false, 00:15:38.942 "zone_management": false, 00:15:38.942 "zone_append": false, 00:15:38.942 "compare": false, 00:15:38.942 "compare_and_write": false, 00:15:38.942 "abort": true, 00:15:38.942 "seek_hole": false, 00:15:38.942 "seek_data": false, 00:15:38.942 "copy": true, 00:15:38.942 "nvme_iov_md": false 00:15:38.942 }, 00:15:38.942 "memory_domains": [ 00:15:38.942 { 00:15:38.942 "dma_device_id": "system", 00:15:38.942 "dma_device_type": 1 00:15:38.942 }, 00:15:38.942 { 00:15:38.942 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.942 "dma_device_type": 2 00:15:38.942 } 00:15:38.942 ], 00:15:38.942 "driver_specific": { 00:15:38.942 "passthru": { 00:15:38.942 "name": "pt1", 00:15:38.942 "base_bdev_name": "malloc1" 00:15:38.942 } 00:15:38.942 } 00:15:38.942 }' 00:15:38.942 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:38.942 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:38.942 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:38.942 
17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:38.942 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:38.942 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:38.942 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.202 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.202 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:39.202 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.202 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.202 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:39.202 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:39.202 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:39.202 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:39.461 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:39.461 "name": "pt2", 00:15:39.461 "aliases": [ 00:15:39.461 "00000000-0000-0000-0000-000000000002" 00:15:39.461 ], 00:15:39.461 "product_name": "passthru", 00:15:39.461 "block_size": 512, 00:15:39.461 "num_blocks": 65536, 00:15:39.461 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:39.461 "assigned_rate_limits": { 00:15:39.461 "rw_ios_per_sec": 0, 00:15:39.461 "rw_mbytes_per_sec": 0, 00:15:39.461 "r_mbytes_per_sec": 0, 00:15:39.461 "w_mbytes_per_sec": 0 00:15:39.461 }, 00:15:39.461 "claimed": true, 00:15:39.461 "claim_type": "exclusive_write", 00:15:39.461 "zoned": false, 00:15:39.461 "supported_io_types": { 00:15:39.461 "read": true, 00:15:39.461 "write": true, 00:15:39.461 "unmap": true, 00:15:39.461 "flush": true, 00:15:39.461 "reset": true, 00:15:39.461 "nvme_admin": false, 00:15:39.461 "nvme_io": false, 00:15:39.461 "nvme_io_md": false, 00:15:39.461 "write_zeroes": true, 00:15:39.461 "zcopy": true, 00:15:39.461 "get_zone_info": false, 00:15:39.461 "zone_management": false, 00:15:39.461 "zone_append": false, 00:15:39.461 "compare": false, 00:15:39.461 "compare_and_write": false, 00:15:39.461 "abort": true, 00:15:39.461 "seek_hole": false, 00:15:39.461 "seek_data": false, 00:15:39.461 "copy": true, 00:15:39.461 "nvme_iov_md": false 00:15:39.461 }, 00:15:39.461 "memory_domains": [ 00:15:39.461 { 00:15:39.461 "dma_device_id": "system", 00:15:39.461 "dma_device_type": 1 00:15:39.461 }, 00:15:39.461 { 00:15:39.461 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.461 "dma_device_type": 2 00:15:39.461 } 00:15:39.461 ], 00:15:39.461 "driver_specific": { 00:15:39.461 "passthru": { 00:15:39.461 "name": "pt2", 00:15:39.461 "base_bdev_name": "malloc2" 00:15:39.461 } 00:15:39.461 } 00:15:39.461 }' 00:15:39.461 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.461 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.461 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:39.461 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.461 17:27:50 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.461 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:39.461 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.721 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.721 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:39.721 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.721 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.721 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:39.721 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:39.721 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:39.721 17:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:39.981 17:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:39.981 "name": "pt3", 00:15:39.981 "aliases": [ 00:15:39.981 "00000000-0000-0000-0000-000000000003" 00:15:39.981 ], 00:15:39.981 "product_name": "passthru", 00:15:39.981 "block_size": 512, 00:15:39.981 "num_blocks": 65536, 00:15:39.981 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:39.981 "assigned_rate_limits": { 00:15:39.981 "rw_ios_per_sec": 0, 00:15:39.981 "rw_mbytes_per_sec": 0, 00:15:39.981 "r_mbytes_per_sec": 0, 00:15:39.981 "w_mbytes_per_sec": 0 00:15:39.981 }, 00:15:39.981 "claimed": true, 00:15:39.981 "claim_type": "exclusive_write", 00:15:39.981 "zoned": false, 00:15:39.981 "supported_io_types": { 00:15:39.981 "read": true, 00:15:39.981 "write": true, 00:15:39.981 "unmap": true, 00:15:39.981 "flush": true, 00:15:39.981 "reset": true, 00:15:39.981 "nvme_admin": false, 00:15:39.981 "nvme_io": false, 00:15:39.981 "nvme_io_md": false, 00:15:39.981 "write_zeroes": true, 00:15:39.981 "zcopy": true, 00:15:39.981 "get_zone_info": false, 00:15:39.981 "zone_management": false, 00:15:39.981 "zone_append": false, 00:15:39.981 "compare": false, 00:15:39.981 "compare_and_write": false, 00:15:39.981 "abort": true, 00:15:39.981 "seek_hole": false, 00:15:39.981 "seek_data": false, 00:15:39.981 "copy": true, 00:15:39.981 "nvme_iov_md": false 00:15:39.981 }, 00:15:39.981 "memory_domains": [ 00:15:39.981 { 00:15:39.981 "dma_device_id": "system", 00:15:39.981 "dma_device_type": 1 00:15:39.981 }, 00:15:39.981 { 00:15:39.981 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.981 "dma_device_type": 2 00:15:39.981 } 00:15:39.981 ], 00:15:39.981 "driver_specific": { 00:15:39.981 "passthru": { 00:15:39.981 "name": "pt3", 00:15:39.981 "base_bdev_name": "malloc3" 00:15:39.981 } 00:15:39.981 } 00:15:39.981 }' 00:15:39.981 17:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.981 17:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.981 17:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:39.981 17:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.981 17:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:40.242 17:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == 
null ]] 00:15:40.242 17:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:40.242 17:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:40.242 17:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:40.242 17:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:40.242 17:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:40.242 17:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:40.242 17:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:40.242 17:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:15:40.811 [2024-07-15 17:27:51.994832] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:40.811 17:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' b309ecdc-f6aa-4ebd-a653-6d0a756f025d '!=' b309ecdc-f6aa-4ebd-a653-6d0a756f025d ']' 00:15:40.811 17:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:15:40.811 17:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:40.811 17:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:15:40.811 17:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:41.379 [2024-07-15 17:27:52.540007] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:15:41.379 17:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:15:41.379 17:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:41.379 17:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:41.379 17:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:41.379 17:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:41.379 17:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:41.379 17:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:41.379 17:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:41.379 17:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:41.379 17:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:41.379 17:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:41.379 17:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:41.639 17:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:41.639 "name": "raid_bdev1", 00:15:41.639 "uuid": "b309ecdc-f6aa-4ebd-a653-6d0a756f025d", 00:15:41.639 "strip_size_kb": 0, 00:15:41.639 "state": "online", 00:15:41.639 "raid_level": "raid1", 00:15:41.639 "superblock": true, 
00:15:41.639 "num_base_bdevs": 3, 00:15:41.639 "num_base_bdevs_discovered": 2, 00:15:41.639 "num_base_bdevs_operational": 2, 00:15:41.639 "base_bdevs_list": [ 00:15:41.639 { 00:15:41.639 "name": null, 00:15:41.639 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:41.639 "is_configured": false, 00:15:41.639 "data_offset": 2048, 00:15:41.639 "data_size": 63488 00:15:41.639 }, 00:15:41.639 { 00:15:41.639 "name": "pt2", 00:15:41.639 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:41.639 "is_configured": true, 00:15:41.639 "data_offset": 2048, 00:15:41.639 "data_size": 63488 00:15:41.639 }, 00:15:41.639 { 00:15:41.639 "name": "pt3", 00:15:41.639 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:41.639 "is_configured": true, 00:15:41.639 "data_offset": 2048, 00:15:41.639 "data_size": 63488 00:15:41.639 } 00:15:41.639 ] 00:15:41.639 }' 00:15:41.639 17:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:41.639 17:27:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:42.209 17:27:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:42.779 [2024-07-15 17:27:53.811216] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:42.779 [2024-07-15 17:27:53.811231] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:42.779 [2024-07-15 17:27:53.811264] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:42.779 [2024-07-15 17:27:53.811302] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:42.779 [2024-07-15 17:27:53.811307] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1aa6100 name raid_bdev1, state offline 00:15:42.779 17:27:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:42.779 17:27:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:15:42.779 17:27:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:15:42.779 17:27:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:15:42.779 17:27:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:15:42.779 17:27:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:15:42.779 17:27:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:43.350 17:27:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:15:43.350 17:27:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:15:43.350 17:27:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:43.921 17:27:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:15:43.921 17:27:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:15:43.921 17:27:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:15:43.921 17:27:55 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:15:43.921 17:27:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:44.182 [2024-07-15 17:27:55.290876] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:44.182 [2024-07-15 17:27:55.290904] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:44.182 [2024-07-15 17:27:55.290914] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1aad420 00:15:44.182 [2024-07-15 17:27:55.290921] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:44.182 [2024-07-15 17:27:55.292192] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:44.182 [2024-07-15 17:27:55.292212] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:44.182 [2024-07-15 17:27:55.292255] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:44.182 [2024-07-15 17:27:55.292274] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:44.182 pt2 00:15:44.182 17:27:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:15:44.182 17:27:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:44.182 17:27:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:44.182 17:27:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:44.182 17:27:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:44.182 17:27:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:44.182 17:27:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:44.182 17:27:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:44.182 17:27:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:44.182 17:27:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:44.182 17:27:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:44.182 17:27:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:44.442 17:27:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:44.442 "name": "raid_bdev1", 00:15:44.442 "uuid": "b309ecdc-f6aa-4ebd-a653-6d0a756f025d", 00:15:44.442 "strip_size_kb": 0, 00:15:44.442 "state": "configuring", 00:15:44.442 "raid_level": "raid1", 00:15:44.442 "superblock": true, 00:15:44.442 "num_base_bdevs": 3, 00:15:44.442 "num_base_bdevs_discovered": 1, 00:15:44.442 "num_base_bdevs_operational": 2, 00:15:44.442 "base_bdevs_list": [ 00:15:44.442 { 00:15:44.442 "name": null, 00:15:44.442 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:44.442 "is_configured": false, 00:15:44.442 "data_offset": 2048, 00:15:44.442 "data_size": 63488 00:15:44.442 }, 00:15:44.442 { 00:15:44.442 "name": "pt2", 00:15:44.442 "uuid": "00000000-0000-0000-0000-000000000002", 
00:15:44.442 "is_configured": true, 00:15:44.442 "data_offset": 2048, 00:15:44.442 "data_size": 63488 00:15:44.442 }, 00:15:44.442 { 00:15:44.442 "name": null, 00:15:44.442 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:44.442 "is_configured": false, 00:15:44.442 "data_offset": 2048, 00:15:44.442 "data_size": 63488 00:15:44.442 } 00:15:44.442 ] 00:15:44.442 }' 00:15:44.442 17:27:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:44.442 17:27:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:45.012 17:27:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:15:45.012 17:27:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:15:45.012 17:27:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=2 00:15:45.012 17:27:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:45.012 [2024-07-15 17:27:56.241287] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:45.012 [2024-07-15 17:27:56.241316] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:45.012 [2024-07-15 17:27:56.241325] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1aa5aa0 00:15:45.012 [2024-07-15 17:27:56.241331] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:45.012 [2024-07-15 17:27:56.241588] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:45.012 [2024-07-15 17:27:56.241598] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:45.012 [2024-07-15 17:27:56.241638] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:15:45.012 [2024-07-15 17:27:56.241650] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:45.012 [2024-07-15 17:27:56.241730] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18fa360 00:15:45.012 [2024-07-15 17:27:56.241737] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:45.012 [2024-07-15 17:27:56.241870] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1aacb60 00:15:45.012 [2024-07-15 17:27:56.241971] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18fa360 00:15:45.012 [2024-07-15 17:27:56.241976] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18fa360 00:15:45.012 [2024-07-15 17:27:56.242048] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:45.012 pt3 00:15:45.012 17:27:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:15:45.012 17:27:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:45.012 17:27:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:45.012 17:27:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:45.012 17:27:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:45.012 17:27:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:45.012 17:27:56 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:45.012 17:27:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:45.012 17:27:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:45.012 17:27:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:45.012 17:27:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:45.012 17:27:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:45.583 17:27:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:45.583 "name": "raid_bdev1", 00:15:45.583 "uuid": "b309ecdc-f6aa-4ebd-a653-6d0a756f025d", 00:15:45.583 "strip_size_kb": 0, 00:15:45.583 "state": "online", 00:15:45.583 "raid_level": "raid1", 00:15:45.583 "superblock": true, 00:15:45.583 "num_base_bdevs": 3, 00:15:45.583 "num_base_bdevs_discovered": 2, 00:15:45.583 "num_base_bdevs_operational": 2, 00:15:45.583 "base_bdevs_list": [ 00:15:45.583 { 00:15:45.583 "name": null, 00:15:45.583 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:45.583 "is_configured": false, 00:15:45.583 "data_offset": 2048, 00:15:45.583 "data_size": 63488 00:15:45.583 }, 00:15:45.583 { 00:15:45.583 "name": "pt2", 00:15:45.583 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:45.583 "is_configured": true, 00:15:45.583 "data_offset": 2048, 00:15:45.583 "data_size": 63488 00:15:45.583 }, 00:15:45.583 { 00:15:45.583 "name": "pt3", 00:15:45.583 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:45.583 "is_configured": true, 00:15:45.583 "data_offset": 2048, 00:15:45.583 "data_size": 63488 00:15:45.583 } 00:15:45.583 ] 00:15:45.583 }' 00:15:45.583 17:27:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:45.583 17:27:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:46.155 17:27:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:46.155 [2024-07-15 17:27:57.452353] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:46.155 [2024-07-15 17:27:57.452368] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:46.155 [2024-07-15 17:27:57.452401] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:46.155 [2024-07-15 17:27:57.452440] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:46.155 [2024-07-15 17:27:57.452446] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18fa360 name raid_bdev1, state offline 00:15:46.417 17:27:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:46.417 17:27:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:15:46.417 17:27:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:15:46.417 17:27:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:15:46.417 17:27:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 3 -gt 2 ']' 
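The verify_raid_bdev_state checks that follow all use the same pattern; a minimal sketch, assuming the socket path from this run:

  # pull the named raid bdev and compare its fields (state, raid_level, base bdev counts)
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
      bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .state'
  # in this part of the test the state reads "configuring" until enough base bdevs are
  # re-registered, then "online"
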
00:15:46.417 17:27:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=2 00:15:46.417 17:27:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:46.678 17:27:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:46.678 [2024-07-15 17:27:57.917513] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:46.678 [2024-07-15 17:27:57.917540] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:46.678 [2024-07-15 17:27:57.917549] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1aa7330 00:15:46.678 [2024-07-15 17:27:57.917555] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:46.678 [2024-07-15 17:27:57.918821] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:46.678 [2024-07-15 17:27:57.918840] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:46.678 [2024-07-15 17:27:57.918882] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:15:46.678 [2024-07-15 17:27:57.918900] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:46.678 [2024-07-15 17:27:57.918972] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:15:46.678 [2024-07-15 17:27:57.918979] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:46.678 [2024-07-15 17:27:57.918988] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1aaac50 name raid_bdev1, state configuring 00:15:46.678 [2024-07-15 17:27:57.919002] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:46.678 pt1 00:15:46.678 17:27:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 3 -gt 2 ']' 00:15:46.678 17:27:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:15:46.678 17:27:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:46.678 17:27:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:46.678 17:27:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:46.678 17:27:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:46.678 17:27:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:46.678 17:27:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:46.678 17:27:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:46.678 17:27:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:46.678 17:27:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:46.678 17:27:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:46.678 17:27:57 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:46.938 17:27:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:46.938 "name": "raid_bdev1", 00:15:46.938 "uuid": "b309ecdc-f6aa-4ebd-a653-6d0a756f025d", 00:15:46.938 "strip_size_kb": 0, 00:15:46.938 "state": "configuring", 00:15:46.938 "raid_level": "raid1", 00:15:46.938 "superblock": true, 00:15:46.938 "num_base_bdevs": 3, 00:15:46.938 "num_base_bdevs_discovered": 1, 00:15:46.938 "num_base_bdevs_operational": 2, 00:15:46.938 "base_bdevs_list": [ 00:15:46.938 { 00:15:46.938 "name": null, 00:15:46.938 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:46.938 "is_configured": false, 00:15:46.938 "data_offset": 2048, 00:15:46.938 "data_size": 63488 00:15:46.938 }, 00:15:46.938 { 00:15:46.938 "name": "pt2", 00:15:46.938 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:46.938 "is_configured": true, 00:15:46.938 "data_offset": 2048, 00:15:46.938 "data_size": 63488 00:15:46.938 }, 00:15:46.938 { 00:15:46.938 "name": null, 00:15:46.938 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:46.938 "is_configured": false, 00:15:46.938 "data_offset": 2048, 00:15:46.938 "data_size": 63488 00:15:46.938 } 00:15:46.938 ] 00:15:46.938 }' 00:15:46.938 17:27:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:46.938 17:27:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:47.507 17:27:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:15:47.507 17:27:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:15:47.800 17:27:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:15:47.800 17:27:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:47.800 [2024-07-15 17:27:58.984230] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:47.800 [2024-07-15 17:27:58.984257] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:47.800 [2024-07-15 17:27:58.984266] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18fb0f0 00:15:47.800 [2024-07-15 17:27:58.984272] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:47.800 [2024-07-15 17:27:58.984523] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:47.800 [2024-07-15 17:27:58.984534] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:47.800 [2024-07-15 17:27:58.984571] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:15:47.800 [2024-07-15 17:27:58.984584] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:47.800 [2024-07-15 17:27:58.984654] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x198d360 00:15:47.800 [2024-07-15 17:27:58.984660] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:47.800 [2024-07-15 17:27:58.984794] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1aabac0 00:15:47.800 [2024-07-15 17:27:58.984893] 
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x198d360 00:15:47.800 [2024-07-15 17:27:58.984898] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x198d360 00:15:47.800 [2024-07-15 17:27:58.984968] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:47.800 pt3 00:15:47.800 17:27:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:15:47.800 17:27:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:47.800 17:27:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:47.800 17:27:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:47.800 17:27:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:47.800 17:27:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:47.800 17:27:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:47.800 17:27:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:47.800 17:27:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:47.800 17:27:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:47.800 17:27:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:47.800 17:27:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:48.059 17:27:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:48.059 "name": "raid_bdev1", 00:15:48.059 "uuid": "b309ecdc-f6aa-4ebd-a653-6d0a756f025d", 00:15:48.059 "strip_size_kb": 0, 00:15:48.059 "state": "online", 00:15:48.059 "raid_level": "raid1", 00:15:48.059 "superblock": true, 00:15:48.059 "num_base_bdevs": 3, 00:15:48.059 "num_base_bdevs_discovered": 2, 00:15:48.059 "num_base_bdevs_operational": 2, 00:15:48.059 "base_bdevs_list": [ 00:15:48.059 { 00:15:48.059 "name": null, 00:15:48.059 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:48.059 "is_configured": false, 00:15:48.059 "data_offset": 2048, 00:15:48.059 "data_size": 63488 00:15:48.059 }, 00:15:48.059 { 00:15:48.059 "name": "pt2", 00:15:48.059 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:48.059 "is_configured": true, 00:15:48.059 "data_offset": 2048, 00:15:48.059 "data_size": 63488 00:15:48.059 }, 00:15:48.059 { 00:15:48.059 "name": "pt3", 00:15:48.059 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:48.059 "is_configured": true, 00:15:48.059 "data_offset": 2048, 00:15:48.059 "data_size": 63488 00:15:48.059 } 00:15:48.059 ] 00:15:48.059 }' 00:15:48.059 17:27:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:48.059 17:27:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:48.626 17:27:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:15:48.626 17:27:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:15:48.626 17:27:59 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:15:48.626 17:27:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:48.626 17:27:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:15:48.885 [2024-07-15 17:28:00.099373] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:48.885 17:28:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' b309ecdc-f6aa-4ebd-a653-6d0a756f025d '!=' b309ecdc-f6aa-4ebd-a653-6d0a756f025d ']' 00:15:48.885 17:28:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2800888 00:15:48.885 17:28:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2800888 ']' 00:15:48.885 17:28:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2800888 00:15:48.885 17:28:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:15:48.885 17:28:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:48.885 17:28:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2800888 00:15:48.885 17:28:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:48.885 17:28:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:48.885 17:28:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2800888' 00:15:48.885 killing process with pid 2800888 00:15:48.885 17:28:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2800888 00:15:48.885 [2024-07-15 17:28:00.167528] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:48.885 [2024-07-15 17:28:00.167563] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:48.885 [2024-07-15 17:28:00.167602] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:48.885 [2024-07-15 17:28:00.167613] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x198d360 name raid_bdev1, state offline 00:15:48.885 17:28:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2800888 00:15:48.885 [2024-07-15 17:28:00.182485] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:49.146 17:28:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:15:49.146 00:15:49.146 real 0m21.835s 00:15:49.146 user 0m41.079s 00:15:49.146 sys 0m2.963s 00:15:49.146 17:28:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:49.146 17:28:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:49.146 ************************************ 00:15:49.146 END TEST raid_superblock_test 00:15:49.146 ************************************ 00:15:49.146 17:28:00 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:49.146 17:28:00 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:15:49.146 17:28:00 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:49.146 17:28:00 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:49.146 17:28:00 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:49.146 
************************************ 00:15:49.146 START TEST raid_read_error_test 00:15:49.146 ************************************ 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 read 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.mNJkAnrI8T 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2804948 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2804948 /var/tmp/spdk-raid.sock 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2804948 ']' 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:49.146 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:49.146 17:28:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:49.406 [2024-07-15 17:28:00.443875] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:15:49.406 [2024-07-15 17:28:00.443930] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2804948 ] 00:15:49.406 [2024-07-15 17:28:00.531992] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:49.406 [2024-07-15 17:28:00.600193] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:49.406 [2024-07-15 17:28:00.640833] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:49.406 [2024-07-15 17:28:00.640857] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:50.344 17:28:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:50.344 17:28:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:50.344 17:28:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:50.344 17:28:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:50.344 BaseBdev1_malloc 00:15:50.344 17:28:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:50.603 true 00:15:50.603 17:28:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:50.603 [2024-07-15 17:28:01.887607] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:50.603 [2024-07-15 17:28:01.887635] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:50.603 [2024-07-15 17:28:01.887647] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2896b50 00:15:50.604 [2024-07-15 17:28:01.887653] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:50.604 [2024-07-15 17:28:01.888935] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:50.604 [2024-07-15 17:28:01.888954] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:50.604 BaseBdev1 00:15:50.864 17:28:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:50.864 17:28:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:50.864 BaseBdev2_malloc 00:15:50.864 17:28:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:51.125 true 00:15:51.125 17:28:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:51.386 [2024-07-15 17:28:02.462846] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:51.386 [2024-07-15 17:28:02.462876] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:51.386 [2024-07-15 17:28:02.462894] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x287aea0 00:15:51.386 [2024-07-15 17:28:02.462900] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:51.386 [2024-07-15 17:28:02.464092] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:51.386 [2024-07-15 17:28:02.464111] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:51.386 BaseBdev2 00:15:51.386 17:28:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:51.386 17:28:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:51.386 BaseBdev3_malloc 00:15:51.646 17:28:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:51.646 true 00:15:51.646 17:28:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:51.908 [2024-07-15 17:28:03.062166] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:51.908 [2024-07-15 17:28:03.062193] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:51.908 [2024-07-15 17:28:03.062206] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x287efb0 00:15:51.908 [2024-07-15 17:28:03.062212] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:51.908 [2024-07-15 17:28:03.063392] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:51.908 [2024-07-15 17:28:03.063411] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:51.908 BaseBdev3 00:15:51.908 17:28:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:52.170 [2024-07-15 17:28:03.238625] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:52.170 [2024-07-15 17:28:03.239621] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:52.170 [2024-07-15 17:28:03.239672] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:52.170 [2024-07-15 
17:28:03.239834] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x28800e0 00:15:52.170 [2024-07-15 17:28:03.239841] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:52.170 [2024-07-15 17:28:03.239981] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x287d9b0 00:15:52.170 [2024-07-15 17:28:03.240102] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x28800e0 00:15:52.170 [2024-07-15 17:28:03.240107] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x28800e0 00:15:52.170 [2024-07-15 17:28:03.240183] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:52.170 17:28:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:15:52.170 17:28:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:52.170 17:28:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:52.170 17:28:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:52.170 17:28:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:52.170 17:28:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:52.170 17:28:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:52.170 17:28:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:52.170 17:28:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:52.170 17:28:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:52.170 17:28:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:52.170 17:28:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:52.170 17:28:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:52.170 "name": "raid_bdev1", 00:15:52.170 "uuid": "cde7e700-d505-4a28-92dc-c8b00c930792", 00:15:52.170 "strip_size_kb": 0, 00:15:52.170 "state": "online", 00:15:52.170 "raid_level": "raid1", 00:15:52.170 "superblock": true, 00:15:52.170 "num_base_bdevs": 3, 00:15:52.170 "num_base_bdevs_discovered": 3, 00:15:52.170 "num_base_bdevs_operational": 3, 00:15:52.170 "base_bdevs_list": [ 00:15:52.170 { 00:15:52.170 "name": "BaseBdev1", 00:15:52.170 "uuid": "7b1ae645-1cfc-5ad8-ba7a-eeb18f05b719", 00:15:52.170 "is_configured": true, 00:15:52.170 "data_offset": 2048, 00:15:52.171 "data_size": 63488 00:15:52.171 }, 00:15:52.171 { 00:15:52.171 "name": "BaseBdev2", 00:15:52.171 "uuid": "265ee4fb-195e-57aa-b284-aebb9946c303", 00:15:52.171 "is_configured": true, 00:15:52.171 "data_offset": 2048, 00:15:52.171 "data_size": 63488 00:15:52.171 }, 00:15:52.171 { 00:15:52.171 "name": "BaseBdev3", 00:15:52.171 "uuid": "5e9f5dab-9c0f-5d7d-8863-8f5394d37a31", 00:15:52.171 "is_configured": true, 00:15:52.171 "data_offset": 2048, 00:15:52.171 "data_size": 63488 00:15:52.171 } 00:15:52.171 ] 00:15:52.171 }' 00:15:52.171 17:28:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:52.171 17:28:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set 
+x 00:15:52.740 17:28:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:52.740 17:28:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:53.000 [2024-07-15 17:28:04.048920] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28842e0 00:15:53.942 17:28:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:15:53.942 17:28:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:53.942 17:28:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:15:53.942 17:28:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:15:53.942 17:28:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:15:53.942 17:28:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:15:53.942 17:28:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:53.942 17:28:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:53.942 17:28:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:53.942 17:28:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:53.942 17:28:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:53.942 17:28:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:53.942 17:28:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:53.942 17:28:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:53.942 17:28:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:53.942 17:28:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:53.942 17:28:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:54.202 17:28:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:54.202 "name": "raid_bdev1", 00:15:54.202 "uuid": "cde7e700-d505-4a28-92dc-c8b00c930792", 00:15:54.202 "strip_size_kb": 0, 00:15:54.202 "state": "online", 00:15:54.202 "raid_level": "raid1", 00:15:54.202 "superblock": true, 00:15:54.202 "num_base_bdevs": 3, 00:15:54.202 "num_base_bdevs_discovered": 3, 00:15:54.202 "num_base_bdevs_operational": 3, 00:15:54.202 "base_bdevs_list": [ 00:15:54.202 { 00:15:54.202 "name": "BaseBdev1", 00:15:54.202 "uuid": "7b1ae645-1cfc-5ad8-ba7a-eeb18f05b719", 00:15:54.202 "is_configured": true, 00:15:54.202 "data_offset": 2048, 00:15:54.202 "data_size": 63488 00:15:54.202 }, 00:15:54.202 { 00:15:54.202 "name": "BaseBdev2", 00:15:54.202 "uuid": "265ee4fb-195e-57aa-b284-aebb9946c303", 00:15:54.202 "is_configured": true, 00:15:54.202 "data_offset": 2048, 00:15:54.202 "data_size": 63488 00:15:54.202 }, 00:15:54.202 { 00:15:54.202 "name": "BaseBdev3", 00:15:54.202 "uuid": 
"5e9f5dab-9c0f-5d7d-8863-8f5394d37a31", 00:15:54.202 "is_configured": true, 00:15:54.202 "data_offset": 2048, 00:15:54.202 "data_size": 63488 00:15:54.202 } 00:15:54.202 ] 00:15:54.202 }' 00:15:54.202 17:28:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:54.202 17:28:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:54.772 17:28:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:55.033 [2024-07-15 17:28:06.070497] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:55.033 [2024-07-15 17:28:06.070524] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:55.033 [2024-07-15 17:28:06.073146] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:55.033 [2024-07-15 17:28:06.073174] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:55.033 [2024-07-15 17:28:06.073250] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:55.033 [2024-07-15 17:28:06.073257] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x28800e0 name raid_bdev1, state offline 00:15:55.033 0 00:15:55.033 17:28:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2804948 00:15:55.033 17:28:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2804948 ']' 00:15:55.033 17:28:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2804948 00:15:55.033 17:28:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:15:55.033 17:28:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:55.033 17:28:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2804948 00:15:55.033 17:28:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:55.033 17:28:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:55.033 17:28:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2804948' 00:15:55.033 killing process with pid 2804948 00:15:55.033 17:28:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2804948 00:15:55.033 [2024-07-15 17:28:06.141800] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:55.033 17:28:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2804948 00:15:55.033 [2024-07-15 17:28:06.153139] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:55.033 17:28:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.mNJkAnrI8T 00:15:55.033 17:28:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:55.033 17:28:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:55.033 17:28:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:15:55.033 17:28:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:15:55.033 17:28:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:55.033 17:28:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- 
# return 0 00:15:55.033 17:28:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:15:55.033 00:15:55.033 real 0m5.911s 00:15:55.033 user 0m9.423s 00:15:55.033 sys 0m0.842s 00:15:55.033 17:28:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:55.033 17:28:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:55.033 ************************************ 00:15:55.033 END TEST raid_read_error_test 00:15:55.033 ************************************ 00:15:55.033 17:28:06 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:55.033 17:28:06 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:15:55.033 17:28:06 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:55.033 17:28:06 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:55.033 17:28:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:55.294 ************************************ 00:15:55.294 START TEST raid_write_error_test 00:15:55.294 ************************************ 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 write 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.FX34RVekrV 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2806063 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2806063 /var/tmp/spdk-raid.sock 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2806063 ']' 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:55.294 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:55.294 17:28:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:55.294 [2024-07-15 17:28:06.427222] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:15:55.294 [2024-07-15 17:28:06.427271] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2806063 ] 00:15:55.294 [2024-07-15 17:28:06.513845] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:55.294 [2024-07-15 17:28:06.576309] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:55.554 [2024-07-15 17:28:06.615812] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:55.554 [2024-07-15 17:28:06.615838] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:56.124 17:28:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:56.124 17:28:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:56.124 17:28:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:56.124 17:28:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:56.384 BaseBdev1_malloc 00:15:56.384 17:28:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:56.384 true 00:15:56.384 17:28:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:56.645 [2024-07-15 17:28:07.826325] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:56.645 [2024-07-15 17:28:07.826359] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:56.645 [2024-07-15 17:28:07.826371] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b9bb50 00:15:56.645 [2024-07-15 17:28:07.826377] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:56.645 [2024-07-15 17:28:07.827663] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:56.645 [2024-07-15 17:28:07.827684] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:56.645 BaseBdev1 00:15:56.645 17:28:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:56.645 17:28:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:56.906 BaseBdev2_malloc 00:15:56.906 17:28:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:57.166 true 00:15:57.166 17:28:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:57.166 [2024-07-15 17:28:08.401411] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:57.166 [2024-07-15 17:28:08.401438] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:57.166 [2024-07-15 17:28:08.401448] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b7fea0 00:15:57.166 [2024-07-15 17:28:08.401454] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:57.166 [2024-07-15 17:28:08.402599] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:57.166 [2024-07-15 17:28:08.402618] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:57.166 BaseBdev2 00:15:57.166 17:28:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:57.166 17:28:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:57.426 BaseBdev3_malloc 00:15:57.426 17:28:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:57.687 true 00:15:57.687 17:28:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:57.687 [2024-07-15 17:28:08.964354] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:57.687 [2024-07-15 17:28:08.964382] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:57.687 [2024-07-15 17:28:08.964394] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b83fb0 00:15:57.687 [2024-07-15 17:28:08.964400] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:57.687 [2024-07-15 17:28:08.965550] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:57.687 [2024-07-15 17:28:08.965569] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:57.687 BaseBdev3 00:15:57.687 17:28:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:57.946 [2024-07-15 17:28:09.148837] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:57.946 [2024-07-15 17:28:09.149815] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:57.946 [2024-07-15 17:28:09.149866] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:57.946 [2024-07-15 17:28:09.150019] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b850e0 00:15:57.946 [2024-07-15 17:28:09.150027] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:57.946 [2024-07-15 17:28:09.150159] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b829b0 00:15:57.946 [2024-07-15 17:28:09.150276] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b850e0 00:15:57.946 [2024-07-15 17:28:09.150282] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b850e0 00:15:57.946 [2024-07-15 17:28:09.150355] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:57.946 
17:28:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:15:57.946 17:28:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:57.946 17:28:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:57.946 17:28:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:57.946 17:28:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:57.946 17:28:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:57.946 17:28:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:57.946 17:28:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:57.946 17:28:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:57.946 17:28:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:57.946 17:28:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:57.946 17:28:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:58.207 17:28:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:58.207 "name": "raid_bdev1", 00:15:58.207 "uuid": "1be4dc21-f85b-4f13-831d-a655557e7945", 00:15:58.207 "strip_size_kb": 0, 00:15:58.207 "state": "online", 00:15:58.207 "raid_level": "raid1", 00:15:58.207 "superblock": true, 00:15:58.207 "num_base_bdevs": 3, 00:15:58.207 "num_base_bdevs_discovered": 3, 00:15:58.207 "num_base_bdevs_operational": 3, 00:15:58.207 "base_bdevs_list": [ 00:15:58.207 { 00:15:58.207 "name": "BaseBdev1", 00:15:58.207 "uuid": "aa0f3125-144b-5632-b60c-cf35452863c0", 00:15:58.207 "is_configured": true, 00:15:58.207 "data_offset": 2048, 00:15:58.207 "data_size": 63488 00:15:58.207 }, 00:15:58.207 { 00:15:58.207 "name": "BaseBdev2", 00:15:58.207 "uuid": "e21eda8f-84f0-552d-86af-eaba483631eb", 00:15:58.207 "is_configured": true, 00:15:58.207 "data_offset": 2048, 00:15:58.207 "data_size": 63488 00:15:58.207 }, 00:15:58.207 { 00:15:58.207 "name": "BaseBdev3", 00:15:58.207 "uuid": "5b6d2a04-11fc-57f2-8832-a78665de4a1b", 00:15:58.207 "is_configured": true, 00:15:58.207 "data_offset": 2048, 00:15:58.207 "data_size": 63488 00:15:58.207 } 00:15:58.207 ] 00:15:58.207 }' 00:15:58.207 17:28:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:58.207 17:28:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:58.776 17:28:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:58.776 17:28:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:58.776 [2024-07-15 17:28:10.003234] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b892e0 00:15:59.718 17:28:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:15:59.977 [2024-07-15 17:28:11.095315] 
bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:15:59.977 [2024-07-15 17:28:11.095361] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:59.977 [2024-07-15 17:28:11.095536] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1b892e0 00:15:59.977 17:28:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:59.977 17:28:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:15:59.977 17:28:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:15:59.977 17:28:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=2 00:15:59.977 17:28:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:15:59.977 17:28:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:59.977 17:28:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:59.977 17:28:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:59.977 17:28:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:59.977 17:28:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:59.977 17:28:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:59.977 17:28:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:59.977 17:28:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:59.977 17:28:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:59.977 17:28:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:59.977 17:28:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:00.236 17:28:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:00.236 "name": "raid_bdev1", 00:16:00.236 "uuid": "1be4dc21-f85b-4f13-831d-a655557e7945", 00:16:00.236 "strip_size_kb": 0, 00:16:00.236 "state": "online", 00:16:00.236 "raid_level": "raid1", 00:16:00.236 "superblock": true, 00:16:00.236 "num_base_bdevs": 3, 00:16:00.236 "num_base_bdevs_discovered": 2, 00:16:00.236 "num_base_bdevs_operational": 2, 00:16:00.236 "base_bdevs_list": [ 00:16:00.236 { 00:16:00.236 "name": null, 00:16:00.236 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:00.236 "is_configured": false, 00:16:00.236 "data_offset": 2048, 00:16:00.236 "data_size": 63488 00:16:00.236 }, 00:16:00.236 { 00:16:00.236 "name": "BaseBdev2", 00:16:00.236 "uuid": "e21eda8f-84f0-552d-86af-eaba483631eb", 00:16:00.236 "is_configured": true, 00:16:00.236 "data_offset": 2048, 00:16:00.236 "data_size": 63488 00:16:00.236 }, 00:16:00.236 { 00:16:00.236 "name": "BaseBdev3", 00:16:00.236 "uuid": "5b6d2a04-11fc-57f2-8832-a78665de4a1b", 00:16:00.236 "is_configured": true, 00:16:00.236 "data_offset": 2048, 00:16:00.236 "data_size": 63488 00:16:00.236 } 00:16:00.236 ] 00:16:00.236 }' 00:16:00.236 17:28:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:00.236 
17:28:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:00.806 17:28:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:00.806 [2024-07-15 17:28:12.047178] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:00.806 [2024-07-15 17:28:12.047204] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:00.806 [2024-07-15 17:28:12.049761] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:00.806 [2024-07-15 17:28:12.049786] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:00.806 [2024-07-15 17:28:12.049840] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:00.806 [2024-07-15 17:28:12.049846] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b850e0 name raid_bdev1, state offline 00:16:00.806 0 00:16:00.806 17:28:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2806063 00:16:00.806 17:28:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2806063 ']' 00:16:00.806 17:28:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2806063 00:16:00.806 17:28:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:16:00.806 17:28:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:00.806 17:28:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2806063 00:16:01.066 17:28:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:01.066 17:28:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:01.066 17:28:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2806063' 00:16:01.066 killing process with pid 2806063 00:16:01.066 17:28:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2806063 00:16:01.066 [2024-07-15 17:28:12.133037] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:01.066 17:28:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2806063 00:16:01.066 [2024-07-15 17:28:12.144018] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:01.066 17:28:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.FX34RVekrV 00:16:01.066 17:28:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:01.066 17:28:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:01.066 17:28:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:16:01.066 17:28:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:16:01.066 17:28:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:01.066 17:28:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:16:01.066 17:28:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:16:01.066 00:16:01.066 real 0m5.917s 00:16:01.066 user 0m9.474s 00:16:01.066 sys 0m0.799s 00:16:01.066 17:28:12 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:16:01.066 17:28:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:01.066 ************************************ 00:16:01.066 END TEST raid_write_error_test 00:16:01.066 ************************************ 00:16:01.066 17:28:12 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:01.066 17:28:12 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:16:01.066 17:28:12 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:16:01.066 17:28:12 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:16:01.066 17:28:12 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:01.066 17:28:12 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:01.066 17:28:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:01.066 ************************************ 00:16:01.066 START TEST raid_state_function_test 00:16:01.066 ************************************ 00:16:01.066 17:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 false 00:16:01.066 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:16:01.066 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:16:01.066 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:16:01.066 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:01.066 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:01.066 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:01.066 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:01.066 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:01.066 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:01.066 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:01.066 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:01.066 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:01.066 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:01.066 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:01.066 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:01.066 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:16:01.066 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:01.066 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:01.066 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:16:01.066 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:01.066 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:01.066 17:28:12 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:16:01.066 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:01.066 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:01.066 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:16:01.326 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:16:01.326 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:16:01.326 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:16:01.326 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:16:01.326 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2807077 00:16:01.326 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2807077' 00:16:01.326 Process raid pid: 2807077 00:16:01.326 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2807077 /var/tmp/spdk-raid.sock 00:16:01.326 17:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:01.326 17:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2807077 ']' 00:16:01.326 17:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:01.326 17:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:01.326 17:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:01.326 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:01.326 17:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:01.326 17:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:01.326 [2024-07-15 17:28:12.418555] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:16:01.326 [2024-07-15 17:28:12.418614] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:01.326 [2024-07-15 17:28:12.512192] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:01.326 [2024-07-15 17:28:12.586748] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:01.586 [2024-07-15 17:28:12.634135] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:01.586 [2024-07-15 17:28:12.634160] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:02.155 17:28:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:02.155 17:28:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:16:02.155 17:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:02.155 [2024-07-15 17:28:13.438679] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:02.155 [2024-07-15 17:28:13.438714] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:02.155 [2024-07-15 17:28:13.438721] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:02.155 [2024-07-15 17:28:13.438727] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:02.155 [2024-07-15 17:28:13.438731] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:02.155 [2024-07-15 17:28:13.438737] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:02.155 [2024-07-15 17:28:13.438741] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:02.155 [2024-07-15 17:28:13.438747] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:02.414 17:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:02.414 17:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:02.414 17:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:02.414 17:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:02.414 17:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:02.414 17:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:02.414 17:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:02.414 17:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:02.414 17:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:02.414 17:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:02.414 17:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.414 17:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:02.414 17:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:02.414 "name": "Existed_Raid", 00:16:02.414 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:02.414 "strip_size_kb": 64, 00:16:02.414 "state": "configuring", 00:16:02.414 "raid_level": "raid0", 00:16:02.414 "superblock": false, 00:16:02.414 "num_base_bdevs": 4, 00:16:02.414 "num_base_bdevs_discovered": 0, 00:16:02.414 "num_base_bdevs_operational": 4, 00:16:02.414 "base_bdevs_list": [ 00:16:02.414 { 00:16:02.414 "name": "BaseBdev1", 00:16:02.414 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:02.414 "is_configured": false, 00:16:02.414 "data_offset": 0, 00:16:02.414 "data_size": 0 00:16:02.414 }, 00:16:02.414 { 00:16:02.414 "name": "BaseBdev2", 00:16:02.414 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:02.414 "is_configured": false, 00:16:02.414 "data_offset": 0, 00:16:02.414 "data_size": 0 00:16:02.414 }, 00:16:02.414 { 00:16:02.414 "name": "BaseBdev3", 00:16:02.414 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:02.414 "is_configured": false, 00:16:02.414 "data_offset": 0, 00:16:02.414 "data_size": 0 00:16:02.414 }, 00:16:02.414 { 00:16:02.414 "name": "BaseBdev4", 00:16:02.414 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:02.414 "is_configured": false, 00:16:02.414 "data_offset": 0, 00:16:02.414 "data_size": 0 00:16:02.414 } 00:16:02.415 ] 00:16:02.415 }' 00:16:02.415 17:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:02.415 17:28:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:02.986 17:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:03.246 [2024-07-15 17:28:14.380941] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:03.246 [2024-07-15 17:28:14.380958] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbd86f0 name Existed_Raid, state configuring 00:16:03.246 17:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:03.507 [2024-07-15 17:28:14.569443] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:03.507 [2024-07-15 17:28:14.569461] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:03.507 [2024-07-15 17:28:14.569467] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:03.507 [2024-07-15 17:28:14.569472] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:03.507 [2024-07-15 17:28:14.569477] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:03.507 [2024-07-15 17:28:14.569482] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:03.507 [2024-07-15 17:28:14.569487] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:03.507 
[2024-07-15 17:28:14.569492] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:03.507 17:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:03.507 [2024-07-15 17:28:14.764707] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:03.507 BaseBdev1 00:16:03.507 17:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:03.507 17:28:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:03.507 17:28:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:03.507 17:28:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:03.507 17:28:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:03.507 17:28:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:03.507 17:28:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:03.767 17:28:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:04.026 [ 00:16:04.026 { 00:16:04.026 "name": "BaseBdev1", 00:16:04.026 "aliases": [ 00:16:04.026 "92618288-94cc-4aa4-9f07-824414b0f43e" 00:16:04.026 ], 00:16:04.026 "product_name": "Malloc disk", 00:16:04.026 "block_size": 512, 00:16:04.026 "num_blocks": 65536, 00:16:04.026 "uuid": "92618288-94cc-4aa4-9f07-824414b0f43e", 00:16:04.026 "assigned_rate_limits": { 00:16:04.026 "rw_ios_per_sec": 0, 00:16:04.026 "rw_mbytes_per_sec": 0, 00:16:04.026 "r_mbytes_per_sec": 0, 00:16:04.026 "w_mbytes_per_sec": 0 00:16:04.026 }, 00:16:04.026 "claimed": true, 00:16:04.026 "claim_type": "exclusive_write", 00:16:04.026 "zoned": false, 00:16:04.026 "supported_io_types": { 00:16:04.026 "read": true, 00:16:04.026 "write": true, 00:16:04.026 "unmap": true, 00:16:04.026 "flush": true, 00:16:04.026 "reset": true, 00:16:04.026 "nvme_admin": false, 00:16:04.026 "nvme_io": false, 00:16:04.026 "nvme_io_md": false, 00:16:04.026 "write_zeroes": true, 00:16:04.026 "zcopy": true, 00:16:04.026 "get_zone_info": false, 00:16:04.026 "zone_management": false, 00:16:04.026 "zone_append": false, 00:16:04.026 "compare": false, 00:16:04.026 "compare_and_write": false, 00:16:04.026 "abort": true, 00:16:04.026 "seek_hole": false, 00:16:04.026 "seek_data": false, 00:16:04.026 "copy": true, 00:16:04.026 "nvme_iov_md": false 00:16:04.026 }, 00:16:04.026 "memory_domains": [ 00:16:04.026 { 00:16:04.026 "dma_device_id": "system", 00:16:04.026 "dma_device_type": 1 00:16:04.026 }, 00:16:04.026 { 00:16:04.026 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:04.026 "dma_device_type": 2 00:16:04.026 } 00:16:04.026 ], 00:16:04.026 "driver_specific": {} 00:16:04.026 } 00:16:04.026 ] 00:16:04.026 17:28:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:04.026 17:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:04.026 17:28:15 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:04.026 17:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:04.026 17:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:04.026 17:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:04.026 17:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:04.026 17:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:04.026 17:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:04.026 17:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:04.026 17:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:04.026 17:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:04.026 17:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:04.285 17:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:04.285 "name": "Existed_Raid", 00:16:04.285 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:04.285 "strip_size_kb": 64, 00:16:04.285 "state": "configuring", 00:16:04.285 "raid_level": "raid0", 00:16:04.285 "superblock": false, 00:16:04.285 "num_base_bdevs": 4, 00:16:04.285 "num_base_bdevs_discovered": 1, 00:16:04.285 "num_base_bdevs_operational": 4, 00:16:04.285 "base_bdevs_list": [ 00:16:04.285 { 00:16:04.285 "name": "BaseBdev1", 00:16:04.285 "uuid": "92618288-94cc-4aa4-9f07-824414b0f43e", 00:16:04.285 "is_configured": true, 00:16:04.285 "data_offset": 0, 00:16:04.285 "data_size": 65536 00:16:04.285 }, 00:16:04.285 { 00:16:04.285 "name": "BaseBdev2", 00:16:04.285 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:04.285 "is_configured": false, 00:16:04.285 "data_offset": 0, 00:16:04.285 "data_size": 0 00:16:04.285 }, 00:16:04.285 { 00:16:04.285 "name": "BaseBdev3", 00:16:04.285 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:04.285 "is_configured": false, 00:16:04.285 "data_offset": 0, 00:16:04.285 "data_size": 0 00:16:04.285 }, 00:16:04.285 { 00:16:04.285 "name": "BaseBdev4", 00:16:04.285 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:04.285 "is_configured": false, 00:16:04.285 "data_offset": 0, 00:16:04.285 "data_size": 0 00:16:04.285 } 00:16:04.285 ] 00:16:04.285 }' 00:16:04.285 17:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:04.285 17:28:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:04.884 17:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:04.884 [2024-07-15 17:28:16.072015] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:04.884 [2024-07-15 17:28:16.072045] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbd7f60 name Existed_Raid, state configuring 00:16:04.884 17:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:05.143 [2024-07-15 17:28:16.244480] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:05.143 [2024-07-15 17:28:16.245590] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:05.143 [2024-07-15 17:28:16.245614] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:05.143 [2024-07-15 17:28:16.245621] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:05.143 [2024-07-15 17:28:16.245626] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:05.143 [2024-07-15 17:28:16.245631] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:05.143 [2024-07-15 17:28:16.245637] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:05.143 17:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:05.143 17:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:05.143 17:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:05.143 17:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:05.143 17:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:05.143 17:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:05.143 17:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:05.143 17:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:05.143 17:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:05.143 17:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:05.143 17:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:05.143 17:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:05.143 17:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.143 17:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:05.403 17:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:05.403 "name": "Existed_Raid", 00:16:05.403 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:05.403 "strip_size_kb": 64, 00:16:05.403 "state": "configuring", 00:16:05.403 "raid_level": "raid0", 00:16:05.403 "superblock": false, 00:16:05.403 "num_base_bdevs": 4, 00:16:05.403 "num_base_bdevs_discovered": 1, 00:16:05.403 "num_base_bdevs_operational": 4, 00:16:05.403 "base_bdevs_list": [ 00:16:05.403 { 00:16:05.403 "name": "BaseBdev1", 00:16:05.403 "uuid": "92618288-94cc-4aa4-9f07-824414b0f43e", 00:16:05.403 "is_configured": true, 00:16:05.403 "data_offset": 0, 00:16:05.403 "data_size": 65536 00:16:05.403 }, 00:16:05.403 { 00:16:05.403 "name": "BaseBdev2", 00:16:05.403 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:16:05.403 "is_configured": false, 00:16:05.403 "data_offset": 0, 00:16:05.403 "data_size": 0 00:16:05.403 }, 00:16:05.403 { 00:16:05.403 "name": "BaseBdev3", 00:16:05.403 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:05.403 "is_configured": false, 00:16:05.403 "data_offset": 0, 00:16:05.403 "data_size": 0 00:16:05.403 }, 00:16:05.403 { 00:16:05.403 "name": "BaseBdev4", 00:16:05.403 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:05.403 "is_configured": false, 00:16:05.403 "data_offset": 0, 00:16:05.403 "data_size": 0 00:16:05.403 } 00:16:05.403 ] 00:16:05.403 }' 00:16:05.403 17:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:05.403 17:28:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:05.973 17:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:05.973 [2024-07-15 17:28:17.179808] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:05.973 BaseBdev2 00:16:05.973 17:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:05.973 17:28:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:05.973 17:28:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:05.973 17:28:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:05.973 17:28:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:05.973 17:28:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:05.973 17:28:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:06.232 17:28:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:06.492 [ 00:16:06.492 { 00:16:06.492 "name": "BaseBdev2", 00:16:06.492 "aliases": [ 00:16:06.492 "8a61ac7b-ebbc-4489-ae14-0ac7b73f71fa" 00:16:06.492 ], 00:16:06.492 "product_name": "Malloc disk", 00:16:06.492 "block_size": 512, 00:16:06.492 "num_blocks": 65536, 00:16:06.492 "uuid": "8a61ac7b-ebbc-4489-ae14-0ac7b73f71fa", 00:16:06.492 "assigned_rate_limits": { 00:16:06.492 "rw_ios_per_sec": 0, 00:16:06.492 "rw_mbytes_per_sec": 0, 00:16:06.492 "r_mbytes_per_sec": 0, 00:16:06.492 "w_mbytes_per_sec": 0 00:16:06.492 }, 00:16:06.492 "claimed": true, 00:16:06.492 "claim_type": "exclusive_write", 00:16:06.492 "zoned": false, 00:16:06.492 "supported_io_types": { 00:16:06.492 "read": true, 00:16:06.492 "write": true, 00:16:06.492 "unmap": true, 00:16:06.492 "flush": true, 00:16:06.492 "reset": true, 00:16:06.492 "nvme_admin": false, 00:16:06.492 "nvme_io": false, 00:16:06.492 "nvme_io_md": false, 00:16:06.492 "write_zeroes": true, 00:16:06.492 "zcopy": true, 00:16:06.492 "get_zone_info": false, 00:16:06.492 "zone_management": false, 00:16:06.492 "zone_append": false, 00:16:06.492 "compare": false, 00:16:06.492 "compare_and_write": false, 00:16:06.492 "abort": true, 00:16:06.492 "seek_hole": false, 00:16:06.492 "seek_data": false, 00:16:06.492 
"copy": true, 00:16:06.492 "nvme_iov_md": false 00:16:06.492 }, 00:16:06.492 "memory_domains": [ 00:16:06.492 { 00:16:06.492 "dma_device_id": "system", 00:16:06.492 "dma_device_type": 1 00:16:06.492 }, 00:16:06.492 { 00:16:06.492 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.492 "dma_device_type": 2 00:16:06.492 } 00:16:06.492 ], 00:16:06.492 "driver_specific": {} 00:16:06.492 } 00:16:06.492 ] 00:16:06.492 17:28:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:06.492 17:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:06.492 17:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:06.492 17:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:06.492 17:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:06.492 17:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:06.492 17:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:06.492 17:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:06.492 17:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:06.492 17:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:06.492 17:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:06.492 17:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:06.492 17:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:06.492 17:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:06.492 17:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:06.492 17:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:06.492 "name": "Existed_Raid", 00:16:06.492 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:06.492 "strip_size_kb": 64, 00:16:06.492 "state": "configuring", 00:16:06.492 "raid_level": "raid0", 00:16:06.492 "superblock": false, 00:16:06.492 "num_base_bdevs": 4, 00:16:06.492 "num_base_bdevs_discovered": 2, 00:16:06.492 "num_base_bdevs_operational": 4, 00:16:06.492 "base_bdevs_list": [ 00:16:06.492 { 00:16:06.492 "name": "BaseBdev1", 00:16:06.492 "uuid": "92618288-94cc-4aa4-9f07-824414b0f43e", 00:16:06.492 "is_configured": true, 00:16:06.492 "data_offset": 0, 00:16:06.492 "data_size": 65536 00:16:06.492 }, 00:16:06.492 { 00:16:06.492 "name": "BaseBdev2", 00:16:06.492 "uuid": "8a61ac7b-ebbc-4489-ae14-0ac7b73f71fa", 00:16:06.492 "is_configured": true, 00:16:06.492 "data_offset": 0, 00:16:06.492 "data_size": 65536 00:16:06.492 }, 00:16:06.492 { 00:16:06.492 "name": "BaseBdev3", 00:16:06.492 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:06.492 "is_configured": false, 00:16:06.492 "data_offset": 0, 00:16:06.492 "data_size": 0 00:16:06.492 }, 00:16:06.492 { 00:16:06.492 "name": "BaseBdev4", 00:16:06.492 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:06.492 "is_configured": false, 00:16:06.492 
"data_offset": 0, 00:16:06.492 "data_size": 0 00:16:06.492 } 00:16:06.492 ] 00:16:06.492 }' 00:16:06.492 17:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:06.492 17:28:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:07.061 17:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:07.320 [2024-07-15 17:28:18.484158] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:07.320 BaseBdev3 00:16:07.320 17:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:07.320 17:28:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:07.320 17:28:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:07.320 17:28:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:07.320 17:28:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:07.320 17:28:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:07.320 17:28:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:07.604 17:28:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:07.604 [ 00:16:07.604 { 00:16:07.604 "name": "BaseBdev3", 00:16:07.604 "aliases": [ 00:16:07.604 "8f87697a-3e4f-4502-bee5-68605a509aa7" 00:16:07.604 ], 00:16:07.604 "product_name": "Malloc disk", 00:16:07.604 "block_size": 512, 00:16:07.604 "num_blocks": 65536, 00:16:07.604 "uuid": "8f87697a-3e4f-4502-bee5-68605a509aa7", 00:16:07.604 "assigned_rate_limits": { 00:16:07.604 "rw_ios_per_sec": 0, 00:16:07.604 "rw_mbytes_per_sec": 0, 00:16:07.604 "r_mbytes_per_sec": 0, 00:16:07.604 "w_mbytes_per_sec": 0 00:16:07.604 }, 00:16:07.604 "claimed": true, 00:16:07.604 "claim_type": "exclusive_write", 00:16:07.604 "zoned": false, 00:16:07.604 "supported_io_types": { 00:16:07.604 "read": true, 00:16:07.604 "write": true, 00:16:07.604 "unmap": true, 00:16:07.604 "flush": true, 00:16:07.604 "reset": true, 00:16:07.604 "nvme_admin": false, 00:16:07.604 "nvme_io": false, 00:16:07.604 "nvme_io_md": false, 00:16:07.604 "write_zeroes": true, 00:16:07.604 "zcopy": true, 00:16:07.604 "get_zone_info": false, 00:16:07.604 "zone_management": false, 00:16:07.604 "zone_append": false, 00:16:07.604 "compare": false, 00:16:07.604 "compare_and_write": false, 00:16:07.604 "abort": true, 00:16:07.604 "seek_hole": false, 00:16:07.604 "seek_data": false, 00:16:07.604 "copy": true, 00:16:07.604 "nvme_iov_md": false 00:16:07.604 }, 00:16:07.604 "memory_domains": [ 00:16:07.604 { 00:16:07.604 "dma_device_id": "system", 00:16:07.604 "dma_device_type": 1 00:16:07.604 }, 00:16:07.604 { 00:16:07.604 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:07.604 "dma_device_type": 2 00:16:07.604 } 00:16:07.604 ], 00:16:07.604 "driver_specific": {} 00:16:07.604 } 00:16:07.604 ] 00:16:07.604 17:28:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:07.604 17:28:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:07.604 17:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:07.604 17:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:07.604 17:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:07.604 17:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:07.604 17:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:07.604 17:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:07.604 17:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:07.604 17:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:07.604 17:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:07.604 17:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:07.604 17:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:07.604 17:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:07.604 17:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:07.863 17:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:07.863 "name": "Existed_Raid", 00:16:07.863 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:07.864 "strip_size_kb": 64, 00:16:07.864 "state": "configuring", 00:16:07.864 "raid_level": "raid0", 00:16:07.864 "superblock": false, 00:16:07.864 "num_base_bdevs": 4, 00:16:07.864 "num_base_bdevs_discovered": 3, 00:16:07.864 "num_base_bdevs_operational": 4, 00:16:07.864 "base_bdevs_list": [ 00:16:07.864 { 00:16:07.864 "name": "BaseBdev1", 00:16:07.864 "uuid": "92618288-94cc-4aa4-9f07-824414b0f43e", 00:16:07.864 "is_configured": true, 00:16:07.864 "data_offset": 0, 00:16:07.864 "data_size": 65536 00:16:07.864 }, 00:16:07.864 { 00:16:07.864 "name": "BaseBdev2", 00:16:07.864 "uuid": "8a61ac7b-ebbc-4489-ae14-0ac7b73f71fa", 00:16:07.864 "is_configured": true, 00:16:07.864 "data_offset": 0, 00:16:07.864 "data_size": 65536 00:16:07.864 }, 00:16:07.864 { 00:16:07.864 "name": "BaseBdev3", 00:16:07.864 "uuid": "8f87697a-3e4f-4502-bee5-68605a509aa7", 00:16:07.864 "is_configured": true, 00:16:07.864 "data_offset": 0, 00:16:07.864 "data_size": 65536 00:16:07.864 }, 00:16:07.864 { 00:16:07.864 "name": "BaseBdev4", 00:16:07.864 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:07.864 "is_configured": false, 00:16:07.864 "data_offset": 0, 00:16:07.864 "data_size": 0 00:16:07.864 } 00:16:07.864 ] 00:16:07.864 }' 00:16:07.864 17:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:07.864 17:28:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:08.430 17:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:08.688 [2024-07-15 
17:28:19.800482] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:08.688 [2024-07-15 17:28:19.800510] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xbd8fc0 00:16:08.688 [2024-07-15 17:28:19.800514] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:16:08.688 [2024-07-15 17:28:19.800676] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbd8c00 00:16:08.688 [2024-07-15 17:28:19.800780] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbd8fc0 00:16:08.688 [2024-07-15 17:28:19.800786] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xbd8fc0 00:16:08.688 [2024-07-15 17:28:19.800909] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:08.688 BaseBdev4 00:16:08.688 17:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:16:08.688 17:28:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:16:08.689 17:28:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:08.689 17:28:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:08.689 17:28:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:08.689 17:28:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:08.689 17:28:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:08.948 17:28:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:08.948 [ 00:16:08.948 { 00:16:08.948 "name": "BaseBdev4", 00:16:08.948 "aliases": [ 00:16:08.948 "fd90e514-2429-45a9-ba4e-829e22d88d56" 00:16:08.948 ], 00:16:08.948 "product_name": "Malloc disk", 00:16:08.948 "block_size": 512, 00:16:08.948 "num_blocks": 65536, 00:16:08.948 "uuid": "fd90e514-2429-45a9-ba4e-829e22d88d56", 00:16:08.948 "assigned_rate_limits": { 00:16:08.948 "rw_ios_per_sec": 0, 00:16:08.948 "rw_mbytes_per_sec": 0, 00:16:08.948 "r_mbytes_per_sec": 0, 00:16:08.948 "w_mbytes_per_sec": 0 00:16:08.948 }, 00:16:08.948 "claimed": true, 00:16:08.948 "claim_type": "exclusive_write", 00:16:08.948 "zoned": false, 00:16:08.948 "supported_io_types": { 00:16:08.948 "read": true, 00:16:08.948 "write": true, 00:16:08.948 "unmap": true, 00:16:08.948 "flush": true, 00:16:08.948 "reset": true, 00:16:08.948 "nvme_admin": false, 00:16:08.948 "nvme_io": false, 00:16:08.948 "nvme_io_md": false, 00:16:08.948 "write_zeroes": true, 00:16:08.948 "zcopy": true, 00:16:08.948 "get_zone_info": false, 00:16:08.948 "zone_management": false, 00:16:08.948 "zone_append": false, 00:16:08.948 "compare": false, 00:16:08.948 "compare_and_write": false, 00:16:08.948 "abort": true, 00:16:08.948 "seek_hole": false, 00:16:08.948 "seek_data": false, 00:16:08.948 "copy": true, 00:16:08.948 "nvme_iov_md": false 00:16:08.948 }, 00:16:08.948 "memory_domains": [ 00:16:08.948 { 00:16:08.948 "dma_device_id": "system", 00:16:08.948 "dma_device_type": 1 00:16:08.948 }, 00:16:08.948 { 00:16:08.948 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:08.948 "dma_device_type": 2 00:16:08.948 } 
00:16:08.948 ], 00:16:08.948 "driver_specific": {} 00:16:08.948 } 00:16:08.948 ] 00:16:08.948 17:28:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:08.948 17:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:08.948 17:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:08.948 17:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:16:08.948 17:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:08.948 17:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:08.948 17:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:08.948 17:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:08.948 17:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:08.948 17:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:08.948 17:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:08.948 17:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:08.948 17:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:08.948 17:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.948 17:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:09.238 17:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:09.238 "name": "Existed_Raid", 00:16:09.238 "uuid": "9bf6212c-8a58-483c-ae38-08f9bb784d29", 00:16:09.238 "strip_size_kb": 64, 00:16:09.238 "state": "online", 00:16:09.238 "raid_level": "raid0", 00:16:09.238 "superblock": false, 00:16:09.238 "num_base_bdevs": 4, 00:16:09.238 "num_base_bdevs_discovered": 4, 00:16:09.238 "num_base_bdevs_operational": 4, 00:16:09.238 "base_bdevs_list": [ 00:16:09.238 { 00:16:09.238 "name": "BaseBdev1", 00:16:09.238 "uuid": "92618288-94cc-4aa4-9f07-824414b0f43e", 00:16:09.238 "is_configured": true, 00:16:09.238 "data_offset": 0, 00:16:09.238 "data_size": 65536 00:16:09.238 }, 00:16:09.238 { 00:16:09.238 "name": "BaseBdev2", 00:16:09.238 "uuid": "8a61ac7b-ebbc-4489-ae14-0ac7b73f71fa", 00:16:09.238 "is_configured": true, 00:16:09.238 "data_offset": 0, 00:16:09.238 "data_size": 65536 00:16:09.238 }, 00:16:09.238 { 00:16:09.238 "name": "BaseBdev3", 00:16:09.238 "uuid": "8f87697a-3e4f-4502-bee5-68605a509aa7", 00:16:09.238 "is_configured": true, 00:16:09.238 "data_offset": 0, 00:16:09.238 "data_size": 65536 00:16:09.238 }, 00:16:09.238 { 00:16:09.238 "name": "BaseBdev4", 00:16:09.238 "uuid": "fd90e514-2429-45a9-ba4e-829e22d88d56", 00:16:09.238 "is_configured": true, 00:16:09.238 "data_offset": 0, 00:16:09.238 "data_size": 65536 00:16:09.238 } 00:16:09.238 ] 00:16:09.238 }' 00:16:09.238 17:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:09.238 17:28:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:09.807 17:28:20 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:09.807 17:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:09.807 17:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:09.807 17:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:09.807 17:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:09.807 17:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:09.807 17:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:09.807 17:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:10.066 [2024-07-15 17:28:21.120069] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:10.067 17:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:10.067 "name": "Existed_Raid", 00:16:10.067 "aliases": [ 00:16:10.067 "9bf6212c-8a58-483c-ae38-08f9bb784d29" 00:16:10.067 ], 00:16:10.067 "product_name": "Raid Volume", 00:16:10.067 "block_size": 512, 00:16:10.067 "num_blocks": 262144, 00:16:10.067 "uuid": "9bf6212c-8a58-483c-ae38-08f9bb784d29", 00:16:10.067 "assigned_rate_limits": { 00:16:10.067 "rw_ios_per_sec": 0, 00:16:10.067 "rw_mbytes_per_sec": 0, 00:16:10.067 "r_mbytes_per_sec": 0, 00:16:10.067 "w_mbytes_per_sec": 0 00:16:10.067 }, 00:16:10.067 "claimed": false, 00:16:10.067 "zoned": false, 00:16:10.067 "supported_io_types": { 00:16:10.067 "read": true, 00:16:10.067 "write": true, 00:16:10.067 "unmap": true, 00:16:10.067 "flush": true, 00:16:10.067 "reset": true, 00:16:10.067 "nvme_admin": false, 00:16:10.067 "nvme_io": false, 00:16:10.067 "nvme_io_md": false, 00:16:10.067 "write_zeroes": true, 00:16:10.067 "zcopy": false, 00:16:10.067 "get_zone_info": false, 00:16:10.067 "zone_management": false, 00:16:10.067 "zone_append": false, 00:16:10.067 "compare": false, 00:16:10.067 "compare_and_write": false, 00:16:10.067 "abort": false, 00:16:10.067 "seek_hole": false, 00:16:10.067 "seek_data": false, 00:16:10.067 "copy": false, 00:16:10.067 "nvme_iov_md": false 00:16:10.067 }, 00:16:10.067 "memory_domains": [ 00:16:10.067 { 00:16:10.067 "dma_device_id": "system", 00:16:10.067 "dma_device_type": 1 00:16:10.067 }, 00:16:10.067 { 00:16:10.067 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.067 "dma_device_type": 2 00:16:10.067 }, 00:16:10.067 { 00:16:10.067 "dma_device_id": "system", 00:16:10.067 "dma_device_type": 1 00:16:10.067 }, 00:16:10.067 { 00:16:10.067 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.067 "dma_device_type": 2 00:16:10.067 }, 00:16:10.067 { 00:16:10.067 "dma_device_id": "system", 00:16:10.067 "dma_device_type": 1 00:16:10.067 }, 00:16:10.067 { 00:16:10.067 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.067 "dma_device_type": 2 00:16:10.067 }, 00:16:10.067 { 00:16:10.067 "dma_device_id": "system", 00:16:10.067 "dma_device_type": 1 00:16:10.067 }, 00:16:10.067 { 00:16:10.067 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.067 "dma_device_type": 2 00:16:10.067 } 00:16:10.067 ], 00:16:10.067 "driver_specific": { 00:16:10.067 "raid": { 00:16:10.067 "uuid": "9bf6212c-8a58-483c-ae38-08f9bb784d29", 00:16:10.067 "strip_size_kb": 64, 00:16:10.067 
"state": "online", 00:16:10.067 "raid_level": "raid0", 00:16:10.067 "superblock": false, 00:16:10.067 "num_base_bdevs": 4, 00:16:10.067 "num_base_bdevs_discovered": 4, 00:16:10.067 "num_base_bdevs_operational": 4, 00:16:10.067 "base_bdevs_list": [ 00:16:10.067 { 00:16:10.067 "name": "BaseBdev1", 00:16:10.067 "uuid": "92618288-94cc-4aa4-9f07-824414b0f43e", 00:16:10.067 "is_configured": true, 00:16:10.067 "data_offset": 0, 00:16:10.067 "data_size": 65536 00:16:10.067 }, 00:16:10.067 { 00:16:10.067 "name": "BaseBdev2", 00:16:10.067 "uuid": "8a61ac7b-ebbc-4489-ae14-0ac7b73f71fa", 00:16:10.067 "is_configured": true, 00:16:10.067 "data_offset": 0, 00:16:10.067 "data_size": 65536 00:16:10.067 }, 00:16:10.067 { 00:16:10.067 "name": "BaseBdev3", 00:16:10.067 "uuid": "8f87697a-3e4f-4502-bee5-68605a509aa7", 00:16:10.067 "is_configured": true, 00:16:10.067 "data_offset": 0, 00:16:10.067 "data_size": 65536 00:16:10.067 }, 00:16:10.067 { 00:16:10.067 "name": "BaseBdev4", 00:16:10.067 "uuid": "fd90e514-2429-45a9-ba4e-829e22d88d56", 00:16:10.067 "is_configured": true, 00:16:10.067 "data_offset": 0, 00:16:10.067 "data_size": 65536 00:16:10.067 } 00:16:10.067 ] 00:16:10.067 } 00:16:10.067 } 00:16:10.067 }' 00:16:10.067 17:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:10.067 17:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:10.067 BaseBdev2 00:16:10.067 BaseBdev3 00:16:10.067 BaseBdev4' 00:16:10.067 17:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:10.067 17:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:10.067 17:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:10.067 17:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:10.067 "name": "BaseBdev1", 00:16:10.067 "aliases": [ 00:16:10.067 "92618288-94cc-4aa4-9f07-824414b0f43e" 00:16:10.067 ], 00:16:10.067 "product_name": "Malloc disk", 00:16:10.067 "block_size": 512, 00:16:10.067 "num_blocks": 65536, 00:16:10.067 "uuid": "92618288-94cc-4aa4-9f07-824414b0f43e", 00:16:10.067 "assigned_rate_limits": { 00:16:10.067 "rw_ios_per_sec": 0, 00:16:10.067 "rw_mbytes_per_sec": 0, 00:16:10.067 "r_mbytes_per_sec": 0, 00:16:10.067 "w_mbytes_per_sec": 0 00:16:10.067 }, 00:16:10.067 "claimed": true, 00:16:10.067 "claim_type": "exclusive_write", 00:16:10.067 "zoned": false, 00:16:10.067 "supported_io_types": { 00:16:10.067 "read": true, 00:16:10.067 "write": true, 00:16:10.067 "unmap": true, 00:16:10.067 "flush": true, 00:16:10.067 "reset": true, 00:16:10.067 "nvme_admin": false, 00:16:10.067 "nvme_io": false, 00:16:10.067 "nvme_io_md": false, 00:16:10.067 "write_zeroes": true, 00:16:10.067 "zcopy": true, 00:16:10.067 "get_zone_info": false, 00:16:10.067 "zone_management": false, 00:16:10.067 "zone_append": false, 00:16:10.067 "compare": false, 00:16:10.067 "compare_and_write": false, 00:16:10.067 "abort": true, 00:16:10.067 "seek_hole": false, 00:16:10.067 "seek_data": false, 00:16:10.067 "copy": true, 00:16:10.067 "nvme_iov_md": false 00:16:10.067 }, 00:16:10.067 "memory_domains": [ 00:16:10.067 { 00:16:10.067 "dma_device_id": "system", 00:16:10.067 "dma_device_type": 1 00:16:10.067 }, 00:16:10.067 { 00:16:10.067 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.067 "dma_device_type": 2 00:16:10.067 } 00:16:10.067 ], 00:16:10.067 "driver_specific": {} 00:16:10.067 }' 00:16:10.067 17:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:10.326 17:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:10.326 17:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:10.326 17:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:10.326 17:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:10.326 17:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:10.326 17:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:10.326 17:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:10.585 17:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:10.585 17:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:10.585 17:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:10.585 17:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:10.585 17:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:10.585 17:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:10.585 17:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:10.845 17:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:10.845 "name": "BaseBdev2", 00:16:10.845 "aliases": [ 00:16:10.845 "8a61ac7b-ebbc-4489-ae14-0ac7b73f71fa" 00:16:10.845 ], 00:16:10.845 "product_name": "Malloc disk", 00:16:10.845 "block_size": 512, 00:16:10.845 "num_blocks": 65536, 00:16:10.845 "uuid": "8a61ac7b-ebbc-4489-ae14-0ac7b73f71fa", 00:16:10.845 "assigned_rate_limits": { 00:16:10.845 "rw_ios_per_sec": 0, 00:16:10.845 "rw_mbytes_per_sec": 0, 00:16:10.845 "r_mbytes_per_sec": 0, 00:16:10.845 "w_mbytes_per_sec": 0 00:16:10.845 }, 00:16:10.845 "claimed": true, 00:16:10.845 "claim_type": "exclusive_write", 00:16:10.845 "zoned": false, 00:16:10.845 "supported_io_types": { 00:16:10.845 "read": true, 00:16:10.845 "write": true, 00:16:10.845 "unmap": true, 00:16:10.845 "flush": true, 00:16:10.845 "reset": true, 00:16:10.845 "nvme_admin": false, 00:16:10.845 "nvme_io": false, 00:16:10.845 "nvme_io_md": false, 00:16:10.845 "write_zeroes": true, 00:16:10.845 "zcopy": true, 00:16:10.845 "get_zone_info": false, 00:16:10.845 "zone_management": false, 00:16:10.845 "zone_append": false, 00:16:10.845 "compare": false, 00:16:10.845 "compare_and_write": false, 00:16:10.845 "abort": true, 00:16:10.845 "seek_hole": false, 00:16:10.845 "seek_data": false, 00:16:10.845 "copy": true, 00:16:10.845 "nvme_iov_md": false 00:16:10.845 }, 00:16:10.845 "memory_domains": [ 00:16:10.845 { 00:16:10.845 "dma_device_id": "system", 00:16:10.845 "dma_device_type": 1 00:16:10.845 }, 00:16:10.845 { 00:16:10.845 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.845 "dma_device_type": 2 00:16:10.845 } 00:16:10.845 ], 00:16:10.845 "driver_specific": {} 00:16:10.845 }' 00:16:10.845 17:28:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:10.845 17:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:10.845 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:10.845 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:10.845 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:10.845 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:10.845 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:10.845 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:11.105 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:11.105 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:11.105 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:11.105 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:11.105 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:11.105 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:11.105 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:11.366 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:11.366 "name": "BaseBdev3", 00:16:11.366 "aliases": [ 00:16:11.366 "8f87697a-3e4f-4502-bee5-68605a509aa7" 00:16:11.366 ], 00:16:11.366 "product_name": "Malloc disk", 00:16:11.366 "block_size": 512, 00:16:11.366 "num_blocks": 65536, 00:16:11.366 "uuid": "8f87697a-3e4f-4502-bee5-68605a509aa7", 00:16:11.366 "assigned_rate_limits": { 00:16:11.366 "rw_ios_per_sec": 0, 00:16:11.366 "rw_mbytes_per_sec": 0, 00:16:11.366 "r_mbytes_per_sec": 0, 00:16:11.366 "w_mbytes_per_sec": 0 00:16:11.366 }, 00:16:11.366 "claimed": true, 00:16:11.366 "claim_type": "exclusive_write", 00:16:11.366 "zoned": false, 00:16:11.366 "supported_io_types": { 00:16:11.366 "read": true, 00:16:11.366 "write": true, 00:16:11.366 "unmap": true, 00:16:11.366 "flush": true, 00:16:11.366 "reset": true, 00:16:11.366 "nvme_admin": false, 00:16:11.366 "nvme_io": false, 00:16:11.366 "nvme_io_md": false, 00:16:11.366 "write_zeroes": true, 00:16:11.366 "zcopy": true, 00:16:11.366 "get_zone_info": false, 00:16:11.366 "zone_management": false, 00:16:11.366 "zone_append": false, 00:16:11.366 "compare": false, 00:16:11.366 "compare_and_write": false, 00:16:11.366 "abort": true, 00:16:11.366 "seek_hole": false, 00:16:11.366 "seek_data": false, 00:16:11.366 "copy": true, 00:16:11.366 "nvme_iov_md": false 00:16:11.366 }, 00:16:11.366 "memory_domains": [ 00:16:11.366 { 00:16:11.366 "dma_device_id": "system", 00:16:11.366 "dma_device_type": 1 00:16:11.366 }, 00:16:11.366 { 00:16:11.366 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.366 "dma_device_type": 2 00:16:11.366 } 00:16:11.366 ], 00:16:11.366 "driver_specific": {} 00:16:11.366 }' 00:16:11.366 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:11.366 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:16:11.366 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:11.366 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:11.366 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:11.366 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:11.366 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:11.366 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:11.628 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:11.628 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:11.628 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:11.628 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:11.628 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:11.628 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:11.628 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:11.888 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:11.888 "name": "BaseBdev4", 00:16:11.888 "aliases": [ 00:16:11.888 "fd90e514-2429-45a9-ba4e-829e22d88d56" 00:16:11.888 ], 00:16:11.888 "product_name": "Malloc disk", 00:16:11.888 "block_size": 512, 00:16:11.888 "num_blocks": 65536, 00:16:11.888 "uuid": "fd90e514-2429-45a9-ba4e-829e22d88d56", 00:16:11.888 "assigned_rate_limits": { 00:16:11.888 "rw_ios_per_sec": 0, 00:16:11.888 "rw_mbytes_per_sec": 0, 00:16:11.888 "r_mbytes_per_sec": 0, 00:16:11.888 "w_mbytes_per_sec": 0 00:16:11.888 }, 00:16:11.888 "claimed": true, 00:16:11.888 "claim_type": "exclusive_write", 00:16:11.888 "zoned": false, 00:16:11.888 "supported_io_types": { 00:16:11.888 "read": true, 00:16:11.888 "write": true, 00:16:11.888 "unmap": true, 00:16:11.888 "flush": true, 00:16:11.888 "reset": true, 00:16:11.888 "nvme_admin": false, 00:16:11.888 "nvme_io": false, 00:16:11.888 "nvme_io_md": false, 00:16:11.888 "write_zeroes": true, 00:16:11.888 "zcopy": true, 00:16:11.888 "get_zone_info": false, 00:16:11.888 "zone_management": false, 00:16:11.888 "zone_append": false, 00:16:11.888 "compare": false, 00:16:11.888 "compare_and_write": false, 00:16:11.888 "abort": true, 00:16:11.888 "seek_hole": false, 00:16:11.888 "seek_data": false, 00:16:11.888 "copy": true, 00:16:11.888 "nvme_iov_md": false 00:16:11.888 }, 00:16:11.888 "memory_domains": [ 00:16:11.888 { 00:16:11.888 "dma_device_id": "system", 00:16:11.888 "dma_device_type": 1 00:16:11.888 }, 00:16:11.888 { 00:16:11.888 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.888 "dma_device_type": 2 00:16:11.888 } 00:16:11.888 ], 00:16:11.888 "driver_specific": {} 00:16:11.888 }' 00:16:11.888 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:11.888 17:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:11.888 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:11.888 17:28:23 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:11.888 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:11.888 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:11.888 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:11.888 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:11.888 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:11.888 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:12.147 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:12.147 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:12.147 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:12.407 [2024-07-15 17:28:23.445741] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:12.407 [2024-07-15 17:28:23.445760] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:12.407 [2024-07-15 17:28:23.445795] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:12.407 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:12.407 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:16:12.407 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:12.407 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:12.407 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:16:12.407 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:16:12.407 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:12.407 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:16:12.407 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:12.407 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:12.407 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:12.407 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:12.407 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:12.407 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:12.407 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:12.407 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:12.407 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:12.407 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:12.407 
"name": "Existed_Raid", 00:16:12.407 "uuid": "9bf6212c-8a58-483c-ae38-08f9bb784d29", 00:16:12.407 "strip_size_kb": 64, 00:16:12.407 "state": "offline", 00:16:12.407 "raid_level": "raid0", 00:16:12.407 "superblock": false, 00:16:12.407 "num_base_bdevs": 4, 00:16:12.407 "num_base_bdevs_discovered": 3, 00:16:12.407 "num_base_bdevs_operational": 3, 00:16:12.407 "base_bdevs_list": [ 00:16:12.407 { 00:16:12.407 "name": null, 00:16:12.407 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:12.407 "is_configured": false, 00:16:12.407 "data_offset": 0, 00:16:12.407 "data_size": 65536 00:16:12.407 }, 00:16:12.407 { 00:16:12.407 "name": "BaseBdev2", 00:16:12.407 "uuid": "8a61ac7b-ebbc-4489-ae14-0ac7b73f71fa", 00:16:12.407 "is_configured": true, 00:16:12.407 "data_offset": 0, 00:16:12.407 "data_size": 65536 00:16:12.407 }, 00:16:12.407 { 00:16:12.407 "name": "BaseBdev3", 00:16:12.407 "uuid": "8f87697a-3e4f-4502-bee5-68605a509aa7", 00:16:12.407 "is_configured": true, 00:16:12.407 "data_offset": 0, 00:16:12.407 "data_size": 65536 00:16:12.407 }, 00:16:12.407 { 00:16:12.407 "name": "BaseBdev4", 00:16:12.407 "uuid": "fd90e514-2429-45a9-ba4e-829e22d88d56", 00:16:12.407 "is_configured": true, 00:16:12.407 "data_offset": 0, 00:16:12.407 "data_size": 65536 00:16:12.407 } 00:16:12.407 ] 00:16:12.407 }' 00:16:12.407 17:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:12.407 17:28:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:12.977 17:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:12.977 17:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:12.977 17:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:12.977 17:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:13.239 17:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:13.239 17:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:13.239 17:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:13.499 [2024-07-15 17:28:24.564570] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:13.499 17:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:13.499 17:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:13.499 17:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:13.499 17:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:13.499 17:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:13.499 17:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:13.500 17:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:13.760 
[2024-07-15 17:28:24.955407] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:13.760 17:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:13.760 17:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:13.760 17:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:13.760 17:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:14.021 17:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:14.021 17:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:14.021 17:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:16:14.282 [2024-07-15 17:28:25.342180] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:16:14.282 [2024-07-15 17:28:25.342209] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbd8fc0 name Existed_Raid, state offline 00:16:14.282 17:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:14.282 17:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:14.282 17:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:14.282 17:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:14.282 17:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:14.282 17:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:14.282 17:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:16:14.282 17:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:14.282 17:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:14.282 17:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:14.542 BaseBdev2 00:16:14.542 17:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:14.542 17:28:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:14.542 17:28:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:14.542 17:28:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:14.542 17:28:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:14.542 17:28:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:14.542 17:28:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:14.802 17:28:25 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:15.063 [ 00:16:15.063 { 00:16:15.063 "name": "BaseBdev2", 00:16:15.063 "aliases": [ 00:16:15.063 "435e9b3f-3d3c-42f2-b839-38e40d2e46c1" 00:16:15.063 ], 00:16:15.063 "product_name": "Malloc disk", 00:16:15.063 "block_size": 512, 00:16:15.063 "num_blocks": 65536, 00:16:15.063 "uuid": "435e9b3f-3d3c-42f2-b839-38e40d2e46c1", 00:16:15.063 "assigned_rate_limits": { 00:16:15.063 "rw_ios_per_sec": 0, 00:16:15.063 "rw_mbytes_per_sec": 0, 00:16:15.063 "r_mbytes_per_sec": 0, 00:16:15.063 "w_mbytes_per_sec": 0 00:16:15.063 }, 00:16:15.063 "claimed": false, 00:16:15.063 "zoned": false, 00:16:15.063 "supported_io_types": { 00:16:15.063 "read": true, 00:16:15.063 "write": true, 00:16:15.063 "unmap": true, 00:16:15.063 "flush": true, 00:16:15.063 "reset": true, 00:16:15.063 "nvme_admin": false, 00:16:15.063 "nvme_io": false, 00:16:15.063 "nvme_io_md": false, 00:16:15.063 "write_zeroes": true, 00:16:15.063 "zcopy": true, 00:16:15.063 "get_zone_info": false, 00:16:15.063 "zone_management": false, 00:16:15.063 "zone_append": false, 00:16:15.063 "compare": false, 00:16:15.063 "compare_and_write": false, 00:16:15.063 "abort": true, 00:16:15.063 "seek_hole": false, 00:16:15.063 "seek_data": false, 00:16:15.063 "copy": true, 00:16:15.063 "nvme_iov_md": false 00:16:15.063 }, 00:16:15.063 "memory_domains": [ 00:16:15.063 { 00:16:15.063 "dma_device_id": "system", 00:16:15.063 "dma_device_type": 1 00:16:15.063 }, 00:16:15.063 { 00:16:15.063 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:15.063 "dma_device_type": 2 00:16:15.063 } 00:16:15.063 ], 00:16:15.063 "driver_specific": {} 00:16:15.063 } 00:16:15.063 ] 00:16:15.063 17:28:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:15.063 17:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:15.063 17:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:15.063 17:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:15.063 BaseBdev3 00:16:15.063 17:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:15.063 17:28:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:15.063 17:28:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:15.063 17:28:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:15.063 17:28:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:15.063 17:28:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:15.063 17:28:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:15.323 17:28:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:15.608 [ 00:16:15.608 { 00:16:15.608 "name": "BaseBdev3", 00:16:15.608 "aliases": [ 00:16:15.608 
"e0f2c56b-8747-4b1b-8f3c-65f9bb7bb408" 00:16:15.608 ], 00:16:15.608 "product_name": "Malloc disk", 00:16:15.608 "block_size": 512, 00:16:15.608 "num_blocks": 65536, 00:16:15.608 "uuid": "e0f2c56b-8747-4b1b-8f3c-65f9bb7bb408", 00:16:15.608 "assigned_rate_limits": { 00:16:15.608 "rw_ios_per_sec": 0, 00:16:15.608 "rw_mbytes_per_sec": 0, 00:16:15.608 "r_mbytes_per_sec": 0, 00:16:15.608 "w_mbytes_per_sec": 0 00:16:15.608 }, 00:16:15.608 "claimed": false, 00:16:15.608 "zoned": false, 00:16:15.608 "supported_io_types": { 00:16:15.608 "read": true, 00:16:15.608 "write": true, 00:16:15.608 "unmap": true, 00:16:15.608 "flush": true, 00:16:15.608 "reset": true, 00:16:15.608 "nvme_admin": false, 00:16:15.608 "nvme_io": false, 00:16:15.608 "nvme_io_md": false, 00:16:15.608 "write_zeroes": true, 00:16:15.608 "zcopy": true, 00:16:15.608 "get_zone_info": false, 00:16:15.608 "zone_management": false, 00:16:15.608 "zone_append": false, 00:16:15.608 "compare": false, 00:16:15.608 "compare_and_write": false, 00:16:15.608 "abort": true, 00:16:15.608 "seek_hole": false, 00:16:15.608 "seek_data": false, 00:16:15.609 "copy": true, 00:16:15.609 "nvme_iov_md": false 00:16:15.609 }, 00:16:15.609 "memory_domains": [ 00:16:15.609 { 00:16:15.609 "dma_device_id": "system", 00:16:15.609 "dma_device_type": 1 00:16:15.609 }, 00:16:15.609 { 00:16:15.609 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:15.609 "dma_device_type": 2 00:16:15.609 } 00:16:15.609 ], 00:16:15.609 "driver_specific": {} 00:16:15.609 } 00:16:15.609 ] 00:16:15.609 17:28:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:15.609 17:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:15.609 17:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:15.609 17:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:15.609 BaseBdev4 00:16:15.609 17:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:16:15.609 17:28:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:16:15.609 17:28:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:15.609 17:28:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:15.609 17:28:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:15.609 17:28:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:15.609 17:28:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:15.868 17:28:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:16.128 [ 00:16:16.128 { 00:16:16.128 "name": "BaseBdev4", 00:16:16.128 "aliases": [ 00:16:16.128 "9bf1bdf6-d433-4550-b84e-7fe0d345458b" 00:16:16.128 ], 00:16:16.128 "product_name": "Malloc disk", 00:16:16.128 "block_size": 512, 00:16:16.128 "num_blocks": 65536, 00:16:16.128 "uuid": "9bf1bdf6-d433-4550-b84e-7fe0d345458b", 00:16:16.128 "assigned_rate_limits": { 00:16:16.128 
"rw_ios_per_sec": 0, 00:16:16.128 "rw_mbytes_per_sec": 0, 00:16:16.128 "r_mbytes_per_sec": 0, 00:16:16.128 "w_mbytes_per_sec": 0 00:16:16.128 }, 00:16:16.128 "claimed": false, 00:16:16.128 "zoned": false, 00:16:16.128 "supported_io_types": { 00:16:16.128 "read": true, 00:16:16.128 "write": true, 00:16:16.128 "unmap": true, 00:16:16.128 "flush": true, 00:16:16.128 "reset": true, 00:16:16.128 "nvme_admin": false, 00:16:16.128 "nvme_io": false, 00:16:16.128 "nvme_io_md": false, 00:16:16.128 "write_zeroes": true, 00:16:16.128 "zcopy": true, 00:16:16.128 "get_zone_info": false, 00:16:16.128 "zone_management": false, 00:16:16.128 "zone_append": false, 00:16:16.128 "compare": false, 00:16:16.128 "compare_and_write": false, 00:16:16.128 "abort": true, 00:16:16.128 "seek_hole": false, 00:16:16.128 "seek_data": false, 00:16:16.128 "copy": true, 00:16:16.128 "nvme_iov_md": false 00:16:16.128 }, 00:16:16.128 "memory_domains": [ 00:16:16.128 { 00:16:16.128 "dma_device_id": "system", 00:16:16.128 "dma_device_type": 1 00:16:16.128 }, 00:16:16.128 { 00:16:16.128 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:16.128 "dma_device_type": 2 00:16:16.128 } 00:16:16.128 ], 00:16:16.128 "driver_specific": {} 00:16:16.128 } 00:16:16.128 ] 00:16:16.128 17:28:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:16.128 17:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:16.128 17:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:16.128 17:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:16.128 [2024-07-15 17:28:27.405469] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:16.128 [2024-07-15 17:28:27.405501] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:16.128 [2024-07-15 17:28:27.405514] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:16.128 [2024-07-15 17:28:27.406549] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:16.128 [2024-07-15 17:28:27.406582] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:16.128 17:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:16.128 17:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:16.128 17:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:16.128 17:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:16.128 17:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:16.128 17:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:16.128 17:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:16.128 17:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:16.128 17:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:16.128 17:28:27 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:16.128 17:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:16.128 17:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:16.389 17:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:16.389 "name": "Existed_Raid", 00:16:16.389 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:16.389 "strip_size_kb": 64, 00:16:16.389 "state": "configuring", 00:16:16.389 "raid_level": "raid0", 00:16:16.389 "superblock": false, 00:16:16.389 "num_base_bdevs": 4, 00:16:16.389 "num_base_bdevs_discovered": 3, 00:16:16.389 "num_base_bdevs_operational": 4, 00:16:16.389 "base_bdevs_list": [ 00:16:16.389 { 00:16:16.389 "name": "BaseBdev1", 00:16:16.389 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:16.389 "is_configured": false, 00:16:16.389 "data_offset": 0, 00:16:16.389 "data_size": 0 00:16:16.389 }, 00:16:16.389 { 00:16:16.389 "name": "BaseBdev2", 00:16:16.389 "uuid": "435e9b3f-3d3c-42f2-b839-38e40d2e46c1", 00:16:16.389 "is_configured": true, 00:16:16.389 "data_offset": 0, 00:16:16.389 "data_size": 65536 00:16:16.389 }, 00:16:16.389 { 00:16:16.389 "name": "BaseBdev3", 00:16:16.389 "uuid": "e0f2c56b-8747-4b1b-8f3c-65f9bb7bb408", 00:16:16.389 "is_configured": true, 00:16:16.389 "data_offset": 0, 00:16:16.389 "data_size": 65536 00:16:16.389 }, 00:16:16.389 { 00:16:16.389 "name": "BaseBdev4", 00:16:16.389 "uuid": "9bf1bdf6-d433-4550-b84e-7fe0d345458b", 00:16:16.389 "is_configured": true, 00:16:16.389 "data_offset": 0, 00:16:16.389 "data_size": 65536 00:16:16.389 } 00:16:16.389 ] 00:16:16.389 }' 00:16:16.389 17:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:16.389 17:28:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:16.961 17:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:17.221 [2024-07-15 17:28:28.323758] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:17.221 17:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:17.221 17:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:17.221 17:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:17.221 17:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:17.221 17:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:17.221 17:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:17.221 17:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:17.221 17:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:17.221 17:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:17.221 17:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:17.221 17:28:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:17.221 17:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:17.481 17:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:17.481 "name": "Existed_Raid", 00:16:17.481 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:17.481 "strip_size_kb": 64, 00:16:17.481 "state": "configuring", 00:16:17.481 "raid_level": "raid0", 00:16:17.481 "superblock": false, 00:16:17.481 "num_base_bdevs": 4, 00:16:17.481 "num_base_bdevs_discovered": 2, 00:16:17.481 "num_base_bdevs_operational": 4, 00:16:17.481 "base_bdevs_list": [ 00:16:17.481 { 00:16:17.481 "name": "BaseBdev1", 00:16:17.481 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:17.481 "is_configured": false, 00:16:17.481 "data_offset": 0, 00:16:17.481 "data_size": 0 00:16:17.481 }, 00:16:17.481 { 00:16:17.481 "name": null, 00:16:17.481 "uuid": "435e9b3f-3d3c-42f2-b839-38e40d2e46c1", 00:16:17.481 "is_configured": false, 00:16:17.481 "data_offset": 0, 00:16:17.481 "data_size": 65536 00:16:17.481 }, 00:16:17.481 { 00:16:17.481 "name": "BaseBdev3", 00:16:17.481 "uuid": "e0f2c56b-8747-4b1b-8f3c-65f9bb7bb408", 00:16:17.481 "is_configured": true, 00:16:17.481 "data_offset": 0, 00:16:17.481 "data_size": 65536 00:16:17.481 }, 00:16:17.481 { 00:16:17.481 "name": "BaseBdev4", 00:16:17.481 "uuid": "9bf1bdf6-d433-4550-b84e-7fe0d345458b", 00:16:17.481 "is_configured": true, 00:16:17.481 "data_offset": 0, 00:16:17.481 "data_size": 65536 00:16:17.481 } 00:16:17.481 ] 00:16:17.481 }' 00:16:17.481 17:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:17.481 17:28:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:18.051 17:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:18.051 17:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:18.051 17:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:18.051 17:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:18.311 [2024-07-15 17:28:29.451670] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:18.311 BaseBdev1 00:16:18.311 17:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:18.311 17:28:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:18.311 17:28:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:18.311 17:28:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:18.311 17:28:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:18.311 17:28:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:18.311 17:28:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:18.572 17:28:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:18.572 [ 00:16:18.572 { 00:16:18.572 "name": "BaseBdev1", 00:16:18.572 "aliases": [ 00:16:18.572 "138a90fa-f1ee-4930-8303-8561f0aa3532" 00:16:18.572 ], 00:16:18.572 "product_name": "Malloc disk", 00:16:18.572 "block_size": 512, 00:16:18.572 "num_blocks": 65536, 00:16:18.572 "uuid": "138a90fa-f1ee-4930-8303-8561f0aa3532", 00:16:18.572 "assigned_rate_limits": { 00:16:18.572 "rw_ios_per_sec": 0, 00:16:18.572 "rw_mbytes_per_sec": 0, 00:16:18.572 "r_mbytes_per_sec": 0, 00:16:18.572 "w_mbytes_per_sec": 0 00:16:18.572 }, 00:16:18.572 "claimed": true, 00:16:18.572 "claim_type": "exclusive_write", 00:16:18.572 "zoned": false, 00:16:18.572 "supported_io_types": { 00:16:18.572 "read": true, 00:16:18.572 "write": true, 00:16:18.572 "unmap": true, 00:16:18.572 "flush": true, 00:16:18.572 "reset": true, 00:16:18.572 "nvme_admin": false, 00:16:18.572 "nvme_io": false, 00:16:18.572 "nvme_io_md": false, 00:16:18.572 "write_zeroes": true, 00:16:18.572 "zcopy": true, 00:16:18.572 "get_zone_info": false, 00:16:18.572 "zone_management": false, 00:16:18.572 "zone_append": false, 00:16:18.572 "compare": false, 00:16:18.572 "compare_and_write": false, 00:16:18.572 "abort": true, 00:16:18.572 "seek_hole": false, 00:16:18.572 "seek_data": false, 00:16:18.572 "copy": true, 00:16:18.572 "nvme_iov_md": false 00:16:18.572 }, 00:16:18.572 "memory_domains": [ 00:16:18.572 { 00:16:18.572 "dma_device_id": "system", 00:16:18.572 "dma_device_type": 1 00:16:18.572 }, 00:16:18.572 { 00:16:18.572 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:18.572 "dma_device_type": 2 00:16:18.572 } 00:16:18.572 ], 00:16:18.572 "driver_specific": {} 00:16:18.572 } 00:16:18.572 ] 00:16:18.572 17:28:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:18.572 17:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:18.572 17:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:18.572 17:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:18.572 17:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:18.572 17:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:18.572 17:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:18.572 17:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:18.572 17:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:18.572 17:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:18.572 17:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:18.572 17:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:18.572 17:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] 
| select(.name == "Existed_Raid")' 00:16:18.832 17:28:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:18.832 "name": "Existed_Raid", 00:16:18.832 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:18.832 "strip_size_kb": 64, 00:16:18.832 "state": "configuring", 00:16:18.832 "raid_level": "raid0", 00:16:18.832 "superblock": false, 00:16:18.832 "num_base_bdevs": 4, 00:16:18.832 "num_base_bdevs_discovered": 3, 00:16:18.832 "num_base_bdevs_operational": 4, 00:16:18.832 "base_bdevs_list": [ 00:16:18.832 { 00:16:18.832 "name": "BaseBdev1", 00:16:18.832 "uuid": "138a90fa-f1ee-4930-8303-8561f0aa3532", 00:16:18.832 "is_configured": true, 00:16:18.832 "data_offset": 0, 00:16:18.832 "data_size": 65536 00:16:18.832 }, 00:16:18.832 { 00:16:18.832 "name": null, 00:16:18.832 "uuid": "435e9b3f-3d3c-42f2-b839-38e40d2e46c1", 00:16:18.832 "is_configured": false, 00:16:18.832 "data_offset": 0, 00:16:18.832 "data_size": 65536 00:16:18.832 }, 00:16:18.832 { 00:16:18.832 "name": "BaseBdev3", 00:16:18.832 "uuid": "e0f2c56b-8747-4b1b-8f3c-65f9bb7bb408", 00:16:18.832 "is_configured": true, 00:16:18.832 "data_offset": 0, 00:16:18.832 "data_size": 65536 00:16:18.832 }, 00:16:18.832 { 00:16:18.832 "name": "BaseBdev4", 00:16:18.832 "uuid": "9bf1bdf6-d433-4550-b84e-7fe0d345458b", 00:16:18.832 "is_configured": true, 00:16:18.832 "data_offset": 0, 00:16:18.832 "data_size": 65536 00:16:18.832 } 00:16:18.832 ] 00:16:18.832 }' 00:16:18.832 17:28:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:18.832 17:28:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:19.402 17:28:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:19.402 17:28:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:19.662 17:28:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:19.662 17:28:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:19.662 [2024-07-15 17:28:30.947484] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:19.921 17:28:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:19.921 17:28:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:19.921 17:28:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:19.921 17:28:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:19.921 17:28:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:19.921 17:28:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:19.921 17:28:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:19.921 17:28:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:19.921 17:28:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:19.921 17:28:30 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:16:19.921 17:28:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:19.921 17:28:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:19.921 17:28:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:19.921 "name": "Existed_Raid", 00:16:19.921 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:19.921 "strip_size_kb": 64, 00:16:19.921 "state": "configuring", 00:16:19.921 "raid_level": "raid0", 00:16:19.921 "superblock": false, 00:16:19.921 "num_base_bdevs": 4, 00:16:19.921 "num_base_bdevs_discovered": 2, 00:16:19.921 "num_base_bdevs_operational": 4, 00:16:19.921 "base_bdevs_list": [ 00:16:19.921 { 00:16:19.921 "name": "BaseBdev1", 00:16:19.921 "uuid": "138a90fa-f1ee-4930-8303-8561f0aa3532", 00:16:19.921 "is_configured": true, 00:16:19.921 "data_offset": 0, 00:16:19.921 "data_size": 65536 00:16:19.921 }, 00:16:19.921 { 00:16:19.921 "name": null, 00:16:19.921 "uuid": "435e9b3f-3d3c-42f2-b839-38e40d2e46c1", 00:16:19.921 "is_configured": false, 00:16:19.921 "data_offset": 0, 00:16:19.921 "data_size": 65536 00:16:19.921 }, 00:16:19.921 { 00:16:19.921 "name": null, 00:16:19.921 "uuid": "e0f2c56b-8747-4b1b-8f3c-65f9bb7bb408", 00:16:19.921 "is_configured": false, 00:16:19.921 "data_offset": 0, 00:16:19.921 "data_size": 65536 00:16:19.921 }, 00:16:19.921 { 00:16:19.921 "name": "BaseBdev4", 00:16:19.921 "uuid": "9bf1bdf6-d433-4550-b84e-7fe0d345458b", 00:16:19.921 "is_configured": true, 00:16:19.921 "data_offset": 0, 00:16:19.921 "data_size": 65536 00:16:19.921 } 00:16:19.921 ] 00:16:19.921 }' 00:16:19.921 17:28:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:19.921 17:28:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:20.491 17:28:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.491 17:28:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:20.752 17:28:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:20.752 17:28:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:20.752 [2024-07-15 17:28:32.046286] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:21.012 17:28:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:21.012 17:28:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:21.012 17:28:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:21.012 17:28:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:21.012 17:28:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:21.012 17:28:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:21.012 17:28:32 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:21.012 17:28:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:21.012 17:28:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:21.012 17:28:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:21.012 17:28:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:21.012 17:28:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:21.012 17:28:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:21.012 "name": "Existed_Raid", 00:16:21.012 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:21.012 "strip_size_kb": 64, 00:16:21.012 "state": "configuring", 00:16:21.012 "raid_level": "raid0", 00:16:21.012 "superblock": false, 00:16:21.012 "num_base_bdevs": 4, 00:16:21.012 "num_base_bdevs_discovered": 3, 00:16:21.012 "num_base_bdevs_operational": 4, 00:16:21.012 "base_bdevs_list": [ 00:16:21.012 { 00:16:21.012 "name": "BaseBdev1", 00:16:21.012 "uuid": "138a90fa-f1ee-4930-8303-8561f0aa3532", 00:16:21.012 "is_configured": true, 00:16:21.012 "data_offset": 0, 00:16:21.012 "data_size": 65536 00:16:21.012 }, 00:16:21.012 { 00:16:21.012 "name": null, 00:16:21.012 "uuid": "435e9b3f-3d3c-42f2-b839-38e40d2e46c1", 00:16:21.012 "is_configured": false, 00:16:21.012 "data_offset": 0, 00:16:21.012 "data_size": 65536 00:16:21.012 }, 00:16:21.012 { 00:16:21.012 "name": "BaseBdev3", 00:16:21.012 "uuid": "e0f2c56b-8747-4b1b-8f3c-65f9bb7bb408", 00:16:21.012 "is_configured": true, 00:16:21.012 "data_offset": 0, 00:16:21.012 "data_size": 65536 00:16:21.012 }, 00:16:21.012 { 00:16:21.012 "name": "BaseBdev4", 00:16:21.012 "uuid": "9bf1bdf6-d433-4550-b84e-7fe0d345458b", 00:16:21.012 "is_configured": true, 00:16:21.012 "data_offset": 0, 00:16:21.012 "data_size": 65536 00:16:21.012 } 00:16:21.012 ] 00:16:21.012 }' 00:16:21.012 17:28:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:21.012 17:28:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:21.647 17:28:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:21.647 17:28:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:21.907 17:28:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:21.907 17:28:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:21.907 [2024-07-15 17:28:33.165127] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:21.907 17:28:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:21.907 17:28:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:21.907 17:28:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:21.907 17:28:33 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:21.907 17:28:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:21.907 17:28:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:21.907 17:28:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:21.907 17:28:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:21.907 17:28:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:21.907 17:28:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:21.907 17:28:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:21.907 17:28:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:22.166 17:28:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:22.166 "name": "Existed_Raid", 00:16:22.166 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:22.166 "strip_size_kb": 64, 00:16:22.166 "state": "configuring", 00:16:22.166 "raid_level": "raid0", 00:16:22.166 "superblock": false, 00:16:22.166 "num_base_bdevs": 4, 00:16:22.166 "num_base_bdevs_discovered": 2, 00:16:22.166 "num_base_bdevs_operational": 4, 00:16:22.166 "base_bdevs_list": [ 00:16:22.166 { 00:16:22.166 "name": null, 00:16:22.166 "uuid": "138a90fa-f1ee-4930-8303-8561f0aa3532", 00:16:22.166 "is_configured": false, 00:16:22.166 "data_offset": 0, 00:16:22.166 "data_size": 65536 00:16:22.166 }, 00:16:22.166 { 00:16:22.166 "name": null, 00:16:22.166 "uuid": "435e9b3f-3d3c-42f2-b839-38e40d2e46c1", 00:16:22.166 "is_configured": false, 00:16:22.166 "data_offset": 0, 00:16:22.166 "data_size": 65536 00:16:22.166 }, 00:16:22.166 { 00:16:22.166 "name": "BaseBdev3", 00:16:22.166 "uuid": "e0f2c56b-8747-4b1b-8f3c-65f9bb7bb408", 00:16:22.166 "is_configured": true, 00:16:22.166 "data_offset": 0, 00:16:22.166 "data_size": 65536 00:16:22.166 }, 00:16:22.166 { 00:16:22.166 "name": "BaseBdev4", 00:16:22.166 "uuid": "9bf1bdf6-d433-4550-b84e-7fe0d345458b", 00:16:22.166 "is_configured": true, 00:16:22.166 "data_offset": 0, 00:16:22.166 "data_size": 65536 00:16:22.166 } 00:16:22.166 ] 00:16:22.166 }' 00:16:22.166 17:28:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:22.166 17:28:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:22.737 17:28:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:22.737 17:28:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:22.997 17:28:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:22.997 17:28:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:22.997 [2024-07-15 17:28:34.289830] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:23.258 17:28:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:23.258 17:28:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:23.258 17:28:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:23.258 17:28:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:23.258 17:28:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:23.258 17:28:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:23.258 17:28:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:23.258 17:28:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:23.258 17:28:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:23.258 17:28:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:23.258 17:28:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:23.258 17:28:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:23.258 17:28:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:23.258 "name": "Existed_Raid", 00:16:23.258 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:23.258 "strip_size_kb": 64, 00:16:23.258 "state": "configuring", 00:16:23.258 "raid_level": "raid0", 00:16:23.258 "superblock": false, 00:16:23.258 "num_base_bdevs": 4, 00:16:23.258 "num_base_bdevs_discovered": 3, 00:16:23.258 "num_base_bdevs_operational": 4, 00:16:23.258 "base_bdevs_list": [ 00:16:23.258 { 00:16:23.258 "name": null, 00:16:23.258 "uuid": "138a90fa-f1ee-4930-8303-8561f0aa3532", 00:16:23.258 "is_configured": false, 00:16:23.258 "data_offset": 0, 00:16:23.258 "data_size": 65536 00:16:23.258 }, 00:16:23.258 { 00:16:23.258 "name": "BaseBdev2", 00:16:23.258 "uuid": "435e9b3f-3d3c-42f2-b839-38e40d2e46c1", 00:16:23.258 "is_configured": true, 00:16:23.258 "data_offset": 0, 00:16:23.258 "data_size": 65536 00:16:23.258 }, 00:16:23.258 { 00:16:23.258 "name": "BaseBdev3", 00:16:23.258 "uuid": "e0f2c56b-8747-4b1b-8f3c-65f9bb7bb408", 00:16:23.258 "is_configured": true, 00:16:23.258 "data_offset": 0, 00:16:23.258 "data_size": 65536 00:16:23.258 }, 00:16:23.258 { 00:16:23.258 "name": "BaseBdev4", 00:16:23.258 "uuid": "9bf1bdf6-d433-4550-b84e-7fe0d345458b", 00:16:23.258 "is_configured": true, 00:16:23.258 "data_offset": 0, 00:16:23.258 "data_size": 65536 00:16:23.258 } 00:16:23.258 ] 00:16:23.258 }' 00:16:23.258 17:28:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:23.258 17:28:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:23.830 17:28:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:23.830 17:28:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:24.091 17:28:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:24.091 
17:28:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:24.091 17:28:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:24.351 17:28:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 138a90fa-f1ee-4930-8303-8561f0aa3532 00:16:24.351 [2024-07-15 17:28:35.618236] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:24.351 [2024-07-15 17:28:35.618264] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xbd8ba0 00:16:24.352 [2024-07-15 17:28:35.618268] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:16:24.352 [2024-07-15 17:28:35.618416] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbddbd0 00:16:24.352 [2024-07-15 17:28:35.618513] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbd8ba0 00:16:24.352 [2024-07-15 17:28:35.618519] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xbd8ba0 00:16:24.352 [2024-07-15 17:28:35.618638] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:24.352 NewBaseBdev 00:16:24.352 17:28:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:24.352 17:28:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:24.352 17:28:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:24.352 17:28:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:24.352 17:28:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:24.352 17:28:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:24.352 17:28:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:24.612 17:28:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:24.872 [ 00:16:24.872 { 00:16:24.872 "name": "NewBaseBdev", 00:16:24.872 "aliases": [ 00:16:24.872 "138a90fa-f1ee-4930-8303-8561f0aa3532" 00:16:24.872 ], 00:16:24.872 "product_name": "Malloc disk", 00:16:24.872 "block_size": 512, 00:16:24.872 "num_blocks": 65536, 00:16:24.872 "uuid": "138a90fa-f1ee-4930-8303-8561f0aa3532", 00:16:24.872 "assigned_rate_limits": { 00:16:24.872 "rw_ios_per_sec": 0, 00:16:24.872 "rw_mbytes_per_sec": 0, 00:16:24.872 "r_mbytes_per_sec": 0, 00:16:24.872 "w_mbytes_per_sec": 0 00:16:24.872 }, 00:16:24.872 "claimed": true, 00:16:24.872 "claim_type": "exclusive_write", 00:16:24.872 "zoned": false, 00:16:24.872 "supported_io_types": { 00:16:24.872 "read": true, 00:16:24.872 "write": true, 00:16:24.872 "unmap": true, 00:16:24.872 "flush": true, 00:16:24.872 "reset": true, 00:16:24.872 "nvme_admin": false, 00:16:24.872 "nvme_io": false, 00:16:24.872 "nvme_io_md": false, 00:16:24.872 "write_zeroes": true, 00:16:24.872 "zcopy": true, 
00:16:24.872 "get_zone_info": false, 00:16:24.872 "zone_management": false, 00:16:24.872 "zone_append": false, 00:16:24.872 "compare": false, 00:16:24.872 "compare_and_write": false, 00:16:24.872 "abort": true, 00:16:24.872 "seek_hole": false, 00:16:24.872 "seek_data": false, 00:16:24.872 "copy": true, 00:16:24.872 "nvme_iov_md": false 00:16:24.872 }, 00:16:24.872 "memory_domains": [ 00:16:24.872 { 00:16:24.872 "dma_device_id": "system", 00:16:24.872 "dma_device_type": 1 00:16:24.872 }, 00:16:24.872 { 00:16:24.872 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.872 "dma_device_type": 2 00:16:24.872 } 00:16:24.872 ], 00:16:24.872 "driver_specific": {} 00:16:24.872 } 00:16:24.872 ] 00:16:24.872 17:28:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:24.872 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:16:24.872 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:24.872 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:24.872 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:24.872 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:24.872 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:24.872 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:24.872 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:24.872 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:24.872 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:24.872 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:24.872 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:25.132 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:25.132 "name": "Existed_Raid", 00:16:25.132 "uuid": "f13dfde4-52fb-433f-a9f7-9e4752983161", 00:16:25.132 "strip_size_kb": 64, 00:16:25.132 "state": "online", 00:16:25.132 "raid_level": "raid0", 00:16:25.132 "superblock": false, 00:16:25.132 "num_base_bdevs": 4, 00:16:25.132 "num_base_bdevs_discovered": 4, 00:16:25.132 "num_base_bdevs_operational": 4, 00:16:25.132 "base_bdevs_list": [ 00:16:25.132 { 00:16:25.132 "name": "NewBaseBdev", 00:16:25.132 "uuid": "138a90fa-f1ee-4930-8303-8561f0aa3532", 00:16:25.132 "is_configured": true, 00:16:25.132 "data_offset": 0, 00:16:25.132 "data_size": 65536 00:16:25.132 }, 00:16:25.132 { 00:16:25.132 "name": "BaseBdev2", 00:16:25.132 "uuid": "435e9b3f-3d3c-42f2-b839-38e40d2e46c1", 00:16:25.132 "is_configured": true, 00:16:25.132 "data_offset": 0, 00:16:25.132 "data_size": 65536 00:16:25.132 }, 00:16:25.132 { 00:16:25.132 "name": "BaseBdev3", 00:16:25.132 "uuid": "e0f2c56b-8747-4b1b-8f3c-65f9bb7bb408", 00:16:25.132 "is_configured": true, 00:16:25.132 "data_offset": 0, 00:16:25.132 "data_size": 65536 00:16:25.132 }, 00:16:25.132 { 00:16:25.132 "name": "BaseBdev4", 00:16:25.132 "uuid": 
"9bf1bdf6-d433-4550-b84e-7fe0d345458b", 00:16:25.132 "is_configured": true, 00:16:25.132 "data_offset": 0, 00:16:25.132 "data_size": 65536 00:16:25.132 } 00:16:25.132 ] 00:16:25.132 }' 00:16:25.132 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:25.132 17:28:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:25.705 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:25.705 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:25.705 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:25.705 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:25.705 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:25.705 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:25.705 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:25.705 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:25.705 [2024-07-15 17:28:36.901779] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:25.705 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:25.705 "name": "Existed_Raid", 00:16:25.705 "aliases": [ 00:16:25.705 "f13dfde4-52fb-433f-a9f7-9e4752983161" 00:16:25.705 ], 00:16:25.705 "product_name": "Raid Volume", 00:16:25.705 "block_size": 512, 00:16:25.705 "num_blocks": 262144, 00:16:25.705 "uuid": "f13dfde4-52fb-433f-a9f7-9e4752983161", 00:16:25.705 "assigned_rate_limits": { 00:16:25.705 "rw_ios_per_sec": 0, 00:16:25.705 "rw_mbytes_per_sec": 0, 00:16:25.705 "r_mbytes_per_sec": 0, 00:16:25.705 "w_mbytes_per_sec": 0 00:16:25.705 }, 00:16:25.705 "claimed": false, 00:16:25.705 "zoned": false, 00:16:25.705 "supported_io_types": { 00:16:25.705 "read": true, 00:16:25.705 "write": true, 00:16:25.705 "unmap": true, 00:16:25.705 "flush": true, 00:16:25.705 "reset": true, 00:16:25.705 "nvme_admin": false, 00:16:25.705 "nvme_io": false, 00:16:25.705 "nvme_io_md": false, 00:16:25.705 "write_zeroes": true, 00:16:25.705 "zcopy": false, 00:16:25.705 "get_zone_info": false, 00:16:25.705 "zone_management": false, 00:16:25.705 "zone_append": false, 00:16:25.705 "compare": false, 00:16:25.705 "compare_and_write": false, 00:16:25.705 "abort": false, 00:16:25.705 "seek_hole": false, 00:16:25.705 "seek_data": false, 00:16:25.705 "copy": false, 00:16:25.705 "nvme_iov_md": false 00:16:25.705 }, 00:16:25.705 "memory_domains": [ 00:16:25.705 { 00:16:25.705 "dma_device_id": "system", 00:16:25.705 "dma_device_type": 1 00:16:25.705 }, 00:16:25.705 { 00:16:25.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:25.705 "dma_device_type": 2 00:16:25.705 }, 00:16:25.705 { 00:16:25.705 "dma_device_id": "system", 00:16:25.705 "dma_device_type": 1 00:16:25.705 }, 00:16:25.705 { 00:16:25.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:25.705 "dma_device_type": 2 00:16:25.705 }, 00:16:25.705 { 00:16:25.705 "dma_device_id": "system", 00:16:25.705 "dma_device_type": 1 00:16:25.705 }, 00:16:25.705 { 00:16:25.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:25.705 "dma_device_type": 2 00:16:25.705 }, 
00:16:25.705 { 00:16:25.705 "dma_device_id": "system", 00:16:25.705 "dma_device_type": 1 00:16:25.705 }, 00:16:25.705 { 00:16:25.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:25.705 "dma_device_type": 2 00:16:25.705 } 00:16:25.705 ], 00:16:25.705 "driver_specific": { 00:16:25.705 "raid": { 00:16:25.705 "uuid": "f13dfde4-52fb-433f-a9f7-9e4752983161", 00:16:25.705 "strip_size_kb": 64, 00:16:25.705 "state": "online", 00:16:25.705 "raid_level": "raid0", 00:16:25.705 "superblock": false, 00:16:25.705 "num_base_bdevs": 4, 00:16:25.705 "num_base_bdevs_discovered": 4, 00:16:25.705 "num_base_bdevs_operational": 4, 00:16:25.705 "base_bdevs_list": [ 00:16:25.705 { 00:16:25.705 "name": "NewBaseBdev", 00:16:25.705 "uuid": "138a90fa-f1ee-4930-8303-8561f0aa3532", 00:16:25.705 "is_configured": true, 00:16:25.705 "data_offset": 0, 00:16:25.705 "data_size": 65536 00:16:25.705 }, 00:16:25.705 { 00:16:25.705 "name": "BaseBdev2", 00:16:25.705 "uuid": "435e9b3f-3d3c-42f2-b839-38e40d2e46c1", 00:16:25.705 "is_configured": true, 00:16:25.705 "data_offset": 0, 00:16:25.705 "data_size": 65536 00:16:25.705 }, 00:16:25.705 { 00:16:25.705 "name": "BaseBdev3", 00:16:25.705 "uuid": "e0f2c56b-8747-4b1b-8f3c-65f9bb7bb408", 00:16:25.705 "is_configured": true, 00:16:25.705 "data_offset": 0, 00:16:25.705 "data_size": 65536 00:16:25.705 }, 00:16:25.705 { 00:16:25.705 "name": "BaseBdev4", 00:16:25.705 "uuid": "9bf1bdf6-d433-4550-b84e-7fe0d345458b", 00:16:25.705 "is_configured": true, 00:16:25.705 "data_offset": 0, 00:16:25.705 "data_size": 65536 00:16:25.705 } 00:16:25.705 ] 00:16:25.705 } 00:16:25.705 } 00:16:25.705 }' 00:16:25.705 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:25.705 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:25.705 BaseBdev2 00:16:25.705 BaseBdev3 00:16:25.705 BaseBdev4' 00:16:25.705 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:25.705 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:25.705 17:28:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:25.967 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:25.967 "name": "NewBaseBdev", 00:16:25.967 "aliases": [ 00:16:25.967 "138a90fa-f1ee-4930-8303-8561f0aa3532" 00:16:25.967 ], 00:16:25.967 "product_name": "Malloc disk", 00:16:25.967 "block_size": 512, 00:16:25.967 "num_blocks": 65536, 00:16:25.967 "uuid": "138a90fa-f1ee-4930-8303-8561f0aa3532", 00:16:25.967 "assigned_rate_limits": { 00:16:25.967 "rw_ios_per_sec": 0, 00:16:25.967 "rw_mbytes_per_sec": 0, 00:16:25.967 "r_mbytes_per_sec": 0, 00:16:25.967 "w_mbytes_per_sec": 0 00:16:25.967 }, 00:16:25.967 "claimed": true, 00:16:25.967 "claim_type": "exclusive_write", 00:16:25.967 "zoned": false, 00:16:25.967 "supported_io_types": { 00:16:25.967 "read": true, 00:16:25.967 "write": true, 00:16:25.967 "unmap": true, 00:16:25.967 "flush": true, 00:16:25.967 "reset": true, 00:16:25.967 "nvme_admin": false, 00:16:25.967 "nvme_io": false, 00:16:25.967 "nvme_io_md": false, 00:16:25.967 "write_zeroes": true, 00:16:25.967 "zcopy": true, 00:16:25.967 "get_zone_info": false, 00:16:25.967 "zone_management": false, 00:16:25.967 "zone_append": false, 
00:16:25.967 "compare": false, 00:16:25.967 "compare_and_write": false, 00:16:25.967 "abort": true, 00:16:25.967 "seek_hole": false, 00:16:25.967 "seek_data": false, 00:16:25.967 "copy": true, 00:16:25.967 "nvme_iov_md": false 00:16:25.967 }, 00:16:25.967 "memory_domains": [ 00:16:25.967 { 00:16:25.967 "dma_device_id": "system", 00:16:25.967 "dma_device_type": 1 00:16:25.967 }, 00:16:25.967 { 00:16:25.967 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:25.967 "dma_device_type": 2 00:16:25.967 } 00:16:25.967 ], 00:16:25.967 "driver_specific": {} 00:16:25.967 }' 00:16:25.967 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:25.967 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:25.967 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:25.967 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:26.227 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:26.227 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:26.227 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:26.227 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:26.227 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:26.227 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:26.227 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:26.227 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:26.227 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:26.227 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:26.227 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:26.488 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:26.488 "name": "BaseBdev2", 00:16:26.488 "aliases": [ 00:16:26.488 "435e9b3f-3d3c-42f2-b839-38e40d2e46c1" 00:16:26.488 ], 00:16:26.488 "product_name": "Malloc disk", 00:16:26.488 "block_size": 512, 00:16:26.488 "num_blocks": 65536, 00:16:26.488 "uuid": "435e9b3f-3d3c-42f2-b839-38e40d2e46c1", 00:16:26.488 "assigned_rate_limits": { 00:16:26.488 "rw_ios_per_sec": 0, 00:16:26.488 "rw_mbytes_per_sec": 0, 00:16:26.488 "r_mbytes_per_sec": 0, 00:16:26.488 "w_mbytes_per_sec": 0 00:16:26.488 }, 00:16:26.488 "claimed": true, 00:16:26.488 "claim_type": "exclusive_write", 00:16:26.488 "zoned": false, 00:16:26.488 "supported_io_types": { 00:16:26.488 "read": true, 00:16:26.488 "write": true, 00:16:26.488 "unmap": true, 00:16:26.488 "flush": true, 00:16:26.488 "reset": true, 00:16:26.488 "nvme_admin": false, 00:16:26.488 "nvme_io": false, 00:16:26.488 "nvme_io_md": false, 00:16:26.488 "write_zeroes": true, 00:16:26.488 "zcopy": true, 00:16:26.488 "get_zone_info": false, 00:16:26.488 "zone_management": false, 00:16:26.488 "zone_append": false, 00:16:26.488 "compare": false, 00:16:26.488 "compare_and_write": false, 00:16:26.488 "abort": true, 00:16:26.488 "seek_hole": false, 00:16:26.488 "seek_data": false, 00:16:26.488 
"copy": true, 00:16:26.488 "nvme_iov_md": false 00:16:26.488 }, 00:16:26.488 "memory_domains": [ 00:16:26.488 { 00:16:26.488 "dma_device_id": "system", 00:16:26.488 "dma_device_type": 1 00:16:26.488 }, 00:16:26.488 { 00:16:26.488 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:26.488 "dma_device_type": 2 00:16:26.488 } 00:16:26.488 ], 00:16:26.488 "driver_specific": {} 00:16:26.488 }' 00:16:26.488 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:26.488 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:26.488 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:26.488 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:26.748 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:26.748 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:26.748 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:26.748 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:26.748 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:26.748 17:28:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:26.748 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:26.748 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:27.009 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:27.009 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:27.009 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:27.009 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:27.009 "name": "BaseBdev3", 00:16:27.009 "aliases": [ 00:16:27.009 "e0f2c56b-8747-4b1b-8f3c-65f9bb7bb408" 00:16:27.009 ], 00:16:27.009 "product_name": "Malloc disk", 00:16:27.009 "block_size": 512, 00:16:27.009 "num_blocks": 65536, 00:16:27.009 "uuid": "e0f2c56b-8747-4b1b-8f3c-65f9bb7bb408", 00:16:27.009 "assigned_rate_limits": { 00:16:27.009 "rw_ios_per_sec": 0, 00:16:27.009 "rw_mbytes_per_sec": 0, 00:16:27.009 "r_mbytes_per_sec": 0, 00:16:27.009 "w_mbytes_per_sec": 0 00:16:27.009 }, 00:16:27.009 "claimed": true, 00:16:27.009 "claim_type": "exclusive_write", 00:16:27.009 "zoned": false, 00:16:27.009 "supported_io_types": { 00:16:27.009 "read": true, 00:16:27.009 "write": true, 00:16:27.009 "unmap": true, 00:16:27.009 "flush": true, 00:16:27.009 "reset": true, 00:16:27.009 "nvme_admin": false, 00:16:27.009 "nvme_io": false, 00:16:27.009 "nvme_io_md": false, 00:16:27.009 "write_zeroes": true, 00:16:27.009 "zcopy": true, 00:16:27.009 "get_zone_info": false, 00:16:27.009 "zone_management": false, 00:16:27.009 "zone_append": false, 00:16:27.009 "compare": false, 00:16:27.009 "compare_and_write": false, 00:16:27.009 "abort": true, 00:16:27.009 "seek_hole": false, 00:16:27.009 "seek_data": false, 00:16:27.009 "copy": true, 00:16:27.009 "nvme_iov_md": false 00:16:27.009 }, 00:16:27.009 "memory_domains": [ 00:16:27.009 { 00:16:27.009 "dma_device_id": "system", 00:16:27.009 
"dma_device_type": 1 00:16:27.009 }, 00:16:27.009 { 00:16:27.009 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:27.009 "dma_device_type": 2 00:16:27.009 } 00:16:27.009 ], 00:16:27.009 "driver_specific": {} 00:16:27.009 }' 00:16:27.009 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:27.009 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:27.270 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:27.270 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:27.270 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:27.270 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:27.270 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:27.270 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:27.270 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:27.270 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:27.270 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:27.530 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:27.530 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:27.530 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:27.530 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:27.530 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:27.530 "name": "BaseBdev4", 00:16:27.530 "aliases": [ 00:16:27.530 "9bf1bdf6-d433-4550-b84e-7fe0d345458b" 00:16:27.530 ], 00:16:27.530 "product_name": "Malloc disk", 00:16:27.530 "block_size": 512, 00:16:27.530 "num_blocks": 65536, 00:16:27.530 "uuid": "9bf1bdf6-d433-4550-b84e-7fe0d345458b", 00:16:27.530 "assigned_rate_limits": { 00:16:27.530 "rw_ios_per_sec": 0, 00:16:27.530 "rw_mbytes_per_sec": 0, 00:16:27.530 "r_mbytes_per_sec": 0, 00:16:27.530 "w_mbytes_per_sec": 0 00:16:27.530 }, 00:16:27.530 "claimed": true, 00:16:27.530 "claim_type": "exclusive_write", 00:16:27.530 "zoned": false, 00:16:27.530 "supported_io_types": { 00:16:27.530 "read": true, 00:16:27.530 "write": true, 00:16:27.530 "unmap": true, 00:16:27.530 "flush": true, 00:16:27.530 "reset": true, 00:16:27.530 "nvme_admin": false, 00:16:27.530 "nvme_io": false, 00:16:27.530 "nvme_io_md": false, 00:16:27.530 "write_zeroes": true, 00:16:27.530 "zcopy": true, 00:16:27.530 "get_zone_info": false, 00:16:27.530 "zone_management": false, 00:16:27.530 "zone_append": false, 00:16:27.530 "compare": false, 00:16:27.530 "compare_and_write": false, 00:16:27.530 "abort": true, 00:16:27.530 "seek_hole": false, 00:16:27.530 "seek_data": false, 00:16:27.530 "copy": true, 00:16:27.530 "nvme_iov_md": false 00:16:27.530 }, 00:16:27.530 "memory_domains": [ 00:16:27.530 { 00:16:27.530 "dma_device_id": "system", 00:16:27.530 "dma_device_type": 1 00:16:27.530 }, 00:16:27.530 { 00:16:27.530 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:27.531 "dma_device_type": 2 00:16:27.531 } 00:16:27.531 ], 
00:16:27.531 "driver_specific": {} 00:16:27.531 }' 00:16:27.531 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:27.791 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:27.791 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:27.791 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:27.791 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:27.791 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:27.791 17:28:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:27.791 17:28:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:27.791 17:28:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:27.791 17:28:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:28.052 17:28:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:28.052 17:28:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:28.052 17:28:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:28.052 [2024-07-15 17:28:39.307613] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:28.052 [2024-07-15 17:28:39.307640] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:28.052 [2024-07-15 17:28:39.307679] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:28.052 [2024-07-15 17:28:39.307727] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:28.052 [2024-07-15 17:28:39.307734] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbd8ba0 name Existed_Raid, state offline 00:16:28.052 17:28:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2807077 00:16:28.052 17:28:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2807077 ']' 00:16:28.052 17:28:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2807077 00:16:28.052 17:28:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:16:28.052 17:28:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:28.052 17:28:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2807077 00:16:28.312 17:28:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:28.312 17:28:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:28.312 17:28:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2807077' 00:16:28.312 killing process with pid 2807077 00:16:28.312 17:28:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2807077 00:16:28.312 [2024-07-15 17:28:39.376619] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:28.312 17:28:39 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@972 -- # wait 2807077 00:16:28.312 [2024-07-15 17:28:39.397181] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:28.312 17:28:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:16:28.312 00:16:28.312 real 0m27.163s 00:16:28.312 user 0m51.030s 00:16:28.312 sys 0m3.975s 00:16:28.312 17:28:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:28.312 17:28:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:28.312 ************************************ 00:16:28.312 END TEST raid_state_function_test 00:16:28.312 ************************************ 00:16:28.312 17:28:39 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:28.312 17:28:39 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:16:28.312 17:28:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:28.312 17:28:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:28.312 17:28:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:28.312 ************************************ 00:16:28.312 START TEST raid_state_function_test_sb 00:16:28.312 ************************************ 00:16:28.312 17:28:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 true 00:16:28.312 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 
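For readers following the trace, the superblock variant of the test that starts here drives the same RPC flow as the test that just finished, only with -s passed to bdev_raid_create. A condensed sketch of that flow is given below; every command line is lifted from the trace itself, while the RPC shell variable, the for-loop, and the backgrounding of bdev_svc are illustrative shorthand rather than the exact test-script logic.

# start the stub application the test talks to over the raid RPC socket
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc \
    -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &

RPC='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock'

# four malloc bdevs (32 MB, 512 B blocks, i.e. 65536 blocks each) serve as base devices
for b in BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4; do
    $RPC bdev_malloc_create 32 512 -b "$b"
done

# assemble them into a raid0 volume with a 64 KiB strip and (-s) an on-disk superblock
$RPC bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid

# tear the volume down again once the state checks are done
$RPC bdev_raid_delete Existed_Raid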
00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2812334 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2812334' 00:16:28.313 Process raid pid: 2812334 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2812334 /var/tmp/spdk-raid.sock 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2812334 ']' 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:28.313 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:28.313 17:28:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:28.573 [2024-07-15 17:28:39.656776] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
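Once bdev_svc is listening on /var/tmp/spdk-raid.sock, the verify_raid_bdev_state and verify_raid_bdev_properties checks that follow are all jq queries against RPC output. A minimal sketch of that query pattern, using only subcommands and filters visible in this trace (collapsing the per-field checks into a single filter is my own shorthand):

RPC='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock'

# raid-level view: state stays "configuring" until all base bdevs are claimed, then goes "online"
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'

# names of the base bdevs that are already configured inside the raid volume
$RPC bdev_get_bdevs -b Existed_Raid | jq '.[]' \
    | jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'

# per-base-bdev sanity checks mirrored from the test: block size and metadata layout
$RPC bdev_get_bdevs -b BaseBdev1 | jq '.[] | .block_size, .md_size, .md_interleave, .dif_type'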
00:16:28.573 [2024-07-15 17:28:39.656825] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:28.573 [2024-07-15 17:28:39.745592] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:28.573 [2024-07-15 17:28:39.808934] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:28.573 [2024-07-15 17:28:39.850690] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:28.573 [2024-07-15 17:28:39.850717] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:29.513 17:28:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:29.513 17:28:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:16:29.513 17:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:29.513 [2024-07-15 17:28:40.665727] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:29.513 [2024-07-15 17:28:40.665758] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:29.513 [2024-07-15 17:28:40.665764] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:29.513 [2024-07-15 17:28:40.665770] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:29.513 [2024-07-15 17:28:40.665775] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:29.513 [2024-07-15 17:28:40.665780] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:29.513 [2024-07-15 17:28:40.665785] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:29.513 [2024-07-15 17:28:40.665790] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:29.513 17:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:29.513 17:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:29.513 17:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:29.513 17:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:29.513 17:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:29.513 17:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:29.513 17:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:29.513 17:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:29.513 17:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:29.513 17:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:29.513 17:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:29.513 17:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:29.774 17:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:29.774 "name": "Existed_Raid", 00:16:29.774 "uuid": "bd6cd835-15fa-42b7-86f1-70be03f8844f", 00:16:29.774 "strip_size_kb": 64, 00:16:29.774 "state": "configuring", 00:16:29.774 "raid_level": "raid0", 00:16:29.774 "superblock": true, 00:16:29.774 "num_base_bdevs": 4, 00:16:29.774 "num_base_bdevs_discovered": 0, 00:16:29.774 "num_base_bdevs_operational": 4, 00:16:29.774 "base_bdevs_list": [ 00:16:29.774 { 00:16:29.774 "name": "BaseBdev1", 00:16:29.774 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:29.774 "is_configured": false, 00:16:29.774 "data_offset": 0, 00:16:29.774 "data_size": 0 00:16:29.774 }, 00:16:29.774 { 00:16:29.774 "name": "BaseBdev2", 00:16:29.774 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:29.774 "is_configured": false, 00:16:29.774 "data_offset": 0, 00:16:29.774 "data_size": 0 00:16:29.774 }, 00:16:29.774 { 00:16:29.774 "name": "BaseBdev3", 00:16:29.774 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:29.774 "is_configured": false, 00:16:29.774 "data_offset": 0, 00:16:29.774 "data_size": 0 00:16:29.774 }, 00:16:29.775 { 00:16:29.775 "name": "BaseBdev4", 00:16:29.775 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:29.775 "is_configured": false, 00:16:29.775 "data_offset": 0, 00:16:29.775 "data_size": 0 00:16:29.775 } 00:16:29.775 ] 00:16:29.775 }' 00:16:29.775 17:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:29.775 17:28:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:30.346 17:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:30.346 [2024-07-15 17:28:41.547845] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:30.346 [2024-07-15 17:28:41.547865] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15a56f0 name Existed_Raid, state configuring 00:16:30.346 17:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:30.606 [2024-07-15 17:28:41.744363] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:30.606 [2024-07-15 17:28:41.744379] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:30.606 [2024-07-15 17:28:41.744384] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:30.606 [2024-07-15 17:28:41.744390] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:30.606 [2024-07-15 17:28:41.744394] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:30.606 [2024-07-15 17:28:41.744400] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:30.606 [2024-07-15 17:28:41.744405] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 
00:16:30.606 [2024-07-15 17:28:41.744410] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:30.606 17:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:30.866 [2024-07-15 17:28:41.911342] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:30.866 BaseBdev1 00:16:30.866 17:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:30.866 17:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:30.866 17:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:30.866 17:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:30.866 17:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:30.866 17:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:30.866 17:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:30.866 17:28:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:31.126 [ 00:16:31.126 { 00:16:31.126 "name": "BaseBdev1", 00:16:31.126 "aliases": [ 00:16:31.126 "7c560616-a82c-4194-ae3e-cd524bd53ba6" 00:16:31.126 ], 00:16:31.126 "product_name": "Malloc disk", 00:16:31.126 "block_size": 512, 00:16:31.126 "num_blocks": 65536, 00:16:31.126 "uuid": "7c560616-a82c-4194-ae3e-cd524bd53ba6", 00:16:31.126 "assigned_rate_limits": { 00:16:31.126 "rw_ios_per_sec": 0, 00:16:31.126 "rw_mbytes_per_sec": 0, 00:16:31.126 "r_mbytes_per_sec": 0, 00:16:31.126 "w_mbytes_per_sec": 0 00:16:31.126 }, 00:16:31.126 "claimed": true, 00:16:31.126 "claim_type": "exclusive_write", 00:16:31.127 "zoned": false, 00:16:31.127 "supported_io_types": { 00:16:31.127 "read": true, 00:16:31.127 "write": true, 00:16:31.127 "unmap": true, 00:16:31.127 "flush": true, 00:16:31.127 "reset": true, 00:16:31.127 "nvme_admin": false, 00:16:31.127 "nvme_io": false, 00:16:31.127 "nvme_io_md": false, 00:16:31.127 "write_zeroes": true, 00:16:31.127 "zcopy": true, 00:16:31.127 "get_zone_info": false, 00:16:31.127 "zone_management": false, 00:16:31.127 "zone_append": false, 00:16:31.127 "compare": false, 00:16:31.127 "compare_and_write": false, 00:16:31.127 "abort": true, 00:16:31.127 "seek_hole": false, 00:16:31.127 "seek_data": false, 00:16:31.127 "copy": true, 00:16:31.127 "nvme_iov_md": false 00:16:31.127 }, 00:16:31.127 "memory_domains": [ 00:16:31.127 { 00:16:31.127 "dma_device_id": "system", 00:16:31.127 "dma_device_type": 1 00:16:31.127 }, 00:16:31.127 { 00:16:31.127 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:31.127 "dma_device_type": 2 00:16:31.127 } 00:16:31.127 ], 00:16:31.127 "driver_specific": {} 00:16:31.127 } 00:16:31.127 ] 00:16:31.127 17:28:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:31.127 17:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:31.127 
17:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:31.127 17:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:31.127 17:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:31.127 17:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:31.127 17:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:31.127 17:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:31.127 17:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:31.127 17:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:31.127 17:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:31.127 17:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:31.127 17:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:31.386 17:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:31.386 "name": "Existed_Raid", 00:16:31.386 "uuid": "9d8043c1-5409-480a-9ee8-a47e7d9e4c84", 00:16:31.386 "strip_size_kb": 64, 00:16:31.386 "state": "configuring", 00:16:31.386 "raid_level": "raid0", 00:16:31.386 "superblock": true, 00:16:31.386 "num_base_bdevs": 4, 00:16:31.386 "num_base_bdevs_discovered": 1, 00:16:31.386 "num_base_bdevs_operational": 4, 00:16:31.386 "base_bdevs_list": [ 00:16:31.386 { 00:16:31.386 "name": "BaseBdev1", 00:16:31.386 "uuid": "7c560616-a82c-4194-ae3e-cd524bd53ba6", 00:16:31.386 "is_configured": true, 00:16:31.386 "data_offset": 2048, 00:16:31.386 "data_size": 63488 00:16:31.386 }, 00:16:31.386 { 00:16:31.386 "name": "BaseBdev2", 00:16:31.386 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:31.386 "is_configured": false, 00:16:31.386 "data_offset": 0, 00:16:31.386 "data_size": 0 00:16:31.386 }, 00:16:31.386 { 00:16:31.386 "name": "BaseBdev3", 00:16:31.386 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:31.386 "is_configured": false, 00:16:31.386 "data_offset": 0, 00:16:31.386 "data_size": 0 00:16:31.386 }, 00:16:31.386 { 00:16:31.386 "name": "BaseBdev4", 00:16:31.386 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:31.386 "is_configured": false, 00:16:31.386 "data_offset": 0, 00:16:31.386 "data_size": 0 00:16:31.386 } 00:16:31.386 ] 00:16:31.386 }' 00:16:31.386 17:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:31.386 17:28:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:31.956 17:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:31.956 [2024-07-15 17:28:43.226659] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:31.956 [2024-07-15 17:28:43.226683] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15a4f60 name Existed_Raid, state configuring 00:16:31.956 17:28:43 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:32.216 [2024-07-15 17:28:43.419182] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:32.216 [2024-07-15 17:28:43.420264] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:32.216 [2024-07-15 17:28:43.420287] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:32.216 [2024-07-15 17:28:43.420292] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:32.216 [2024-07-15 17:28:43.420298] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:32.216 [2024-07-15 17:28:43.420303] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:32.216 [2024-07-15 17:28:43.420308] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:32.216 17:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:32.216 17:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:32.216 17:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:32.216 17:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:32.216 17:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:32.216 17:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:32.216 17:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:32.216 17:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:32.216 17:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:32.216 17:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:32.216 17:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:32.216 17:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:32.216 17:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:32.216 17:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:32.476 17:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:32.476 "name": "Existed_Raid", 00:16:32.476 "uuid": "d4685d26-2af0-4801-9b6e-42b75d101327", 00:16:32.476 "strip_size_kb": 64, 00:16:32.476 "state": "configuring", 00:16:32.476 "raid_level": "raid0", 00:16:32.476 "superblock": true, 00:16:32.476 "num_base_bdevs": 4, 00:16:32.476 "num_base_bdevs_discovered": 1, 00:16:32.476 "num_base_bdevs_operational": 4, 00:16:32.476 "base_bdevs_list": [ 00:16:32.476 { 00:16:32.476 "name": "BaseBdev1", 00:16:32.476 "uuid": "7c560616-a82c-4194-ae3e-cd524bd53ba6", 00:16:32.476 "is_configured": true, 00:16:32.476 "data_offset": 2048, 
00:16:32.476 "data_size": 63488 00:16:32.476 }, 00:16:32.476 { 00:16:32.476 "name": "BaseBdev2", 00:16:32.476 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.476 "is_configured": false, 00:16:32.476 "data_offset": 0, 00:16:32.476 "data_size": 0 00:16:32.476 }, 00:16:32.476 { 00:16:32.476 "name": "BaseBdev3", 00:16:32.476 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.476 "is_configured": false, 00:16:32.476 "data_offset": 0, 00:16:32.476 "data_size": 0 00:16:32.476 }, 00:16:32.476 { 00:16:32.476 "name": "BaseBdev4", 00:16:32.476 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.476 "is_configured": false, 00:16:32.476 "data_offset": 0, 00:16:32.476 "data_size": 0 00:16:32.476 } 00:16:32.476 ] 00:16:32.476 }' 00:16:32.476 17:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:32.476 17:28:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:33.044 17:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:33.303 [2024-07-15 17:28:44.438756] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:33.303 BaseBdev2 00:16:33.303 17:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:33.303 17:28:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:33.303 17:28:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:33.303 17:28:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:33.303 17:28:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:33.303 17:28:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:33.303 17:28:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:33.562 17:28:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:33.562 [ 00:16:33.562 { 00:16:33.562 "name": "BaseBdev2", 00:16:33.562 "aliases": [ 00:16:33.562 "04d4df6e-2dc3-49b9-8af7-2a0e36d7c9e6" 00:16:33.562 ], 00:16:33.562 "product_name": "Malloc disk", 00:16:33.562 "block_size": 512, 00:16:33.562 "num_blocks": 65536, 00:16:33.562 "uuid": "04d4df6e-2dc3-49b9-8af7-2a0e36d7c9e6", 00:16:33.562 "assigned_rate_limits": { 00:16:33.562 "rw_ios_per_sec": 0, 00:16:33.562 "rw_mbytes_per_sec": 0, 00:16:33.562 "r_mbytes_per_sec": 0, 00:16:33.562 "w_mbytes_per_sec": 0 00:16:33.562 }, 00:16:33.562 "claimed": true, 00:16:33.562 "claim_type": "exclusive_write", 00:16:33.562 "zoned": false, 00:16:33.562 "supported_io_types": { 00:16:33.562 "read": true, 00:16:33.562 "write": true, 00:16:33.562 "unmap": true, 00:16:33.562 "flush": true, 00:16:33.562 "reset": true, 00:16:33.562 "nvme_admin": false, 00:16:33.562 "nvme_io": false, 00:16:33.562 "nvme_io_md": false, 00:16:33.562 "write_zeroes": true, 00:16:33.562 "zcopy": true, 00:16:33.562 "get_zone_info": false, 00:16:33.562 "zone_management": false, 00:16:33.562 "zone_append": false, 00:16:33.562 "compare": false, 
00:16:33.562 "compare_and_write": false, 00:16:33.562 "abort": true, 00:16:33.562 "seek_hole": false, 00:16:33.562 "seek_data": false, 00:16:33.562 "copy": true, 00:16:33.562 "nvme_iov_md": false 00:16:33.562 }, 00:16:33.562 "memory_domains": [ 00:16:33.562 { 00:16:33.562 "dma_device_id": "system", 00:16:33.562 "dma_device_type": 1 00:16:33.562 }, 00:16:33.562 { 00:16:33.562 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:33.562 "dma_device_type": 2 00:16:33.562 } 00:16:33.562 ], 00:16:33.562 "driver_specific": {} 00:16:33.562 } 00:16:33.562 ] 00:16:33.562 17:28:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:33.562 17:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:33.562 17:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:33.562 17:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:33.562 17:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:33.562 17:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:33.562 17:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:33.562 17:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:33.562 17:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:33.562 17:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:33.562 17:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:33.562 17:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:33.562 17:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:33.562 17:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:33.562 17:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:34.130 17:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:34.130 "name": "Existed_Raid", 00:16:34.130 "uuid": "d4685d26-2af0-4801-9b6e-42b75d101327", 00:16:34.130 "strip_size_kb": 64, 00:16:34.130 "state": "configuring", 00:16:34.130 "raid_level": "raid0", 00:16:34.130 "superblock": true, 00:16:34.130 "num_base_bdevs": 4, 00:16:34.130 "num_base_bdevs_discovered": 2, 00:16:34.130 "num_base_bdevs_operational": 4, 00:16:34.130 "base_bdevs_list": [ 00:16:34.130 { 00:16:34.130 "name": "BaseBdev1", 00:16:34.130 "uuid": "7c560616-a82c-4194-ae3e-cd524bd53ba6", 00:16:34.130 "is_configured": true, 00:16:34.130 "data_offset": 2048, 00:16:34.130 "data_size": 63488 00:16:34.130 }, 00:16:34.130 { 00:16:34.130 "name": "BaseBdev2", 00:16:34.130 "uuid": "04d4df6e-2dc3-49b9-8af7-2a0e36d7c9e6", 00:16:34.130 "is_configured": true, 00:16:34.130 "data_offset": 2048, 00:16:34.130 "data_size": 63488 00:16:34.130 }, 00:16:34.130 { 00:16:34.130 "name": "BaseBdev3", 00:16:34.130 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:34.130 "is_configured": false, 00:16:34.130 "data_offset": 0, 00:16:34.130 
"data_size": 0 00:16:34.130 }, 00:16:34.130 { 00:16:34.130 "name": "BaseBdev4", 00:16:34.130 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:34.130 "is_configured": false, 00:16:34.130 "data_offset": 0, 00:16:34.130 "data_size": 0 00:16:34.130 } 00:16:34.130 ] 00:16:34.130 }' 00:16:34.130 17:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:34.130 17:28:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:34.699 17:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:34.959 [2024-07-15 17:28:46.160173] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:34.959 BaseBdev3 00:16:34.959 17:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:34.959 17:28:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:34.959 17:28:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:34.959 17:28:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:34.959 17:28:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:34.959 17:28:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:34.959 17:28:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:35.218 17:28:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:35.478 [ 00:16:35.478 { 00:16:35.478 "name": "BaseBdev3", 00:16:35.478 "aliases": [ 00:16:35.478 "5ca43782-1673-489b-9300-2ce7f967f245" 00:16:35.478 ], 00:16:35.478 "product_name": "Malloc disk", 00:16:35.478 "block_size": 512, 00:16:35.478 "num_blocks": 65536, 00:16:35.478 "uuid": "5ca43782-1673-489b-9300-2ce7f967f245", 00:16:35.478 "assigned_rate_limits": { 00:16:35.478 "rw_ios_per_sec": 0, 00:16:35.478 "rw_mbytes_per_sec": 0, 00:16:35.478 "r_mbytes_per_sec": 0, 00:16:35.479 "w_mbytes_per_sec": 0 00:16:35.479 }, 00:16:35.479 "claimed": true, 00:16:35.479 "claim_type": "exclusive_write", 00:16:35.479 "zoned": false, 00:16:35.479 "supported_io_types": { 00:16:35.479 "read": true, 00:16:35.479 "write": true, 00:16:35.479 "unmap": true, 00:16:35.479 "flush": true, 00:16:35.479 "reset": true, 00:16:35.479 "nvme_admin": false, 00:16:35.479 "nvme_io": false, 00:16:35.479 "nvme_io_md": false, 00:16:35.479 "write_zeroes": true, 00:16:35.479 "zcopy": true, 00:16:35.479 "get_zone_info": false, 00:16:35.479 "zone_management": false, 00:16:35.479 "zone_append": false, 00:16:35.479 "compare": false, 00:16:35.479 "compare_and_write": false, 00:16:35.479 "abort": true, 00:16:35.479 "seek_hole": false, 00:16:35.479 "seek_data": false, 00:16:35.479 "copy": true, 00:16:35.479 "nvme_iov_md": false 00:16:35.479 }, 00:16:35.479 "memory_domains": [ 00:16:35.479 { 00:16:35.479 "dma_device_id": "system", 00:16:35.479 "dma_device_type": 1 00:16:35.479 }, 00:16:35.479 { 00:16:35.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:35.479 "dma_device_type": 2 
00:16:35.479 } 00:16:35.479 ], 00:16:35.479 "driver_specific": {} 00:16:35.479 } 00:16:35.479 ] 00:16:35.479 17:28:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:35.479 17:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:35.479 17:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:35.479 17:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:35.479 17:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:35.479 17:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:35.479 17:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:35.479 17:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:35.479 17:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:35.479 17:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:35.479 17:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:35.479 17:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:35.479 17:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:35.479 17:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:35.479 17:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:35.479 17:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:35.479 "name": "Existed_Raid", 00:16:35.479 "uuid": "d4685d26-2af0-4801-9b6e-42b75d101327", 00:16:35.479 "strip_size_kb": 64, 00:16:35.479 "state": "configuring", 00:16:35.479 "raid_level": "raid0", 00:16:35.479 "superblock": true, 00:16:35.479 "num_base_bdevs": 4, 00:16:35.479 "num_base_bdevs_discovered": 3, 00:16:35.479 "num_base_bdevs_operational": 4, 00:16:35.479 "base_bdevs_list": [ 00:16:35.479 { 00:16:35.479 "name": "BaseBdev1", 00:16:35.479 "uuid": "7c560616-a82c-4194-ae3e-cd524bd53ba6", 00:16:35.479 "is_configured": true, 00:16:35.479 "data_offset": 2048, 00:16:35.479 "data_size": 63488 00:16:35.479 }, 00:16:35.479 { 00:16:35.479 "name": "BaseBdev2", 00:16:35.479 "uuid": "04d4df6e-2dc3-49b9-8af7-2a0e36d7c9e6", 00:16:35.479 "is_configured": true, 00:16:35.479 "data_offset": 2048, 00:16:35.479 "data_size": 63488 00:16:35.479 }, 00:16:35.479 { 00:16:35.479 "name": "BaseBdev3", 00:16:35.479 "uuid": "5ca43782-1673-489b-9300-2ce7f967f245", 00:16:35.479 "is_configured": true, 00:16:35.479 "data_offset": 2048, 00:16:35.479 "data_size": 63488 00:16:35.479 }, 00:16:35.479 { 00:16:35.479 "name": "BaseBdev4", 00:16:35.479 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:35.479 "is_configured": false, 00:16:35.479 "data_offset": 0, 00:16:35.479 "data_size": 0 00:16:35.479 } 00:16:35.479 ] 00:16:35.479 }' 00:16:35.479 17:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:35.479 17:28:46 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:16:36.050 17:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:36.310 [2024-07-15 17:28:47.456489] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:36.310 [2024-07-15 17:28:47.456614] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x15a5fc0 00:16:36.310 [2024-07-15 17:28:47.456623] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:36.310 [2024-07-15 17:28:47.456778] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15a5c00 00:16:36.310 [2024-07-15 17:28:47.456875] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15a5fc0 00:16:36.310 [2024-07-15 17:28:47.456881] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x15a5fc0 00:16:36.310 [2024-07-15 17:28:47.456951] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:36.310 BaseBdev4 00:16:36.310 17:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:16:36.310 17:28:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:16:36.310 17:28:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:36.310 17:28:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:36.310 17:28:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:36.310 17:28:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:36.310 17:28:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:36.571 17:28:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:36.571 [ 00:16:36.571 { 00:16:36.571 "name": "BaseBdev4", 00:16:36.571 "aliases": [ 00:16:36.571 "f5b23342-a2e5-4a74-900e-1125c5d3c2d4" 00:16:36.571 ], 00:16:36.571 "product_name": "Malloc disk", 00:16:36.571 "block_size": 512, 00:16:36.571 "num_blocks": 65536, 00:16:36.571 "uuid": "f5b23342-a2e5-4a74-900e-1125c5d3c2d4", 00:16:36.571 "assigned_rate_limits": { 00:16:36.571 "rw_ios_per_sec": 0, 00:16:36.571 "rw_mbytes_per_sec": 0, 00:16:36.571 "r_mbytes_per_sec": 0, 00:16:36.571 "w_mbytes_per_sec": 0 00:16:36.571 }, 00:16:36.571 "claimed": true, 00:16:36.571 "claim_type": "exclusive_write", 00:16:36.571 "zoned": false, 00:16:36.571 "supported_io_types": { 00:16:36.571 "read": true, 00:16:36.571 "write": true, 00:16:36.571 "unmap": true, 00:16:36.571 "flush": true, 00:16:36.571 "reset": true, 00:16:36.571 "nvme_admin": false, 00:16:36.571 "nvme_io": false, 00:16:36.571 "nvme_io_md": false, 00:16:36.571 "write_zeroes": true, 00:16:36.571 "zcopy": true, 00:16:36.571 "get_zone_info": false, 00:16:36.571 "zone_management": false, 00:16:36.571 "zone_append": false, 00:16:36.571 "compare": false, 00:16:36.571 "compare_and_write": false, 00:16:36.571 "abort": true, 00:16:36.571 "seek_hole": false, 00:16:36.571 "seek_data": false, 00:16:36.571 "copy": 
true, 00:16:36.571 "nvme_iov_md": false 00:16:36.571 }, 00:16:36.571 "memory_domains": [ 00:16:36.571 { 00:16:36.571 "dma_device_id": "system", 00:16:36.571 "dma_device_type": 1 00:16:36.571 }, 00:16:36.571 { 00:16:36.571 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:36.571 "dma_device_type": 2 00:16:36.571 } 00:16:36.571 ], 00:16:36.571 "driver_specific": {} 00:16:36.571 } 00:16:36.571 ] 00:16:36.571 17:28:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:36.571 17:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:36.571 17:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:36.571 17:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:16:36.571 17:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:36.571 17:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:36.571 17:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:36.571 17:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:36.571 17:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:36.571 17:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:36.571 17:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:36.571 17:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:36.571 17:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:36.571 17:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:36.571 17:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:36.832 17:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:36.832 "name": "Existed_Raid", 00:16:36.832 "uuid": "d4685d26-2af0-4801-9b6e-42b75d101327", 00:16:36.832 "strip_size_kb": 64, 00:16:36.832 "state": "online", 00:16:36.832 "raid_level": "raid0", 00:16:36.832 "superblock": true, 00:16:36.832 "num_base_bdevs": 4, 00:16:36.832 "num_base_bdevs_discovered": 4, 00:16:36.832 "num_base_bdevs_operational": 4, 00:16:36.832 "base_bdevs_list": [ 00:16:36.832 { 00:16:36.832 "name": "BaseBdev1", 00:16:36.832 "uuid": "7c560616-a82c-4194-ae3e-cd524bd53ba6", 00:16:36.832 "is_configured": true, 00:16:36.832 "data_offset": 2048, 00:16:36.832 "data_size": 63488 00:16:36.832 }, 00:16:36.832 { 00:16:36.832 "name": "BaseBdev2", 00:16:36.832 "uuid": "04d4df6e-2dc3-49b9-8af7-2a0e36d7c9e6", 00:16:36.832 "is_configured": true, 00:16:36.832 "data_offset": 2048, 00:16:36.832 "data_size": 63488 00:16:36.832 }, 00:16:36.832 { 00:16:36.832 "name": "BaseBdev3", 00:16:36.832 "uuid": "5ca43782-1673-489b-9300-2ce7f967f245", 00:16:36.832 "is_configured": true, 00:16:36.832 "data_offset": 2048, 00:16:36.832 "data_size": 63488 00:16:36.832 }, 00:16:36.832 { 00:16:36.832 "name": "BaseBdev4", 00:16:36.832 "uuid": "f5b23342-a2e5-4a74-900e-1125c5d3c2d4", 00:16:36.832 
"is_configured": true, 00:16:36.832 "data_offset": 2048, 00:16:36.832 "data_size": 63488 00:16:36.832 } 00:16:36.832 ] 00:16:36.832 }' 00:16:36.832 17:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:36.832 17:28:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:37.402 17:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:37.402 17:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:37.402 17:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:37.402 17:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:37.402 17:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:37.402 17:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:37.402 17:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:37.402 17:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:37.662 [2024-07-15 17:28:48.784095] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:37.662 17:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:37.662 "name": "Existed_Raid", 00:16:37.662 "aliases": [ 00:16:37.662 "d4685d26-2af0-4801-9b6e-42b75d101327" 00:16:37.662 ], 00:16:37.662 "product_name": "Raid Volume", 00:16:37.662 "block_size": 512, 00:16:37.662 "num_blocks": 253952, 00:16:37.662 "uuid": "d4685d26-2af0-4801-9b6e-42b75d101327", 00:16:37.662 "assigned_rate_limits": { 00:16:37.662 "rw_ios_per_sec": 0, 00:16:37.662 "rw_mbytes_per_sec": 0, 00:16:37.662 "r_mbytes_per_sec": 0, 00:16:37.662 "w_mbytes_per_sec": 0 00:16:37.662 }, 00:16:37.662 "claimed": false, 00:16:37.662 "zoned": false, 00:16:37.662 "supported_io_types": { 00:16:37.662 "read": true, 00:16:37.662 "write": true, 00:16:37.662 "unmap": true, 00:16:37.662 "flush": true, 00:16:37.662 "reset": true, 00:16:37.662 "nvme_admin": false, 00:16:37.662 "nvme_io": false, 00:16:37.662 "nvme_io_md": false, 00:16:37.662 "write_zeroes": true, 00:16:37.662 "zcopy": false, 00:16:37.662 "get_zone_info": false, 00:16:37.662 "zone_management": false, 00:16:37.662 "zone_append": false, 00:16:37.662 "compare": false, 00:16:37.662 "compare_and_write": false, 00:16:37.662 "abort": false, 00:16:37.662 "seek_hole": false, 00:16:37.662 "seek_data": false, 00:16:37.662 "copy": false, 00:16:37.662 "nvme_iov_md": false 00:16:37.662 }, 00:16:37.662 "memory_domains": [ 00:16:37.662 { 00:16:37.662 "dma_device_id": "system", 00:16:37.662 "dma_device_type": 1 00:16:37.662 }, 00:16:37.662 { 00:16:37.662 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:37.662 "dma_device_type": 2 00:16:37.662 }, 00:16:37.662 { 00:16:37.662 "dma_device_id": "system", 00:16:37.662 "dma_device_type": 1 00:16:37.662 }, 00:16:37.662 { 00:16:37.662 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:37.662 "dma_device_type": 2 00:16:37.662 }, 00:16:37.662 { 00:16:37.662 "dma_device_id": "system", 00:16:37.662 "dma_device_type": 1 00:16:37.662 }, 00:16:37.662 { 00:16:37.662 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:37.662 "dma_device_type": 2 00:16:37.662 }, 00:16:37.662 { 
00:16:37.662 "dma_device_id": "system", 00:16:37.662 "dma_device_type": 1 00:16:37.662 }, 00:16:37.662 { 00:16:37.662 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:37.662 "dma_device_type": 2 00:16:37.662 } 00:16:37.662 ], 00:16:37.662 "driver_specific": { 00:16:37.662 "raid": { 00:16:37.662 "uuid": "d4685d26-2af0-4801-9b6e-42b75d101327", 00:16:37.662 "strip_size_kb": 64, 00:16:37.662 "state": "online", 00:16:37.662 "raid_level": "raid0", 00:16:37.662 "superblock": true, 00:16:37.662 "num_base_bdevs": 4, 00:16:37.662 "num_base_bdevs_discovered": 4, 00:16:37.662 "num_base_bdevs_operational": 4, 00:16:37.662 "base_bdevs_list": [ 00:16:37.662 { 00:16:37.662 "name": "BaseBdev1", 00:16:37.662 "uuid": "7c560616-a82c-4194-ae3e-cd524bd53ba6", 00:16:37.662 "is_configured": true, 00:16:37.662 "data_offset": 2048, 00:16:37.662 "data_size": 63488 00:16:37.662 }, 00:16:37.662 { 00:16:37.662 "name": "BaseBdev2", 00:16:37.662 "uuid": "04d4df6e-2dc3-49b9-8af7-2a0e36d7c9e6", 00:16:37.662 "is_configured": true, 00:16:37.662 "data_offset": 2048, 00:16:37.663 "data_size": 63488 00:16:37.663 }, 00:16:37.663 { 00:16:37.663 "name": "BaseBdev3", 00:16:37.663 "uuid": "5ca43782-1673-489b-9300-2ce7f967f245", 00:16:37.663 "is_configured": true, 00:16:37.663 "data_offset": 2048, 00:16:37.663 "data_size": 63488 00:16:37.663 }, 00:16:37.663 { 00:16:37.663 "name": "BaseBdev4", 00:16:37.663 "uuid": "f5b23342-a2e5-4a74-900e-1125c5d3c2d4", 00:16:37.663 "is_configured": true, 00:16:37.663 "data_offset": 2048, 00:16:37.663 "data_size": 63488 00:16:37.663 } 00:16:37.663 ] 00:16:37.663 } 00:16:37.663 } 00:16:37.663 }' 00:16:37.663 17:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:37.663 17:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:37.663 BaseBdev2 00:16:37.663 BaseBdev3 00:16:37.663 BaseBdev4' 00:16:37.663 17:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:37.663 17:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:37.663 17:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:37.923 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:37.923 "name": "BaseBdev1", 00:16:37.923 "aliases": [ 00:16:37.923 "7c560616-a82c-4194-ae3e-cd524bd53ba6" 00:16:37.923 ], 00:16:37.923 "product_name": "Malloc disk", 00:16:37.923 "block_size": 512, 00:16:37.923 "num_blocks": 65536, 00:16:37.923 "uuid": "7c560616-a82c-4194-ae3e-cd524bd53ba6", 00:16:37.923 "assigned_rate_limits": { 00:16:37.923 "rw_ios_per_sec": 0, 00:16:37.923 "rw_mbytes_per_sec": 0, 00:16:37.923 "r_mbytes_per_sec": 0, 00:16:37.923 "w_mbytes_per_sec": 0 00:16:37.923 }, 00:16:37.923 "claimed": true, 00:16:37.923 "claim_type": "exclusive_write", 00:16:37.923 "zoned": false, 00:16:37.923 "supported_io_types": { 00:16:37.923 "read": true, 00:16:37.923 "write": true, 00:16:37.923 "unmap": true, 00:16:37.923 "flush": true, 00:16:37.923 "reset": true, 00:16:37.923 "nvme_admin": false, 00:16:37.923 "nvme_io": false, 00:16:37.923 "nvme_io_md": false, 00:16:37.923 "write_zeroes": true, 00:16:37.923 "zcopy": true, 00:16:37.923 "get_zone_info": false, 00:16:37.923 "zone_management": false, 00:16:37.923 "zone_append": 
false, 00:16:37.923 "compare": false, 00:16:37.923 "compare_and_write": false, 00:16:37.923 "abort": true, 00:16:37.923 "seek_hole": false, 00:16:37.923 "seek_data": false, 00:16:37.923 "copy": true, 00:16:37.923 "nvme_iov_md": false 00:16:37.923 }, 00:16:37.923 "memory_domains": [ 00:16:37.923 { 00:16:37.923 "dma_device_id": "system", 00:16:37.923 "dma_device_type": 1 00:16:37.923 }, 00:16:37.923 { 00:16:37.923 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:37.923 "dma_device_type": 2 00:16:37.923 } 00:16:37.923 ], 00:16:37.923 "driver_specific": {} 00:16:37.923 }' 00:16:37.923 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:37.923 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:37.923 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:37.923 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:37.923 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:37.923 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:37.923 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:38.188 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:38.188 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:38.188 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:38.188 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:38.188 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:38.188 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:38.188 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:38.188 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:38.475 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:38.475 "name": "BaseBdev2", 00:16:38.475 "aliases": [ 00:16:38.475 "04d4df6e-2dc3-49b9-8af7-2a0e36d7c9e6" 00:16:38.475 ], 00:16:38.475 "product_name": "Malloc disk", 00:16:38.475 "block_size": 512, 00:16:38.475 "num_blocks": 65536, 00:16:38.475 "uuid": "04d4df6e-2dc3-49b9-8af7-2a0e36d7c9e6", 00:16:38.475 "assigned_rate_limits": { 00:16:38.475 "rw_ios_per_sec": 0, 00:16:38.475 "rw_mbytes_per_sec": 0, 00:16:38.475 "r_mbytes_per_sec": 0, 00:16:38.475 "w_mbytes_per_sec": 0 00:16:38.475 }, 00:16:38.475 "claimed": true, 00:16:38.475 "claim_type": "exclusive_write", 00:16:38.475 "zoned": false, 00:16:38.475 "supported_io_types": { 00:16:38.475 "read": true, 00:16:38.475 "write": true, 00:16:38.475 "unmap": true, 00:16:38.475 "flush": true, 00:16:38.475 "reset": true, 00:16:38.475 "nvme_admin": false, 00:16:38.475 "nvme_io": false, 00:16:38.475 "nvme_io_md": false, 00:16:38.475 "write_zeroes": true, 00:16:38.475 "zcopy": true, 00:16:38.475 "get_zone_info": false, 00:16:38.475 "zone_management": false, 00:16:38.475 "zone_append": false, 00:16:38.475 "compare": false, 00:16:38.475 "compare_and_write": false, 00:16:38.475 "abort": true, 00:16:38.475 "seek_hole": 
false, 00:16:38.475 "seek_data": false, 00:16:38.475 "copy": true, 00:16:38.475 "nvme_iov_md": false 00:16:38.475 }, 00:16:38.475 "memory_domains": [ 00:16:38.475 { 00:16:38.475 "dma_device_id": "system", 00:16:38.475 "dma_device_type": 1 00:16:38.475 }, 00:16:38.475 { 00:16:38.475 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:38.475 "dma_device_type": 2 00:16:38.475 } 00:16:38.476 ], 00:16:38.476 "driver_specific": {} 00:16:38.476 }' 00:16:38.476 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:38.476 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:38.476 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:38.476 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:38.476 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:38.476 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:38.476 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:38.476 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:38.736 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:38.736 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:38.736 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:38.736 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:38.736 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:38.736 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:38.736 17:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:38.996 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:38.996 "name": "BaseBdev3", 00:16:38.996 "aliases": [ 00:16:38.996 "5ca43782-1673-489b-9300-2ce7f967f245" 00:16:38.996 ], 00:16:38.996 "product_name": "Malloc disk", 00:16:38.996 "block_size": 512, 00:16:38.996 "num_blocks": 65536, 00:16:38.996 "uuid": "5ca43782-1673-489b-9300-2ce7f967f245", 00:16:38.996 "assigned_rate_limits": { 00:16:38.996 "rw_ios_per_sec": 0, 00:16:38.996 "rw_mbytes_per_sec": 0, 00:16:38.996 "r_mbytes_per_sec": 0, 00:16:38.996 "w_mbytes_per_sec": 0 00:16:38.996 }, 00:16:38.996 "claimed": true, 00:16:38.996 "claim_type": "exclusive_write", 00:16:38.996 "zoned": false, 00:16:38.996 "supported_io_types": { 00:16:38.996 "read": true, 00:16:38.996 "write": true, 00:16:38.996 "unmap": true, 00:16:38.996 "flush": true, 00:16:38.996 "reset": true, 00:16:38.996 "nvme_admin": false, 00:16:38.996 "nvme_io": false, 00:16:38.996 "nvme_io_md": false, 00:16:38.996 "write_zeroes": true, 00:16:38.996 "zcopy": true, 00:16:38.996 "get_zone_info": false, 00:16:38.996 "zone_management": false, 00:16:38.996 "zone_append": false, 00:16:38.996 "compare": false, 00:16:38.996 "compare_and_write": false, 00:16:38.996 "abort": true, 00:16:38.996 "seek_hole": false, 00:16:38.996 "seek_data": false, 00:16:38.996 "copy": true, 00:16:38.996 "nvme_iov_md": false 00:16:38.996 }, 00:16:38.996 
"memory_domains": [ 00:16:38.996 { 00:16:38.996 "dma_device_id": "system", 00:16:38.996 "dma_device_type": 1 00:16:38.996 }, 00:16:38.996 { 00:16:38.996 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:38.996 "dma_device_type": 2 00:16:38.996 } 00:16:38.996 ], 00:16:38.996 "driver_specific": {} 00:16:38.996 }' 00:16:38.996 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:38.996 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:38.996 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:38.996 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:38.996 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:38.996 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:38.996 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:39.256 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:39.256 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:39.256 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:39.256 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:39.256 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:39.256 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:39.256 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:39.256 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:39.516 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:39.516 "name": "BaseBdev4", 00:16:39.516 "aliases": [ 00:16:39.516 "f5b23342-a2e5-4a74-900e-1125c5d3c2d4" 00:16:39.516 ], 00:16:39.516 "product_name": "Malloc disk", 00:16:39.516 "block_size": 512, 00:16:39.516 "num_blocks": 65536, 00:16:39.516 "uuid": "f5b23342-a2e5-4a74-900e-1125c5d3c2d4", 00:16:39.516 "assigned_rate_limits": { 00:16:39.516 "rw_ios_per_sec": 0, 00:16:39.516 "rw_mbytes_per_sec": 0, 00:16:39.516 "r_mbytes_per_sec": 0, 00:16:39.516 "w_mbytes_per_sec": 0 00:16:39.516 }, 00:16:39.516 "claimed": true, 00:16:39.516 "claim_type": "exclusive_write", 00:16:39.516 "zoned": false, 00:16:39.516 "supported_io_types": { 00:16:39.516 "read": true, 00:16:39.516 "write": true, 00:16:39.516 "unmap": true, 00:16:39.516 "flush": true, 00:16:39.516 "reset": true, 00:16:39.516 "nvme_admin": false, 00:16:39.516 "nvme_io": false, 00:16:39.516 "nvme_io_md": false, 00:16:39.516 "write_zeroes": true, 00:16:39.516 "zcopy": true, 00:16:39.516 "get_zone_info": false, 00:16:39.516 "zone_management": false, 00:16:39.516 "zone_append": false, 00:16:39.516 "compare": false, 00:16:39.516 "compare_and_write": false, 00:16:39.516 "abort": true, 00:16:39.516 "seek_hole": false, 00:16:39.516 "seek_data": false, 00:16:39.516 "copy": true, 00:16:39.516 "nvme_iov_md": false 00:16:39.516 }, 00:16:39.516 "memory_domains": [ 00:16:39.516 { 00:16:39.516 "dma_device_id": "system", 00:16:39.516 "dma_device_type": 1 00:16:39.516 }, 
00:16:39.516 { 00:16:39.516 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.516 "dma_device_type": 2 00:16:39.516 } 00:16:39.516 ], 00:16:39.516 "driver_specific": {} 00:16:39.516 }' 00:16:39.516 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:39.516 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:39.516 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:39.516 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:39.516 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:39.777 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:39.777 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:39.777 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:39.777 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:39.777 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:39.777 17:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:39.777 17:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:39.777 17:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:40.037 [2024-07-15 17:28:51.210071] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:40.037 [2024-07-15 17:28:51.210088] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:40.037 [2024-07-15 17:28:51.210124] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:40.037 17:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:40.037 17:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:16:40.037 17:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:40.037 17:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:16:40.037 17:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:16:40.037 17:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:16:40.037 17:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:40.037 17:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:16:40.037 17:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:40.037 17:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:40.037 17:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:40.037 17:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:40.037 17:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:40.037 17:28:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:40.037 17:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:40.037 17:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.037 17:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:40.297 17:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:40.297 "name": "Existed_Raid", 00:16:40.297 "uuid": "d4685d26-2af0-4801-9b6e-42b75d101327", 00:16:40.297 "strip_size_kb": 64, 00:16:40.297 "state": "offline", 00:16:40.297 "raid_level": "raid0", 00:16:40.297 "superblock": true, 00:16:40.297 "num_base_bdevs": 4, 00:16:40.297 "num_base_bdevs_discovered": 3, 00:16:40.297 "num_base_bdevs_operational": 3, 00:16:40.297 "base_bdevs_list": [ 00:16:40.297 { 00:16:40.297 "name": null, 00:16:40.297 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:40.297 "is_configured": false, 00:16:40.297 "data_offset": 2048, 00:16:40.297 "data_size": 63488 00:16:40.297 }, 00:16:40.297 { 00:16:40.297 "name": "BaseBdev2", 00:16:40.297 "uuid": "04d4df6e-2dc3-49b9-8af7-2a0e36d7c9e6", 00:16:40.297 "is_configured": true, 00:16:40.297 "data_offset": 2048, 00:16:40.297 "data_size": 63488 00:16:40.297 }, 00:16:40.297 { 00:16:40.297 "name": "BaseBdev3", 00:16:40.297 "uuid": "5ca43782-1673-489b-9300-2ce7f967f245", 00:16:40.297 "is_configured": true, 00:16:40.297 "data_offset": 2048, 00:16:40.297 "data_size": 63488 00:16:40.297 }, 00:16:40.297 { 00:16:40.297 "name": "BaseBdev4", 00:16:40.297 "uuid": "f5b23342-a2e5-4a74-900e-1125c5d3c2d4", 00:16:40.297 "is_configured": true, 00:16:40.297 "data_offset": 2048, 00:16:40.297 "data_size": 63488 00:16:40.297 } 00:16:40.297 ] 00:16:40.297 }' 00:16:40.297 17:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:40.297 17:28:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:40.865 17:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:40.865 17:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:40.865 17:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.865 17:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:40.865 17:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:40.865 17:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:40.865 17:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:41.124 [2024-07-15 17:28:52.296824] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:41.124 17:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:41.124 17:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:41.124 17:28:52 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.124 17:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:41.383 17:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:41.383 17:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:41.383 17:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:41.642 [2024-07-15 17:28:52.683616] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:41.642 17:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:41.642 17:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:41.642 17:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.642 17:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:41.642 17:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:41.642 17:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:41.642 17:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:16:41.902 [2024-07-15 17:28:53.070485] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:16:41.902 [2024-07-15 17:28:53.070516] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15a5fc0 name Existed_Raid, state offline 00:16:41.902 17:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:41.902 17:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:41.902 17:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.902 17:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:42.161 17:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:42.161 17:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:42.161 17:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:16:42.161 17:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:42.161 17:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:42.161 17:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:42.421 BaseBdev2 00:16:42.421 17:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:42.421 17:28:53 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:42.421 17:28:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:42.421 17:28:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:42.421 17:28:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:42.421 17:28:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:42.421 17:28:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:42.421 17:28:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:42.682 [ 00:16:42.682 { 00:16:42.682 "name": "BaseBdev2", 00:16:42.682 "aliases": [ 00:16:42.682 "2e40d956-286f-4ab1-b357-ffafc39aac76" 00:16:42.682 ], 00:16:42.682 "product_name": "Malloc disk", 00:16:42.682 "block_size": 512, 00:16:42.682 "num_blocks": 65536, 00:16:42.682 "uuid": "2e40d956-286f-4ab1-b357-ffafc39aac76", 00:16:42.682 "assigned_rate_limits": { 00:16:42.682 "rw_ios_per_sec": 0, 00:16:42.682 "rw_mbytes_per_sec": 0, 00:16:42.682 "r_mbytes_per_sec": 0, 00:16:42.682 "w_mbytes_per_sec": 0 00:16:42.682 }, 00:16:42.682 "claimed": false, 00:16:42.682 "zoned": false, 00:16:42.682 "supported_io_types": { 00:16:42.682 "read": true, 00:16:42.682 "write": true, 00:16:42.682 "unmap": true, 00:16:42.682 "flush": true, 00:16:42.682 "reset": true, 00:16:42.682 "nvme_admin": false, 00:16:42.682 "nvme_io": false, 00:16:42.682 "nvme_io_md": false, 00:16:42.682 "write_zeroes": true, 00:16:42.682 "zcopy": true, 00:16:42.682 "get_zone_info": false, 00:16:42.682 "zone_management": false, 00:16:42.682 "zone_append": false, 00:16:42.682 "compare": false, 00:16:42.682 "compare_and_write": false, 00:16:42.682 "abort": true, 00:16:42.682 "seek_hole": false, 00:16:42.682 "seek_data": false, 00:16:42.682 "copy": true, 00:16:42.682 "nvme_iov_md": false 00:16:42.682 }, 00:16:42.682 "memory_domains": [ 00:16:42.682 { 00:16:42.682 "dma_device_id": "system", 00:16:42.682 "dma_device_type": 1 00:16:42.682 }, 00:16:42.682 { 00:16:42.682 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:42.682 "dma_device_type": 2 00:16:42.682 } 00:16:42.682 ], 00:16:42.682 "driver_specific": {} 00:16:42.682 } 00:16:42.682 ] 00:16:42.682 17:28:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:42.682 17:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:42.682 17:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:42.682 17:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:42.942 BaseBdev3 00:16:42.942 17:28:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:42.942 17:28:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:42.942 17:28:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:42.942 17:28:54 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:16:42.942 17:28:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:42.942 17:28:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:42.942 17:28:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:42.942 17:28:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:43.201 [ 00:16:43.201 { 00:16:43.201 "name": "BaseBdev3", 00:16:43.201 "aliases": [ 00:16:43.201 "9bb07532-3a49-4788-9b5e-c50a1ba542c6" 00:16:43.201 ], 00:16:43.201 "product_name": "Malloc disk", 00:16:43.201 "block_size": 512, 00:16:43.201 "num_blocks": 65536, 00:16:43.201 "uuid": "9bb07532-3a49-4788-9b5e-c50a1ba542c6", 00:16:43.201 "assigned_rate_limits": { 00:16:43.201 "rw_ios_per_sec": 0, 00:16:43.201 "rw_mbytes_per_sec": 0, 00:16:43.201 "r_mbytes_per_sec": 0, 00:16:43.201 "w_mbytes_per_sec": 0 00:16:43.201 }, 00:16:43.201 "claimed": false, 00:16:43.201 "zoned": false, 00:16:43.201 "supported_io_types": { 00:16:43.201 "read": true, 00:16:43.201 "write": true, 00:16:43.201 "unmap": true, 00:16:43.201 "flush": true, 00:16:43.201 "reset": true, 00:16:43.201 "nvme_admin": false, 00:16:43.201 "nvme_io": false, 00:16:43.201 "nvme_io_md": false, 00:16:43.201 "write_zeroes": true, 00:16:43.201 "zcopy": true, 00:16:43.201 "get_zone_info": false, 00:16:43.201 "zone_management": false, 00:16:43.201 "zone_append": false, 00:16:43.201 "compare": false, 00:16:43.201 "compare_and_write": false, 00:16:43.201 "abort": true, 00:16:43.201 "seek_hole": false, 00:16:43.201 "seek_data": false, 00:16:43.201 "copy": true, 00:16:43.201 "nvme_iov_md": false 00:16:43.201 }, 00:16:43.201 "memory_domains": [ 00:16:43.201 { 00:16:43.201 "dma_device_id": "system", 00:16:43.201 "dma_device_type": 1 00:16:43.201 }, 00:16:43.201 { 00:16:43.201 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:43.201 "dma_device_type": 2 00:16:43.201 } 00:16:43.201 ], 00:16:43.201 "driver_specific": {} 00:16:43.201 } 00:16:43.201 ] 00:16:43.201 17:28:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:43.201 17:28:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:43.201 17:28:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:43.201 17:28:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:43.460 BaseBdev4 00:16:43.460 17:28:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:16:43.460 17:28:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:16:43.460 17:28:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:43.460 17:28:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:43.460 17:28:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:43.460 17:28:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:16:43.460 17:28:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:43.719 17:28:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:43.719 [ 00:16:43.719 { 00:16:43.719 "name": "BaseBdev4", 00:16:43.719 "aliases": [ 00:16:43.719 "dc0226aa-d931-4e39-94a6-9dd5916c6c87" 00:16:43.719 ], 00:16:43.719 "product_name": "Malloc disk", 00:16:43.719 "block_size": 512, 00:16:43.719 "num_blocks": 65536, 00:16:43.719 "uuid": "dc0226aa-d931-4e39-94a6-9dd5916c6c87", 00:16:43.719 "assigned_rate_limits": { 00:16:43.719 "rw_ios_per_sec": 0, 00:16:43.719 "rw_mbytes_per_sec": 0, 00:16:43.719 "r_mbytes_per_sec": 0, 00:16:43.719 "w_mbytes_per_sec": 0 00:16:43.719 }, 00:16:43.719 "claimed": false, 00:16:43.719 "zoned": false, 00:16:43.719 "supported_io_types": { 00:16:43.719 "read": true, 00:16:43.719 "write": true, 00:16:43.719 "unmap": true, 00:16:43.719 "flush": true, 00:16:43.719 "reset": true, 00:16:43.719 "nvme_admin": false, 00:16:43.719 "nvme_io": false, 00:16:43.719 "nvme_io_md": false, 00:16:43.719 "write_zeroes": true, 00:16:43.719 "zcopy": true, 00:16:43.719 "get_zone_info": false, 00:16:43.719 "zone_management": false, 00:16:43.719 "zone_append": false, 00:16:43.719 "compare": false, 00:16:43.719 "compare_and_write": false, 00:16:43.719 "abort": true, 00:16:43.719 "seek_hole": false, 00:16:43.719 "seek_data": false, 00:16:43.719 "copy": true, 00:16:43.719 "nvme_iov_md": false 00:16:43.719 }, 00:16:43.719 "memory_domains": [ 00:16:43.719 { 00:16:43.719 "dma_device_id": "system", 00:16:43.719 "dma_device_type": 1 00:16:43.719 }, 00:16:43.719 { 00:16:43.719 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:43.719 "dma_device_type": 2 00:16:43.719 } 00:16:43.719 ], 00:16:43.719 "driver_specific": {} 00:16:43.719 } 00:16:43.719 ] 00:16:43.719 17:28:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:43.719 17:28:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:43.719 17:28:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:43.719 17:28:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:43.978 [2024-07-15 17:28:55.149850] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:43.978 [2024-07-15 17:28:55.149880] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:43.978 [2024-07-15 17:28:55.149894] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:43.978 [2024-07-15 17:28:55.150941] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:43.978 [2024-07-15 17:28:55.150974] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:43.978 17:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:43.978 17:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:16:43.978 17:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:43.978 17:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:43.978 17:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:43.978 17:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:43.978 17:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:43.978 17:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:43.978 17:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:43.978 17:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:43.978 17:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.978 17:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:44.238 17:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:44.238 "name": "Existed_Raid", 00:16:44.238 "uuid": "d5f10190-e8d5-40c1-b2aa-18f978f3063c", 00:16:44.238 "strip_size_kb": 64, 00:16:44.238 "state": "configuring", 00:16:44.238 "raid_level": "raid0", 00:16:44.238 "superblock": true, 00:16:44.238 "num_base_bdevs": 4, 00:16:44.238 "num_base_bdevs_discovered": 3, 00:16:44.238 "num_base_bdevs_operational": 4, 00:16:44.238 "base_bdevs_list": [ 00:16:44.238 { 00:16:44.238 "name": "BaseBdev1", 00:16:44.238 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:44.238 "is_configured": false, 00:16:44.238 "data_offset": 0, 00:16:44.238 "data_size": 0 00:16:44.238 }, 00:16:44.238 { 00:16:44.238 "name": "BaseBdev2", 00:16:44.238 "uuid": "2e40d956-286f-4ab1-b357-ffafc39aac76", 00:16:44.238 "is_configured": true, 00:16:44.238 "data_offset": 2048, 00:16:44.238 "data_size": 63488 00:16:44.238 }, 00:16:44.238 { 00:16:44.238 "name": "BaseBdev3", 00:16:44.238 "uuid": "9bb07532-3a49-4788-9b5e-c50a1ba542c6", 00:16:44.238 "is_configured": true, 00:16:44.238 "data_offset": 2048, 00:16:44.238 "data_size": 63488 00:16:44.238 }, 00:16:44.238 { 00:16:44.238 "name": "BaseBdev4", 00:16:44.238 "uuid": "dc0226aa-d931-4e39-94a6-9dd5916c6c87", 00:16:44.238 "is_configured": true, 00:16:44.238 "data_offset": 2048, 00:16:44.238 "data_size": 63488 00:16:44.238 } 00:16:44.238 ] 00:16:44.238 }' 00:16:44.238 17:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:44.238 17:28:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:44.808 17:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:44.808 [2024-07-15 17:28:56.056106] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:44.808 17:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:44.808 17:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:44.808 17:28:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:44.808 17:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:44.808 17:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:44.808 17:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:44.808 17:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:44.808 17:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:44.808 17:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:44.808 17:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:44.808 17:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.808 17:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:45.068 17:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:45.068 "name": "Existed_Raid", 00:16:45.068 "uuid": "d5f10190-e8d5-40c1-b2aa-18f978f3063c", 00:16:45.068 "strip_size_kb": 64, 00:16:45.068 "state": "configuring", 00:16:45.068 "raid_level": "raid0", 00:16:45.068 "superblock": true, 00:16:45.068 "num_base_bdevs": 4, 00:16:45.068 "num_base_bdevs_discovered": 2, 00:16:45.068 "num_base_bdevs_operational": 4, 00:16:45.068 "base_bdevs_list": [ 00:16:45.068 { 00:16:45.068 "name": "BaseBdev1", 00:16:45.068 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:45.068 "is_configured": false, 00:16:45.068 "data_offset": 0, 00:16:45.068 "data_size": 0 00:16:45.068 }, 00:16:45.068 { 00:16:45.068 "name": null, 00:16:45.068 "uuid": "2e40d956-286f-4ab1-b357-ffafc39aac76", 00:16:45.068 "is_configured": false, 00:16:45.068 "data_offset": 2048, 00:16:45.068 "data_size": 63488 00:16:45.068 }, 00:16:45.068 { 00:16:45.068 "name": "BaseBdev3", 00:16:45.068 "uuid": "9bb07532-3a49-4788-9b5e-c50a1ba542c6", 00:16:45.068 "is_configured": true, 00:16:45.068 "data_offset": 2048, 00:16:45.068 "data_size": 63488 00:16:45.068 }, 00:16:45.068 { 00:16:45.068 "name": "BaseBdev4", 00:16:45.068 "uuid": "dc0226aa-d931-4e39-94a6-9dd5916c6c87", 00:16:45.068 "is_configured": true, 00:16:45.068 "data_offset": 2048, 00:16:45.068 "data_size": 63488 00:16:45.068 } 00:16:45.068 ] 00:16:45.068 }' 00:16:45.068 17:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:45.068 17:28:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:45.636 17:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.636 17:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:45.895 17:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:45.895 17:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 
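(For readers reproducing this step by hand: a minimal sketch of the base-bdev creation the script performs at bdev_raid.sh@312, assuming the test target launched by this run is still listening on /var/tmp/spdk-raid.sock; every command and argument below is taken verbatim from the trace, nothing is added.)
    # Create a 32 MiB malloc bdev with 512-byte blocks named BaseBdev1
    # (matches "num_blocks": 65536 and "block_size": 512 in the dumps above)
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
        bdev_malloc_create 32 512 -b BaseBdev1
    # Wait for examine to finish and confirm the bdev is visible, as waitforbdev does
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
        bdev_wait_for_examine
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
        bdev_get_bdevs -b BaseBdev1 -t 2000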
00:16:45.895 [2024-07-15 17:28:57.151873] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:45.895 BaseBdev1 00:16:45.895 17:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:45.895 17:28:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:45.895 17:28:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:45.895 17:28:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:45.895 17:28:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:45.895 17:28:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:45.895 17:28:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:46.154 17:28:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:46.414 [ 00:16:46.415 { 00:16:46.415 "name": "BaseBdev1", 00:16:46.415 "aliases": [ 00:16:46.415 "fb161016-4668-4559-8643-90f7caa9560d" 00:16:46.415 ], 00:16:46.415 "product_name": "Malloc disk", 00:16:46.415 "block_size": 512, 00:16:46.415 "num_blocks": 65536, 00:16:46.415 "uuid": "fb161016-4668-4559-8643-90f7caa9560d", 00:16:46.415 "assigned_rate_limits": { 00:16:46.415 "rw_ios_per_sec": 0, 00:16:46.415 "rw_mbytes_per_sec": 0, 00:16:46.415 "r_mbytes_per_sec": 0, 00:16:46.415 "w_mbytes_per_sec": 0 00:16:46.415 }, 00:16:46.415 "claimed": true, 00:16:46.415 "claim_type": "exclusive_write", 00:16:46.415 "zoned": false, 00:16:46.415 "supported_io_types": { 00:16:46.415 "read": true, 00:16:46.415 "write": true, 00:16:46.415 "unmap": true, 00:16:46.415 "flush": true, 00:16:46.415 "reset": true, 00:16:46.415 "nvme_admin": false, 00:16:46.415 "nvme_io": false, 00:16:46.415 "nvme_io_md": false, 00:16:46.415 "write_zeroes": true, 00:16:46.415 "zcopy": true, 00:16:46.415 "get_zone_info": false, 00:16:46.415 "zone_management": false, 00:16:46.415 "zone_append": false, 00:16:46.415 "compare": false, 00:16:46.415 "compare_and_write": false, 00:16:46.415 "abort": true, 00:16:46.415 "seek_hole": false, 00:16:46.415 "seek_data": false, 00:16:46.415 "copy": true, 00:16:46.415 "nvme_iov_md": false 00:16:46.415 }, 00:16:46.415 "memory_domains": [ 00:16:46.415 { 00:16:46.415 "dma_device_id": "system", 00:16:46.415 "dma_device_type": 1 00:16:46.415 }, 00:16:46.415 { 00:16:46.415 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:46.415 "dma_device_type": 2 00:16:46.415 } 00:16:46.415 ], 00:16:46.415 "driver_specific": {} 00:16:46.415 } 00:16:46.415 ] 00:16:46.415 17:28:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:46.415 17:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:46.415 17:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:46.415 17:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:46.415 17:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
00:16:46.415 17:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:46.415 17:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:46.415 17:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:46.415 17:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:46.415 17:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:46.415 17:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:46.415 17:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.415 17:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:46.674 17:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:46.674 "name": "Existed_Raid", 00:16:46.674 "uuid": "d5f10190-e8d5-40c1-b2aa-18f978f3063c", 00:16:46.674 "strip_size_kb": 64, 00:16:46.674 "state": "configuring", 00:16:46.674 "raid_level": "raid0", 00:16:46.674 "superblock": true, 00:16:46.674 "num_base_bdevs": 4, 00:16:46.674 "num_base_bdevs_discovered": 3, 00:16:46.674 "num_base_bdevs_operational": 4, 00:16:46.674 "base_bdevs_list": [ 00:16:46.674 { 00:16:46.674 "name": "BaseBdev1", 00:16:46.674 "uuid": "fb161016-4668-4559-8643-90f7caa9560d", 00:16:46.674 "is_configured": true, 00:16:46.674 "data_offset": 2048, 00:16:46.674 "data_size": 63488 00:16:46.674 }, 00:16:46.674 { 00:16:46.674 "name": null, 00:16:46.674 "uuid": "2e40d956-286f-4ab1-b357-ffafc39aac76", 00:16:46.674 "is_configured": false, 00:16:46.674 "data_offset": 2048, 00:16:46.674 "data_size": 63488 00:16:46.674 }, 00:16:46.674 { 00:16:46.674 "name": "BaseBdev3", 00:16:46.674 "uuid": "9bb07532-3a49-4788-9b5e-c50a1ba542c6", 00:16:46.674 "is_configured": true, 00:16:46.674 "data_offset": 2048, 00:16:46.674 "data_size": 63488 00:16:46.674 }, 00:16:46.674 { 00:16:46.674 "name": "BaseBdev4", 00:16:46.674 "uuid": "dc0226aa-d931-4e39-94a6-9dd5916c6c87", 00:16:46.674 "is_configured": true, 00:16:46.674 "data_offset": 2048, 00:16:46.674 "data_size": 63488 00:16:46.674 } 00:16:46.674 ] 00:16:46.674 }' 00:16:46.674 17:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:46.674 17:28:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:47.244 17:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:47.244 17:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:47.244 17:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:47.244 17:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:47.504 [2024-07-15 17:28:58.595550] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:47.504 17:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # 
verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:47.504 17:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:47.504 17:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:47.504 17:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:47.504 17:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:47.504 17:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:47.504 17:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:47.504 17:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:47.504 17:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:47.504 17:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:47.504 17:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:47.504 17:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:47.504 17:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:47.504 "name": "Existed_Raid", 00:16:47.504 "uuid": "d5f10190-e8d5-40c1-b2aa-18f978f3063c", 00:16:47.504 "strip_size_kb": 64, 00:16:47.504 "state": "configuring", 00:16:47.504 "raid_level": "raid0", 00:16:47.504 "superblock": true, 00:16:47.504 "num_base_bdevs": 4, 00:16:47.504 "num_base_bdevs_discovered": 2, 00:16:47.504 "num_base_bdevs_operational": 4, 00:16:47.504 "base_bdevs_list": [ 00:16:47.504 { 00:16:47.504 "name": "BaseBdev1", 00:16:47.504 "uuid": "fb161016-4668-4559-8643-90f7caa9560d", 00:16:47.504 "is_configured": true, 00:16:47.504 "data_offset": 2048, 00:16:47.504 "data_size": 63488 00:16:47.504 }, 00:16:47.504 { 00:16:47.504 "name": null, 00:16:47.504 "uuid": "2e40d956-286f-4ab1-b357-ffafc39aac76", 00:16:47.504 "is_configured": false, 00:16:47.504 "data_offset": 2048, 00:16:47.504 "data_size": 63488 00:16:47.504 }, 00:16:47.504 { 00:16:47.504 "name": null, 00:16:47.504 "uuid": "9bb07532-3a49-4788-9b5e-c50a1ba542c6", 00:16:47.504 "is_configured": false, 00:16:47.504 "data_offset": 2048, 00:16:47.504 "data_size": 63488 00:16:47.504 }, 00:16:47.504 { 00:16:47.504 "name": "BaseBdev4", 00:16:47.504 "uuid": "dc0226aa-d931-4e39-94a6-9dd5916c6c87", 00:16:47.504 "is_configured": true, 00:16:47.504 "data_offset": 2048, 00:16:47.504 "data_size": 63488 00:16:47.504 } 00:16:47.504 ] 00:16:47.504 }' 00:16:47.504 17:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:47.504 17:28:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:48.076 17:28:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:48.076 17:28:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:48.336 17:28:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:48.336 
17:28:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:48.595 [2024-07-15 17:28:59.686319] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:48.595 17:28:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:48.595 17:28:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:48.595 17:28:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:48.595 17:28:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:48.595 17:28:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:48.595 17:28:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:48.595 17:28:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:48.595 17:28:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:48.595 17:28:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:48.595 17:28:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:48.595 17:28:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:48.595 17:28:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:48.854 17:28:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:48.854 "name": "Existed_Raid", 00:16:48.854 "uuid": "d5f10190-e8d5-40c1-b2aa-18f978f3063c", 00:16:48.854 "strip_size_kb": 64, 00:16:48.854 "state": "configuring", 00:16:48.854 "raid_level": "raid0", 00:16:48.854 "superblock": true, 00:16:48.854 "num_base_bdevs": 4, 00:16:48.854 "num_base_bdevs_discovered": 3, 00:16:48.854 "num_base_bdevs_operational": 4, 00:16:48.854 "base_bdevs_list": [ 00:16:48.854 { 00:16:48.854 "name": "BaseBdev1", 00:16:48.854 "uuid": "fb161016-4668-4559-8643-90f7caa9560d", 00:16:48.854 "is_configured": true, 00:16:48.854 "data_offset": 2048, 00:16:48.854 "data_size": 63488 00:16:48.854 }, 00:16:48.854 { 00:16:48.854 "name": null, 00:16:48.854 "uuid": "2e40d956-286f-4ab1-b357-ffafc39aac76", 00:16:48.854 "is_configured": false, 00:16:48.854 "data_offset": 2048, 00:16:48.854 "data_size": 63488 00:16:48.854 }, 00:16:48.854 { 00:16:48.854 "name": "BaseBdev3", 00:16:48.854 "uuid": "9bb07532-3a49-4788-9b5e-c50a1ba542c6", 00:16:48.854 "is_configured": true, 00:16:48.854 "data_offset": 2048, 00:16:48.854 "data_size": 63488 00:16:48.854 }, 00:16:48.854 { 00:16:48.854 "name": "BaseBdev4", 00:16:48.854 "uuid": "dc0226aa-d931-4e39-94a6-9dd5916c6c87", 00:16:48.854 "is_configured": true, 00:16:48.855 "data_offset": 2048, 00:16:48.855 "data_size": 63488 00:16:48.855 } 00:16:48.855 ] 00:16:48.855 }' 00:16:48.855 17:28:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:48.855 17:28:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:49.113 17:29:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:49.113 17:29:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.373 17:29:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:49.373 17:29:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:49.633 [2024-07-15 17:29:00.761475] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:49.633 17:29:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:49.633 17:29:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:49.633 17:29:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:49.633 17:29:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:49.633 17:29:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:49.633 17:29:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:49.633 17:29:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:49.633 17:29:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:49.633 17:29:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:49.633 17:29:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:49.633 17:29:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.633 17:29:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:49.893 17:29:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:49.893 "name": "Existed_Raid", 00:16:49.893 "uuid": "d5f10190-e8d5-40c1-b2aa-18f978f3063c", 00:16:49.893 "strip_size_kb": 64, 00:16:49.893 "state": "configuring", 00:16:49.893 "raid_level": "raid0", 00:16:49.893 "superblock": true, 00:16:49.893 "num_base_bdevs": 4, 00:16:49.893 "num_base_bdevs_discovered": 2, 00:16:49.893 "num_base_bdevs_operational": 4, 00:16:49.893 "base_bdevs_list": [ 00:16:49.893 { 00:16:49.893 "name": null, 00:16:49.893 "uuid": "fb161016-4668-4559-8643-90f7caa9560d", 00:16:49.893 "is_configured": false, 00:16:49.893 "data_offset": 2048, 00:16:49.893 "data_size": 63488 00:16:49.893 }, 00:16:49.893 { 00:16:49.893 "name": null, 00:16:49.893 "uuid": "2e40d956-286f-4ab1-b357-ffafc39aac76", 00:16:49.893 "is_configured": false, 00:16:49.893 "data_offset": 2048, 00:16:49.893 "data_size": 63488 00:16:49.893 }, 00:16:49.893 { 00:16:49.893 "name": "BaseBdev3", 00:16:49.893 "uuid": "9bb07532-3a49-4788-9b5e-c50a1ba542c6", 00:16:49.893 "is_configured": true, 00:16:49.893 "data_offset": 2048, 00:16:49.893 "data_size": 63488 00:16:49.893 }, 00:16:49.893 { 00:16:49.893 "name": "BaseBdev4", 00:16:49.893 "uuid": 
"dc0226aa-d931-4e39-94a6-9dd5916c6c87", 00:16:49.893 "is_configured": true, 00:16:49.893 "data_offset": 2048, 00:16:49.893 "data_size": 63488 00:16:49.893 } 00:16:49.893 ] 00:16:49.893 }' 00:16:49.893 17:29:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:49.893 17:29:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:50.462 17:29:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:50.462 17:29:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:50.462 17:29:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:50.462 17:29:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:50.721 [2024-07-15 17:29:01.862053] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:50.721 17:29:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:50.721 17:29:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:50.721 17:29:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:50.721 17:29:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:50.721 17:29:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:50.721 17:29:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:50.721 17:29:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:50.721 17:29:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:50.721 17:29:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:50.721 17:29:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:50.721 17:29:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:50.721 17:29:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:50.981 17:29:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:50.981 "name": "Existed_Raid", 00:16:50.981 "uuid": "d5f10190-e8d5-40c1-b2aa-18f978f3063c", 00:16:50.981 "strip_size_kb": 64, 00:16:50.981 "state": "configuring", 00:16:50.981 "raid_level": "raid0", 00:16:50.981 "superblock": true, 00:16:50.981 "num_base_bdevs": 4, 00:16:50.981 "num_base_bdevs_discovered": 3, 00:16:50.981 "num_base_bdevs_operational": 4, 00:16:50.981 "base_bdevs_list": [ 00:16:50.981 { 00:16:50.981 "name": null, 00:16:50.981 "uuid": "fb161016-4668-4559-8643-90f7caa9560d", 00:16:50.981 "is_configured": false, 00:16:50.981 "data_offset": 2048, 00:16:50.981 "data_size": 63488 00:16:50.981 }, 00:16:50.981 { 00:16:50.981 "name": "BaseBdev2", 00:16:50.981 "uuid": 
"2e40d956-286f-4ab1-b357-ffafc39aac76", 00:16:50.981 "is_configured": true, 00:16:50.981 "data_offset": 2048, 00:16:50.981 "data_size": 63488 00:16:50.981 }, 00:16:50.981 { 00:16:50.981 "name": "BaseBdev3", 00:16:50.981 "uuid": "9bb07532-3a49-4788-9b5e-c50a1ba542c6", 00:16:50.981 "is_configured": true, 00:16:50.981 "data_offset": 2048, 00:16:50.981 "data_size": 63488 00:16:50.981 }, 00:16:50.981 { 00:16:50.981 "name": "BaseBdev4", 00:16:50.981 "uuid": "dc0226aa-d931-4e39-94a6-9dd5916c6c87", 00:16:50.981 "is_configured": true, 00:16:50.981 "data_offset": 2048, 00:16:50.981 "data_size": 63488 00:16:50.981 } 00:16:50.981 ] 00:16:50.981 }' 00:16:50.981 17:29:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:50.981 17:29:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:51.550 17:29:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:51.550 17:29:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:51.550 17:29:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:51.550 17:29:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:51.550 17:29:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:51.810 17:29:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u fb161016-4668-4559-8643-90f7caa9560d 00:16:52.071 [2024-07-15 17:29:03.134258] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:52.071 [2024-07-15 17:29:03.134377] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x15a9c20 00:16:52.071 [2024-07-15 17:29:03.134385] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:52.071 [2024-07-15 17:29:03.134531] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x159d740 00:16:52.071 [2024-07-15 17:29:03.134619] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15a9c20 00:16:52.071 [2024-07-15 17:29:03.134624] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x15a9c20 00:16:52.071 [2024-07-15 17:29:03.134691] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:52.071 NewBaseBdev 00:16:52.071 17:29:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:52.071 17:29:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:52.071 17:29:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:52.071 17:29:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:52.071 17:29:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:52.071 17:29:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:52.071 17:29:03 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:52.071 17:29:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:52.330 [ 00:16:52.330 { 00:16:52.330 "name": "NewBaseBdev", 00:16:52.330 "aliases": [ 00:16:52.330 "fb161016-4668-4559-8643-90f7caa9560d" 00:16:52.330 ], 00:16:52.330 "product_name": "Malloc disk", 00:16:52.330 "block_size": 512, 00:16:52.330 "num_blocks": 65536, 00:16:52.330 "uuid": "fb161016-4668-4559-8643-90f7caa9560d", 00:16:52.330 "assigned_rate_limits": { 00:16:52.330 "rw_ios_per_sec": 0, 00:16:52.330 "rw_mbytes_per_sec": 0, 00:16:52.330 "r_mbytes_per_sec": 0, 00:16:52.330 "w_mbytes_per_sec": 0 00:16:52.330 }, 00:16:52.330 "claimed": true, 00:16:52.330 "claim_type": "exclusive_write", 00:16:52.330 "zoned": false, 00:16:52.330 "supported_io_types": { 00:16:52.330 "read": true, 00:16:52.330 "write": true, 00:16:52.330 "unmap": true, 00:16:52.330 "flush": true, 00:16:52.330 "reset": true, 00:16:52.330 "nvme_admin": false, 00:16:52.330 "nvme_io": false, 00:16:52.330 "nvme_io_md": false, 00:16:52.330 "write_zeroes": true, 00:16:52.330 "zcopy": true, 00:16:52.330 "get_zone_info": false, 00:16:52.330 "zone_management": false, 00:16:52.330 "zone_append": false, 00:16:52.330 "compare": false, 00:16:52.330 "compare_and_write": false, 00:16:52.330 "abort": true, 00:16:52.330 "seek_hole": false, 00:16:52.330 "seek_data": false, 00:16:52.330 "copy": true, 00:16:52.330 "nvme_iov_md": false 00:16:52.330 }, 00:16:52.330 "memory_domains": [ 00:16:52.330 { 00:16:52.330 "dma_device_id": "system", 00:16:52.330 "dma_device_type": 1 00:16:52.330 }, 00:16:52.330 { 00:16:52.330 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:52.330 "dma_device_type": 2 00:16:52.330 } 00:16:52.330 ], 00:16:52.330 "driver_specific": {} 00:16:52.330 } 00:16:52.330 ] 00:16:52.330 17:29:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:52.330 17:29:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:16:52.330 17:29:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:52.330 17:29:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:52.330 17:29:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:52.330 17:29:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:52.330 17:29:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:52.330 17:29:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:52.330 17:29:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:52.330 17:29:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:52.330 17:29:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:52.330 17:29:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:16:52.330 17:29:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:52.590 17:29:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:52.590 "name": "Existed_Raid", 00:16:52.590 "uuid": "d5f10190-e8d5-40c1-b2aa-18f978f3063c", 00:16:52.590 "strip_size_kb": 64, 00:16:52.590 "state": "online", 00:16:52.590 "raid_level": "raid0", 00:16:52.590 "superblock": true, 00:16:52.590 "num_base_bdevs": 4, 00:16:52.590 "num_base_bdevs_discovered": 4, 00:16:52.590 "num_base_bdevs_operational": 4, 00:16:52.590 "base_bdevs_list": [ 00:16:52.590 { 00:16:52.590 "name": "NewBaseBdev", 00:16:52.590 "uuid": "fb161016-4668-4559-8643-90f7caa9560d", 00:16:52.590 "is_configured": true, 00:16:52.590 "data_offset": 2048, 00:16:52.590 "data_size": 63488 00:16:52.590 }, 00:16:52.590 { 00:16:52.590 "name": "BaseBdev2", 00:16:52.590 "uuid": "2e40d956-286f-4ab1-b357-ffafc39aac76", 00:16:52.590 "is_configured": true, 00:16:52.590 "data_offset": 2048, 00:16:52.590 "data_size": 63488 00:16:52.590 }, 00:16:52.590 { 00:16:52.590 "name": "BaseBdev3", 00:16:52.590 "uuid": "9bb07532-3a49-4788-9b5e-c50a1ba542c6", 00:16:52.590 "is_configured": true, 00:16:52.590 "data_offset": 2048, 00:16:52.590 "data_size": 63488 00:16:52.590 }, 00:16:52.590 { 00:16:52.590 "name": "BaseBdev4", 00:16:52.590 "uuid": "dc0226aa-d931-4e39-94a6-9dd5916c6c87", 00:16:52.590 "is_configured": true, 00:16:52.590 "data_offset": 2048, 00:16:52.590 "data_size": 63488 00:16:52.590 } 00:16:52.590 ] 00:16:52.590 }' 00:16:52.590 17:29:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:52.590 17:29:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:53.160 17:29:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:53.160 17:29:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:53.160 17:29:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:53.160 17:29:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:53.160 17:29:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:53.160 17:29:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:53.160 17:29:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:53.160 17:29:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:53.160 [2024-07-15 17:29:04.449856] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:53.420 17:29:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:53.420 "name": "Existed_Raid", 00:16:53.420 "aliases": [ 00:16:53.420 "d5f10190-e8d5-40c1-b2aa-18f978f3063c" 00:16:53.420 ], 00:16:53.420 "product_name": "Raid Volume", 00:16:53.420 "block_size": 512, 00:16:53.420 "num_blocks": 253952, 00:16:53.420 "uuid": "d5f10190-e8d5-40c1-b2aa-18f978f3063c", 00:16:53.420 "assigned_rate_limits": { 00:16:53.420 "rw_ios_per_sec": 0, 00:16:53.420 "rw_mbytes_per_sec": 0, 00:16:53.420 "r_mbytes_per_sec": 0, 00:16:53.420 "w_mbytes_per_sec": 0 00:16:53.420 }, 00:16:53.420 
"claimed": false, 00:16:53.420 "zoned": false, 00:16:53.420 "supported_io_types": { 00:16:53.420 "read": true, 00:16:53.420 "write": true, 00:16:53.420 "unmap": true, 00:16:53.420 "flush": true, 00:16:53.421 "reset": true, 00:16:53.421 "nvme_admin": false, 00:16:53.421 "nvme_io": false, 00:16:53.421 "nvme_io_md": false, 00:16:53.421 "write_zeroes": true, 00:16:53.421 "zcopy": false, 00:16:53.421 "get_zone_info": false, 00:16:53.421 "zone_management": false, 00:16:53.421 "zone_append": false, 00:16:53.421 "compare": false, 00:16:53.421 "compare_and_write": false, 00:16:53.421 "abort": false, 00:16:53.421 "seek_hole": false, 00:16:53.421 "seek_data": false, 00:16:53.421 "copy": false, 00:16:53.421 "nvme_iov_md": false 00:16:53.421 }, 00:16:53.421 "memory_domains": [ 00:16:53.421 { 00:16:53.421 "dma_device_id": "system", 00:16:53.421 "dma_device_type": 1 00:16:53.421 }, 00:16:53.421 { 00:16:53.421 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.421 "dma_device_type": 2 00:16:53.421 }, 00:16:53.421 { 00:16:53.421 "dma_device_id": "system", 00:16:53.421 "dma_device_type": 1 00:16:53.421 }, 00:16:53.421 { 00:16:53.421 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.421 "dma_device_type": 2 00:16:53.421 }, 00:16:53.421 { 00:16:53.421 "dma_device_id": "system", 00:16:53.421 "dma_device_type": 1 00:16:53.421 }, 00:16:53.421 { 00:16:53.421 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.421 "dma_device_type": 2 00:16:53.421 }, 00:16:53.421 { 00:16:53.421 "dma_device_id": "system", 00:16:53.421 "dma_device_type": 1 00:16:53.421 }, 00:16:53.421 { 00:16:53.421 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.421 "dma_device_type": 2 00:16:53.421 } 00:16:53.421 ], 00:16:53.421 "driver_specific": { 00:16:53.421 "raid": { 00:16:53.421 "uuid": "d5f10190-e8d5-40c1-b2aa-18f978f3063c", 00:16:53.421 "strip_size_kb": 64, 00:16:53.421 "state": "online", 00:16:53.421 "raid_level": "raid0", 00:16:53.421 "superblock": true, 00:16:53.421 "num_base_bdevs": 4, 00:16:53.421 "num_base_bdevs_discovered": 4, 00:16:53.421 "num_base_bdevs_operational": 4, 00:16:53.421 "base_bdevs_list": [ 00:16:53.421 { 00:16:53.421 "name": "NewBaseBdev", 00:16:53.421 "uuid": "fb161016-4668-4559-8643-90f7caa9560d", 00:16:53.421 "is_configured": true, 00:16:53.421 "data_offset": 2048, 00:16:53.421 "data_size": 63488 00:16:53.421 }, 00:16:53.421 { 00:16:53.421 "name": "BaseBdev2", 00:16:53.421 "uuid": "2e40d956-286f-4ab1-b357-ffafc39aac76", 00:16:53.421 "is_configured": true, 00:16:53.421 "data_offset": 2048, 00:16:53.421 "data_size": 63488 00:16:53.421 }, 00:16:53.421 { 00:16:53.421 "name": "BaseBdev3", 00:16:53.421 "uuid": "9bb07532-3a49-4788-9b5e-c50a1ba542c6", 00:16:53.421 "is_configured": true, 00:16:53.421 "data_offset": 2048, 00:16:53.421 "data_size": 63488 00:16:53.421 }, 00:16:53.421 { 00:16:53.421 "name": "BaseBdev4", 00:16:53.421 "uuid": "dc0226aa-d931-4e39-94a6-9dd5916c6c87", 00:16:53.421 "is_configured": true, 00:16:53.421 "data_offset": 2048, 00:16:53.421 "data_size": 63488 00:16:53.421 } 00:16:53.421 ] 00:16:53.421 } 00:16:53.421 } 00:16:53.421 }' 00:16:53.421 17:29:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:53.421 17:29:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:53.421 BaseBdev2 00:16:53.421 BaseBdev3 00:16:53.421 BaseBdev4' 00:16:53.421 17:29:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in 
$base_bdev_names 00:16:53.421 17:29:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:53.421 17:29:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:53.421 17:29:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:53.421 "name": "NewBaseBdev", 00:16:53.421 "aliases": [ 00:16:53.421 "fb161016-4668-4559-8643-90f7caa9560d" 00:16:53.421 ], 00:16:53.421 "product_name": "Malloc disk", 00:16:53.421 "block_size": 512, 00:16:53.421 "num_blocks": 65536, 00:16:53.421 "uuid": "fb161016-4668-4559-8643-90f7caa9560d", 00:16:53.421 "assigned_rate_limits": { 00:16:53.421 "rw_ios_per_sec": 0, 00:16:53.421 "rw_mbytes_per_sec": 0, 00:16:53.421 "r_mbytes_per_sec": 0, 00:16:53.421 "w_mbytes_per_sec": 0 00:16:53.421 }, 00:16:53.421 "claimed": true, 00:16:53.421 "claim_type": "exclusive_write", 00:16:53.421 "zoned": false, 00:16:53.421 "supported_io_types": { 00:16:53.421 "read": true, 00:16:53.421 "write": true, 00:16:53.421 "unmap": true, 00:16:53.421 "flush": true, 00:16:53.421 "reset": true, 00:16:53.421 "nvme_admin": false, 00:16:53.421 "nvme_io": false, 00:16:53.421 "nvme_io_md": false, 00:16:53.421 "write_zeroes": true, 00:16:53.421 "zcopy": true, 00:16:53.421 "get_zone_info": false, 00:16:53.421 "zone_management": false, 00:16:53.421 "zone_append": false, 00:16:53.421 "compare": false, 00:16:53.421 "compare_and_write": false, 00:16:53.421 "abort": true, 00:16:53.421 "seek_hole": false, 00:16:53.421 "seek_data": false, 00:16:53.421 "copy": true, 00:16:53.421 "nvme_iov_md": false 00:16:53.421 }, 00:16:53.421 "memory_domains": [ 00:16:53.421 { 00:16:53.421 "dma_device_id": "system", 00:16:53.421 "dma_device_type": 1 00:16:53.421 }, 00:16:53.421 { 00:16:53.421 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.421 "dma_device_type": 2 00:16:53.421 } 00:16:53.421 ], 00:16:53.421 "driver_specific": {} 00:16:53.421 }' 00:16:53.421 17:29:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:53.682 17:29:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:53.682 17:29:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:53.682 17:29:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:53.682 17:29:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:53.682 17:29:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:53.682 17:29:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:53.682 17:29:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:53.682 17:29:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:53.682 17:29:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:53.943 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:53.943 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:53.943 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:53.943 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:53.943 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:54.204 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:54.204 "name": "BaseBdev2", 00:16:54.204 "aliases": [ 00:16:54.204 "2e40d956-286f-4ab1-b357-ffafc39aac76" 00:16:54.204 ], 00:16:54.204 "product_name": "Malloc disk", 00:16:54.204 "block_size": 512, 00:16:54.204 "num_blocks": 65536, 00:16:54.204 "uuid": "2e40d956-286f-4ab1-b357-ffafc39aac76", 00:16:54.204 "assigned_rate_limits": { 00:16:54.204 "rw_ios_per_sec": 0, 00:16:54.204 "rw_mbytes_per_sec": 0, 00:16:54.204 "r_mbytes_per_sec": 0, 00:16:54.204 "w_mbytes_per_sec": 0 00:16:54.204 }, 00:16:54.204 "claimed": true, 00:16:54.204 "claim_type": "exclusive_write", 00:16:54.204 "zoned": false, 00:16:54.204 "supported_io_types": { 00:16:54.204 "read": true, 00:16:54.204 "write": true, 00:16:54.204 "unmap": true, 00:16:54.204 "flush": true, 00:16:54.204 "reset": true, 00:16:54.204 "nvme_admin": false, 00:16:54.204 "nvme_io": false, 00:16:54.204 "nvme_io_md": false, 00:16:54.204 "write_zeroes": true, 00:16:54.204 "zcopy": true, 00:16:54.204 "get_zone_info": false, 00:16:54.204 "zone_management": false, 00:16:54.204 "zone_append": false, 00:16:54.204 "compare": false, 00:16:54.204 "compare_and_write": false, 00:16:54.204 "abort": true, 00:16:54.204 "seek_hole": false, 00:16:54.204 "seek_data": false, 00:16:54.204 "copy": true, 00:16:54.204 "nvme_iov_md": false 00:16:54.204 }, 00:16:54.204 "memory_domains": [ 00:16:54.204 { 00:16:54.204 "dma_device_id": "system", 00:16:54.204 "dma_device_type": 1 00:16:54.204 }, 00:16:54.204 { 00:16:54.204 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.204 "dma_device_type": 2 00:16:54.204 } 00:16:54.204 ], 00:16:54.204 "driver_specific": {} 00:16:54.204 }' 00:16:54.204 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.204 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.204 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:54.204 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.204 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.204 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:54.204 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.204 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.204 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:54.464 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.464 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.464 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:54.464 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:54.464 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:54.464 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:54.758 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:54.758 "name": "BaseBdev3", 00:16:54.758 "aliases": [ 00:16:54.758 "9bb07532-3a49-4788-9b5e-c50a1ba542c6" 00:16:54.758 ], 00:16:54.758 "product_name": "Malloc disk", 00:16:54.758 "block_size": 512, 00:16:54.758 "num_blocks": 65536, 00:16:54.758 "uuid": "9bb07532-3a49-4788-9b5e-c50a1ba542c6", 00:16:54.758 "assigned_rate_limits": { 00:16:54.758 "rw_ios_per_sec": 0, 00:16:54.758 "rw_mbytes_per_sec": 0, 00:16:54.758 "r_mbytes_per_sec": 0, 00:16:54.758 "w_mbytes_per_sec": 0 00:16:54.758 }, 00:16:54.758 "claimed": true, 00:16:54.758 "claim_type": "exclusive_write", 00:16:54.758 "zoned": false, 00:16:54.758 "supported_io_types": { 00:16:54.758 "read": true, 00:16:54.758 "write": true, 00:16:54.758 "unmap": true, 00:16:54.758 "flush": true, 00:16:54.758 "reset": true, 00:16:54.758 "nvme_admin": false, 00:16:54.758 "nvme_io": false, 00:16:54.758 "nvme_io_md": false, 00:16:54.758 "write_zeroes": true, 00:16:54.758 "zcopy": true, 00:16:54.758 "get_zone_info": false, 00:16:54.758 "zone_management": false, 00:16:54.758 "zone_append": false, 00:16:54.758 "compare": false, 00:16:54.758 "compare_and_write": false, 00:16:54.758 "abort": true, 00:16:54.758 "seek_hole": false, 00:16:54.758 "seek_data": false, 00:16:54.758 "copy": true, 00:16:54.758 "nvme_iov_md": false 00:16:54.758 }, 00:16:54.758 "memory_domains": [ 00:16:54.758 { 00:16:54.758 "dma_device_id": "system", 00:16:54.758 "dma_device_type": 1 00:16:54.758 }, 00:16:54.758 { 00:16:54.758 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.758 "dma_device_type": 2 00:16:54.758 } 00:16:54.758 ], 00:16:54.758 "driver_specific": {} 00:16:54.758 }' 00:16:54.758 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.758 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.758 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:54.758 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.758 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.758 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:54.758 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.758 17:29:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.758 17:29:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:54.758 17:29:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.023 17:29:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.023 17:29:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:55.023 17:29:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:55.023 17:29:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:55.023 17:29:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:55.023 
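The per-bdev checks traced above pipe bdev_get_bdevs output through jq one field at a time. A minimal standalone sketch of that pattern, assuming the same rpc.py path and RPC socket used in this run and the 512-byte block size the trace expects, would be:

# Sketch only (not part of this run): loop over the configured base bdevs and
# assert the same properties the trace checks above via bdev_get_bdevs + jq.
rpc_py() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
for name in NewBaseBdev BaseBdev2 BaseBdev3 BaseBdev4; do
  info=$(rpc_py bdev_get_bdevs -b "$name" | jq '.[]')
  [ "$(echo "$info" | jq .block_size)" = 512 ] || exit 1      # malloc disks are 512 B blocks
  [ "$(echo "$info" | jq .md_size)" = null ] || exit 1        # no separate metadata
  [ "$(echo "$info" | jq .md_interleave)" = null ] || exit 1
  [ "$(echo "$info" | jq .dif_type)" = null ] || exit 1
done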
17:29:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:55.023 "name": "BaseBdev4", 00:16:55.023 "aliases": [ 00:16:55.023 "dc0226aa-d931-4e39-94a6-9dd5916c6c87" 00:16:55.023 ], 00:16:55.023 "product_name": "Malloc disk", 00:16:55.023 "block_size": 512, 00:16:55.023 "num_blocks": 65536, 00:16:55.023 "uuid": "dc0226aa-d931-4e39-94a6-9dd5916c6c87", 00:16:55.023 "assigned_rate_limits": { 00:16:55.023 "rw_ios_per_sec": 0, 00:16:55.023 "rw_mbytes_per_sec": 0, 00:16:55.023 "r_mbytes_per_sec": 0, 00:16:55.023 "w_mbytes_per_sec": 0 00:16:55.023 }, 00:16:55.023 "claimed": true, 00:16:55.023 "claim_type": "exclusive_write", 00:16:55.023 "zoned": false, 00:16:55.023 "supported_io_types": { 00:16:55.023 "read": true, 00:16:55.023 "write": true, 00:16:55.023 "unmap": true, 00:16:55.023 "flush": true, 00:16:55.023 "reset": true, 00:16:55.023 "nvme_admin": false, 00:16:55.023 "nvme_io": false, 00:16:55.023 "nvme_io_md": false, 00:16:55.023 "write_zeroes": true, 00:16:55.023 "zcopy": true, 00:16:55.023 "get_zone_info": false, 00:16:55.023 "zone_management": false, 00:16:55.023 "zone_append": false, 00:16:55.023 "compare": false, 00:16:55.023 "compare_and_write": false, 00:16:55.023 "abort": true, 00:16:55.023 "seek_hole": false, 00:16:55.023 "seek_data": false, 00:16:55.023 "copy": true, 00:16:55.023 "nvme_iov_md": false 00:16:55.023 }, 00:16:55.023 "memory_domains": [ 00:16:55.023 { 00:16:55.023 "dma_device_id": "system", 00:16:55.023 "dma_device_type": 1 00:16:55.023 }, 00:16:55.023 { 00:16:55.024 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:55.024 "dma_device_type": 2 00:16:55.024 } 00:16:55.024 ], 00:16:55.024 "driver_specific": {} 00:16:55.024 }' 00:16:55.024 17:29:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:55.284 17:29:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:55.284 17:29:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:55.284 17:29:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:55.284 17:29:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:55.284 17:29:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:55.284 17:29:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:55.284 17:29:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:55.545 17:29:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:55.545 17:29:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.545 17:29:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.545 17:29:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:55.545 17:29:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:55.806 [2024-07-15 17:29:06.859689] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:55.806 [2024-07-15 17:29:06.859714] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:55.806 [2024-07-15 17:29:06.859756] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: 
raid_bdev_destruct 00:16:55.806 [2024-07-15 17:29:06.859799] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:55.806 [2024-07-15 17:29:06.859805] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15a9c20 name Existed_Raid, state offline 00:16:55.806 17:29:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2812334 00:16:55.806 17:29:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2812334 ']' 00:16:55.806 17:29:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2812334 00:16:55.806 17:29:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:16:55.806 17:29:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:55.806 17:29:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2812334 00:16:55.806 17:29:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:55.806 17:29:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:55.806 17:29:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2812334' 00:16:55.806 killing process with pid 2812334 00:16:55.806 17:29:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2812334 00:16:55.806 [2024-07-15 17:29:06.924636] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:55.806 17:29:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2812334 00:16:55.806 [2024-07-15 17:29:06.945280] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:55.806 17:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:16:55.806 00:16:55.806 real 0m27.471s 00:16:55.806 user 0m51.607s 00:16:55.806 sys 0m3.955s 00:16:55.806 17:29:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:55.806 17:29:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:55.806 ************************************ 00:16:55.806 END TEST raid_state_function_test_sb 00:16:55.806 ************************************ 00:16:56.066 17:29:07 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:56.066 17:29:07 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:16:56.066 17:29:07 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:16:56.066 17:29:07 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:56.066 17:29:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:56.066 ************************************ 00:16:56.066 START TEST raid_superblock_test 00:16:56.066 ************************************ 00:16:56.066 17:29:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 4 00:16:56.066 17:29:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:16:56.066 17:29:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:16:56.066 17:29:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:16:56.066 17:29:07 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:16:56.066 17:29:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:16:56.066 17:29:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:16:56.066 17:29:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:16:56.066 17:29:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:16:56.066 17:29:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:16:56.066 17:29:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:16:56.066 17:29:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:16:56.066 17:29:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:16:56.066 17:29:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:16:56.066 17:29:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:16:56.066 17:29:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:16:56.066 17:29:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:16:56.066 17:29:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2817586 00:16:56.066 17:29:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2817586 /var/tmp/spdk-raid.sock 00:16:56.066 17:29:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:16:56.066 17:29:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2817586 ']' 00:16:56.066 17:29:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:56.066 17:29:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:56.066 17:29:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:56.066 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:56.066 17:29:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:56.066 17:29:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:56.066 [2024-07-15 17:29:07.193984] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
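For reference, a minimal sketch of how a private SPDK app instance like the one starting here can be brought up and probed over its RPC socket (paths assumed from this workspace; the harness itself uses its waitforlisten helper rather than the polling loop shown, and rpc_get_methods is used below purely as a liveness probe):

# Sketch only: start bdev_svc on a private RPC socket with raid debug logging,
# then poll that socket until it answers RPC calls.
app=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc
"$app" -r /var/tmp/spdk-raid.sock -L bdev_raid &
svc_pid=$!
until /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py \
      -s /var/tmp/spdk-raid.sock rpc_get_methods >/dev/null 2>&1; do
  sleep 0.1
done
# ... issue the raid RPCs, then tear down with: kill "$svc_pid"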
00:16:56.066 [2024-07-15 17:29:07.194030] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2817586 ] 00:16:56.066 [2024-07-15 17:29:07.281125] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:56.066 [2024-07-15 17:29:07.345477] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:56.327 [2024-07-15 17:29:07.388062] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:56.327 [2024-07-15 17:29:07.388086] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:56.897 17:29:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:56.897 17:29:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:16:56.897 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:16:56.897 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:56.897 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:16:56.897 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:16:56.897 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:16:56.897 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:56.897 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:56.897 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:56.897 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:16:57.158 malloc1 00:16:57.158 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:57.158 [2024-07-15 17:29:08.386364] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:57.158 [2024-07-15 17:29:08.386398] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:57.158 [2024-07-15 17:29:08.386409] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15e7a20 00:16:57.158 [2024-07-15 17:29:08.386415] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:57.158 [2024-07-15 17:29:08.387736] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:57.158 [2024-07-15 17:29:08.387755] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:57.158 pt1 00:16:57.158 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:57.158 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:57.158 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:16:57.158 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:16:57.158 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:16:57.158 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:57.158 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:57.158 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:57.158 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:16:57.419 malloc2 00:16:57.419 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:57.679 [2024-07-15 17:29:08.773397] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:57.679 [2024-07-15 17:29:08.773425] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:57.679 [2024-07-15 17:29:08.773437] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15e8040 00:16:57.679 [2024-07-15 17:29:08.773443] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:57.679 [2024-07-15 17:29:08.774637] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:57.679 [2024-07-15 17:29:08.774654] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:57.679 pt2 00:16:57.679 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:57.679 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:57.679 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:16:57.679 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:16:57.679 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:16:57.679 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:57.679 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:57.679 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:57.679 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:16:57.679 malloc3 00:16:57.939 17:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:57.939 [2024-07-15 17:29:09.160274] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:57.939 [2024-07-15 17:29:09.160301] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:57.939 [2024-07-15 17:29:09.160311] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15e8540 00:16:57.939 [2024-07-15 17:29:09.160317] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:57.939 [2024-07-15 17:29:09.161509] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:57.939 [2024-07-15 17:29:09.161527] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:57.939 pt3 00:16:57.939 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:57.939 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:57.939 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:16:57.939 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:16:57.939 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:16:57.939 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:57.939 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:57.939 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:57.939 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:16:58.198 malloc4 00:16:58.198 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:16:58.457 [2024-07-15 17:29:09.547175] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:16:58.458 [2024-07-15 17:29:09.547203] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:58.458 [2024-07-15 17:29:09.547212] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1795d60 00:16:58.458 [2024-07-15 17:29:09.547218] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:58.458 [2024-07-15 17:29:09.548403] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:58.458 [2024-07-15 17:29:09.548421] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:16:58.458 pt4 00:16:58.458 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:58.458 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:58.458 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:16:58.458 [2024-07-15 17:29:09.735676] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:58.458 [2024-07-15 17:29:09.736679] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:58.458 [2024-07-15 17:29:09.736725] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:58.458 [2024-07-15 17:29:09.736759] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:16:58.458 [2024-07-15 17:29:09.736892] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1792e20 00:16:58.458 [2024-07-15 17:29:09.736899] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:58.458 [2024-07-15 17:29:09.737049] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15e9000 00:16:58.458 [2024-07-15 17:29:09.737158] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1792e20 00:16:58.458 [2024-07-15 17:29:09.737163] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1792e20 00:16:58.458 [2024-07-15 17:29:09.737234] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:58.458 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:16:58.458 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:58.458 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:58.458 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:58.458 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:58.458 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:58.458 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:58.458 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:58.458 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:58.458 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:58.458 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.718 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:58.718 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:58.718 "name": "raid_bdev1", 00:16:58.718 "uuid": "9f184cdf-4428-4fa4-84ba-07c67f3719f1", 00:16:58.718 "strip_size_kb": 64, 00:16:58.718 "state": "online", 00:16:58.718 "raid_level": "raid0", 00:16:58.718 "superblock": true, 00:16:58.718 "num_base_bdevs": 4, 00:16:58.718 "num_base_bdevs_discovered": 4, 00:16:58.718 "num_base_bdevs_operational": 4, 00:16:58.718 "base_bdevs_list": [ 00:16:58.718 { 00:16:58.718 "name": "pt1", 00:16:58.718 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:58.718 "is_configured": true, 00:16:58.718 "data_offset": 2048, 00:16:58.718 "data_size": 63488 00:16:58.718 }, 00:16:58.718 { 00:16:58.718 "name": "pt2", 00:16:58.718 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:58.718 "is_configured": true, 00:16:58.718 "data_offset": 2048, 00:16:58.718 "data_size": 63488 00:16:58.718 }, 00:16:58.718 { 00:16:58.718 "name": "pt3", 00:16:58.718 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:58.718 "is_configured": true, 00:16:58.718 "data_offset": 2048, 00:16:58.718 "data_size": 63488 00:16:58.718 }, 00:16:58.718 { 00:16:58.718 "name": "pt4", 00:16:58.718 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:58.718 "is_configured": true, 00:16:58.718 "data_offset": 2048, 00:16:58.718 "data_size": 63488 00:16:58.718 } 00:16:58.718 ] 00:16:58.718 }' 00:16:58.718 17:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:58.718 17:29:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:59.289 17:29:10 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:16:59.289 17:29:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:59.289 17:29:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:59.289 17:29:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:59.289 17:29:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:59.289 17:29:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:59.289 17:29:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:59.289 17:29:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:59.569 [2024-07-15 17:29:10.674279] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:59.569 17:29:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:59.569 "name": "raid_bdev1", 00:16:59.569 "aliases": [ 00:16:59.569 "9f184cdf-4428-4fa4-84ba-07c67f3719f1" 00:16:59.569 ], 00:16:59.569 "product_name": "Raid Volume", 00:16:59.569 "block_size": 512, 00:16:59.569 "num_blocks": 253952, 00:16:59.569 "uuid": "9f184cdf-4428-4fa4-84ba-07c67f3719f1", 00:16:59.569 "assigned_rate_limits": { 00:16:59.569 "rw_ios_per_sec": 0, 00:16:59.569 "rw_mbytes_per_sec": 0, 00:16:59.569 "r_mbytes_per_sec": 0, 00:16:59.569 "w_mbytes_per_sec": 0 00:16:59.569 }, 00:16:59.569 "claimed": false, 00:16:59.569 "zoned": false, 00:16:59.569 "supported_io_types": { 00:16:59.569 "read": true, 00:16:59.569 "write": true, 00:16:59.569 "unmap": true, 00:16:59.569 "flush": true, 00:16:59.569 "reset": true, 00:16:59.569 "nvme_admin": false, 00:16:59.569 "nvme_io": false, 00:16:59.569 "nvme_io_md": false, 00:16:59.569 "write_zeroes": true, 00:16:59.569 "zcopy": false, 00:16:59.569 "get_zone_info": false, 00:16:59.569 "zone_management": false, 00:16:59.569 "zone_append": false, 00:16:59.569 "compare": false, 00:16:59.569 "compare_and_write": false, 00:16:59.569 "abort": false, 00:16:59.569 "seek_hole": false, 00:16:59.569 "seek_data": false, 00:16:59.569 "copy": false, 00:16:59.569 "nvme_iov_md": false 00:16:59.569 }, 00:16:59.569 "memory_domains": [ 00:16:59.569 { 00:16:59.569 "dma_device_id": "system", 00:16:59.569 "dma_device_type": 1 00:16:59.569 }, 00:16:59.569 { 00:16:59.569 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.569 "dma_device_type": 2 00:16:59.569 }, 00:16:59.569 { 00:16:59.569 "dma_device_id": "system", 00:16:59.569 "dma_device_type": 1 00:16:59.569 }, 00:16:59.569 { 00:16:59.569 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.569 "dma_device_type": 2 00:16:59.569 }, 00:16:59.569 { 00:16:59.569 "dma_device_id": "system", 00:16:59.569 "dma_device_type": 1 00:16:59.569 }, 00:16:59.569 { 00:16:59.569 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.569 "dma_device_type": 2 00:16:59.569 }, 00:16:59.569 { 00:16:59.569 "dma_device_id": "system", 00:16:59.569 "dma_device_type": 1 00:16:59.569 }, 00:16:59.569 { 00:16:59.569 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.569 "dma_device_type": 2 00:16:59.569 } 00:16:59.569 ], 00:16:59.569 "driver_specific": { 00:16:59.569 "raid": { 00:16:59.569 "uuid": "9f184cdf-4428-4fa4-84ba-07c67f3719f1", 00:16:59.569 "strip_size_kb": 64, 00:16:59.569 "state": "online", 00:16:59.569 "raid_level": "raid0", 00:16:59.569 "superblock": 
true, 00:16:59.569 "num_base_bdevs": 4, 00:16:59.569 "num_base_bdevs_discovered": 4, 00:16:59.569 "num_base_bdevs_operational": 4, 00:16:59.569 "base_bdevs_list": [ 00:16:59.569 { 00:16:59.569 "name": "pt1", 00:16:59.569 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:59.569 "is_configured": true, 00:16:59.569 "data_offset": 2048, 00:16:59.569 "data_size": 63488 00:16:59.569 }, 00:16:59.569 { 00:16:59.569 "name": "pt2", 00:16:59.569 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:59.569 "is_configured": true, 00:16:59.569 "data_offset": 2048, 00:16:59.569 "data_size": 63488 00:16:59.569 }, 00:16:59.569 { 00:16:59.569 "name": "pt3", 00:16:59.569 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:59.569 "is_configured": true, 00:16:59.569 "data_offset": 2048, 00:16:59.569 "data_size": 63488 00:16:59.569 }, 00:16:59.569 { 00:16:59.569 "name": "pt4", 00:16:59.569 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:59.569 "is_configured": true, 00:16:59.569 "data_offset": 2048, 00:16:59.569 "data_size": 63488 00:16:59.569 } 00:16:59.569 ] 00:16:59.569 } 00:16:59.569 } 00:16:59.569 }' 00:16:59.569 17:29:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:59.569 17:29:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:59.569 pt2 00:16:59.569 pt3 00:16:59.569 pt4' 00:16:59.569 17:29:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:59.569 17:29:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:59.569 17:29:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:59.829 17:29:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:59.829 "name": "pt1", 00:16:59.829 "aliases": [ 00:16:59.829 "00000000-0000-0000-0000-000000000001" 00:16:59.829 ], 00:16:59.829 "product_name": "passthru", 00:16:59.829 "block_size": 512, 00:16:59.829 "num_blocks": 65536, 00:16:59.829 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:59.829 "assigned_rate_limits": { 00:16:59.829 "rw_ios_per_sec": 0, 00:16:59.829 "rw_mbytes_per_sec": 0, 00:16:59.829 "r_mbytes_per_sec": 0, 00:16:59.829 "w_mbytes_per_sec": 0 00:16:59.829 }, 00:16:59.829 "claimed": true, 00:16:59.829 "claim_type": "exclusive_write", 00:16:59.829 "zoned": false, 00:16:59.829 "supported_io_types": { 00:16:59.829 "read": true, 00:16:59.829 "write": true, 00:16:59.829 "unmap": true, 00:16:59.829 "flush": true, 00:16:59.829 "reset": true, 00:16:59.829 "nvme_admin": false, 00:16:59.829 "nvme_io": false, 00:16:59.829 "nvme_io_md": false, 00:16:59.829 "write_zeroes": true, 00:16:59.829 "zcopy": true, 00:16:59.829 "get_zone_info": false, 00:16:59.829 "zone_management": false, 00:16:59.829 "zone_append": false, 00:16:59.829 "compare": false, 00:16:59.829 "compare_and_write": false, 00:16:59.829 "abort": true, 00:16:59.829 "seek_hole": false, 00:16:59.829 "seek_data": false, 00:16:59.829 "copy": true, 00:16:59.829 "nvme_iov_md": false 00:16:59.829 }, 00:16:59.829 "memory_domains": [ 00:16:59.829 { 00:16:59.829 "dma_device_id": "system", 00:16:59.829 "dma_device_type": 1 00:16:59.829 }, 00:16:59.829 { 00:16:59.829 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.829 "dma_device_type": 2 00:16:59.829 } 00:16:59.829 ], 00:16:59.829 "driver_specific": { 00:16:59.829 "passthru": 
{ 00:16:59.829 "name": "pt1", 00:16:59.829 "base_bdev_name": "malloc1" 00:16:59.829 } 00:16:59.829 } 00:16:59.829 }' 00:16:59.829 17:29:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:59.829 17:29:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:59.829 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:59.829 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:59.829 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:59.829 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:59.829 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:00.089 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:00.089 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:00.089 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:00.089 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:00.089 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:00.089 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:00.090 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:00.090 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:00.350 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:00.350 "name": "pt2", 00:17:00.350 "aliases": [ 00:17:00.350 "00000000-0000-0000-0000-000000000002" 00:17:00.350 ], 00:17:00.350 "product_name": "passthru", 00:17:00.350 "block_size": 512, 00:17:00.350 "num_blocks": 65536, 00:17:00.350 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:00.350 "assigned_rate_limits": { 00:17:00.350 "rw_ios_per_sec": 0, 00:17:00.350 "rw_mbytes_per_sec": 0, 00:17:00.350 "r_mbytes_per_sec": 0, 00:17:00.350 "w_mbytes_per_sec": 0 00:17:00.350 }, 00:17:00.350 "claimed": true, 00:17:00.350 "claim_type": "exclusive_write", 00:17:00.350 "zoned": false, 00:17:00.350 "supported_io_types": { 00:17:00.350 "read": true, 00:17:00.350 "write": true, 00:17:00.350 "unmap": true, 00:17:00.350 "flush": true, 00:17:00.350 "reset": true, 00:17:00.350 "nvme_admin": false, 00:17:00.350 "nvme_io": false, 00:17:00.350 "nvme_io_md": false, 00:17:00.350 "write_zeroes": true, 00:17:00.350 "zcopy": true, 00:17:00.350 "get_zone_info": false, 00:17:00.350 "zone_management": false, 00:17:00.350 "zone_append": false, 00:17:00.350 "compare": false, 00:17:00.350 "compare_and_write": false, 00:17:00.350 "abort": true, 00:17:00.350 "seek_hole": false, 00:17:00.350 "seek_data": false, 00:17:00.350 "copy": true, 00:17:00.350 "nvme_iov_md": false 00:17:00.350 }, 00:17:00.350 "memory_domains": [ 00:17:00.350 { 00:17:00.350 "dma_device_id": "system", 00:17:00.350 "dma_device_type": 1 00:17:00.350 }, 00:17:00.350 { 00:17:00.350 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:00.350 "dma_device_type": 2 00:17:00.350 } 00:17:00.350 ], 00:17:00.350 "driver_specific": { 00:17:00.350 "passthru": { 00:17:00.350 "name": "pt2", 00:17:00.350 "base_bdev_name": "malloc2" 00:17:00.350 } 00:17:00.350 } 00:17:00.350 }' 00:17:00.350 
17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:00.350 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:00.350 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:00.350 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:00.350 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:00.610 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:00.610 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:00.610 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:00.610 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:00.610 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:00.610 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:00.610 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:00.610 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:00.610 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:00.610 17:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:00.870 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:00.870 "name": "pt3", 00:17:00.870 "aliases": [ 00:17:00.870 "00000000-0000-0000-0000-000000000003" 00:17:00.870 ], 00:17:00.870 "product_name": "passthru", 00:17:00.870 "block_size": 512, 00:17:00.870 "num_blocks": 65536, 00:17:00.870 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:00.870 "assigned_rate_limits": { 00:17:00.870 "rw_ios_per_sec": 0, 00:17:00.870 "rw_mbytes_per_sec": 0, 00:17:00.870 "r_mbytes_per_sec": 0, 00:17:00.870 "w_mbytes_per_sec": 0 00:17:00.870 }, 00:17:00.870 "claimed": true, 00:17:00.870 "claim_type": "exclusive_write", 00:17:00.870 "zoned": false, 00:17:00.870 "supported_io_types": { 00:17:00.870 "read": true, 00:17:00.870 "write": true, 00:17:00.870 "unmap": true, 00:17:00.870 "flush": true, 00:17:00.870 "reset": true, 00:17:00.870 "nvme_admin": false, 00:17:00.870 "nvme_io": false, 00:17:00.870 "nvme_io_md": false, 00:17:00.870 "write_zeroes": true, 00:17:00.870 "zcopy": true, 00:17:00.870 "get_zone_info": false, 00:17:00.870 "zone_management": false, 00:17:00.870 "zone_append": false, 00:17:00.871 "compare": false, 00:17:00.871 "compare_and_write": false, 00:17:00.871 "abort": true, 00:17:00.871 "seek_hole": false, 00:17:00.871 "seek_data": false, 00:17:00.871 "copy": true, 00:17:00.871 "nvme_iov_md": false 00:17:00.871 }, 00:17:00.871 "memory_domains": [ 00:17:00.871 { 00:17:00.871 "dma_device_id": "system", 00:17:00.871 "dma_device_type": 1 00:17:00.871 }, 00:17:00.871 { 00:17:00.871 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:00.871 "dma_device_type": 2 00:17:00.871 } 00:17:00.871 ], 00:17:00.871 "driver_specific": { 00:17:00.871 "passthru": { 00:17:00.871 "name": "pt3", 00:17:00.871 "base_bdev_name": "malloc3" 00:17:00.871 } 00:17:00.871 } 00:17:00.871 }' 00:17:00.871 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:00.871 17:29:12 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:00.871 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:00.871 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:00.871 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:01.131 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:01.131 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:01.131 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:01.131 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:01.131 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:01.131 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:01.131 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:01.131 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:01.131 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:17:01.131 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:01.392 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:01.392 "name": "pt4", 00:17:01.392 "aliases": [ 00:17:01.392 "00000000-0000-0000-0000-000000000004" 00:17:01.392 ], 00:17:01.392 "product_name": "passthru", 00:17:01.392 "block_size": 512, 00:17:01.392 "num_blocks": 65536, 00:17:01.392 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:01.392 "assigned_rate_limits": { 00:17:01.392 "rw_ios_per_sec": 0, 00:17:01.392 "rw_mbytes_per_sec": 0, 00:17:01.392 "r_mbytes_per_sec": 0, 00:17:01.392 "w_mbytes_per_sec": 0 00:17:01.392 }, 00:17:01.392 "claimed": true, 00:17:01.392 "claim_type": "exclusive_write", 00:17:01.392 "zoned": false, 00:17:01.392 "supported_io_types": { 00:17:01.392 "read": true, 00:17:01.392 "write": true, 00:17:01.392 "unmap": true, 00:17:01.392 "flush": true, 00:17:01.392 "reset": true, 00:17:01.392 "nvme_admin": false, 00:17:01.392 "nvme_io": false, 00:17:01.392 "nvme_io_md": false, 00:17:01.392 "write_zeroes": true, 00:17:01.392 "zcopy": true, 00:17:01.392 "get_zone_info": false, 00:17:01.392 "zone_management": false, 00:17:01.392 "zone_append": false, 00:17:01.392 "compare": false, 00:17:01.392 "compare_and_write": false, 00:17:01.392 "abort": true, 00:17:01.392 "seek_hole": false, 00:17:01.392 "seek_data": false, 00:17:01.392 "copy": true, 00:17:01.392 "nvme_iov_md": false 00:17:01.392 }, 00:17:01.392 "memory_domains": [ 00:17:01.392 { 00:17:01.392 "dma_device_id": "system", 00:17:01.392 "dma_device_type": 1 00:17:01.392 }, 00:17:01.392 { 00:17:01.392 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.392 "dma_device_type": 2 00:17:01.392 } 00:17:01.392 ], 00:17:01.392 "driver_specific": { 00:17:01.392 "passthru": { 00:17:01.392 "name": "pt4", 00:17:01.392 "base_bdev_name": "malloc4" 00:17:01.392 } 00:17:01.392 } 00:17:01.392 }' 00:17:01.392 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:01.392 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:01.392 17:29:12 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:01.392 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:01.652 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:01.652 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:01.652 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:01.652 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:01.652 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:01.652 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:01.652 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:01.652 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:01.652 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:01.652 17:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:17:01.912 [2024-07-15 17:29:13.076369] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:01.912 17:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=9f184cdf-4428-4fa4-84ba-07c67f3719f1 00:17:01.912 17:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 9f184cdf-4428-4fa4-84ba-07c67f3719f1 ']' 00:17:01.912 17:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:02.171 [2024-07-15 17:29:13.268609] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:02.171 [2024-07-15 17:29:13.268624] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:02.171 [2024-07-15 17:29:13.268661] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:02.171 [2024-07-15 17:29:13.268707] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:02.171 [2024-07-15 17:29:13.268722] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1792e20 name raid_bdev1, state offline 00:17:02.171 17:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:02.171 17:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:17:02.430 17:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:17:02.430 17:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:17:02.430 17:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:02.430 17:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:02.430 17:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:02.430 17:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:02.691 17:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:02.691 17:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:02.951 17:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:02.951 17:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:17:02.951 17:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:17:02.951 17:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:17:03.211 17:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:17:03.211 17:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:17:03.211 17:29:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:17:03.211 17:29:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:17:03.211 17:29:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:03.211 17:29:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:03.211 17:29:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:03.211 17:29:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:03.211 17:29:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:03.211 17:29:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:03.211 17:29:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:03.211 17:29:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:17:03.212 17:29:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:17:03.472 [2024-07-15 17:29:14.559832] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:17:03.472 [2024-07-15 17:29:14.560900] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:17:03.472 [2024-07-15 17:29:14.560933] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 
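The check running at this point in the trace (bdev_raid.sh@456) asserts that bdev_raid_create is rejected while malloc1 through malloc4 are still claimed by the pt* passthru bdevs; the script wraps the RPC call in the NOT helper from common/autotest_common.sh. A minimal plain-shell sketch of the same assertion, reusing the rpc.py path and socket from this run (the rpc shorthand variable is ours, not part of the script):

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # Expected to fail; the trace just below shows the JSON-RPC error -17 (File exists).
    if $rpc bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1; then
        echo "bdev_raid_create unexpectedly succeeded on claimed base bdevs" >&2
        exit 1
    fi
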
00:17:03.472 [2024-07-15 17:29:14.560959] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:17:03.472 [2024-07-15 17:29:14.560991] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:17:03.472 [2024-07-15 17:29:14.561019] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:17:03.472 [2024-07-15 17:29:14.561033] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:17:03.472 [2024-07-15 17:29:14.561046] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:17:03.472 [2024-07-15 17:29:14.561056] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:03.472 [2024-07-15 17:29:14.561062] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1790c80 name raid_bdev1, state configuring 00:17:03.472 request: 00:17:03.472 { 00:17:03.472 "name": "raid_bdev1", 00:17:03.472 "raid_level": "raid0", 00:17:03.472 "base_bdevs": [ 00:17:03.472 "malloc1", 00:17:03.472 "malloc2", 00:17:03.472 "malloc3", 00:17:03.472 "malloc4" 00:17:03.472 ], 00:17:03.472 "strip_size_kb": 64, 00:17:03.472 "superblock": false, 00:17:03.472 "method": "bdev_raid_create", 00:17:03.472 "req_id": 1 00:17:03.472 } 00:17:03.472 Got JSON-RPC error response 00:17:03.472 response: 00:17:03.472 { 00:17:03.472 "code": -17, 00:17:03.472 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:17:03.472 } 00:17:03.472 17:29:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:17:03.472 17:29:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:17:03.472 17:29:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:17:03.472 17:29:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:17:03.472 17:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.472 17:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:17:03.472 17:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:17:03.472 17:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:17:03.472 17:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:03.733 [2024-07-15 17:29:14.944760] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:03.733 [2024-07-15 17:29:14.944791] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:03.733 [2024-07-15 17:29:14.944802] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1796a20 00:17:03.733 [2024-07-15 17:29:14.944808] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:03.733 [2024-07-15 17:29:14.946082] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:03.733 [2024-07-15 17:29:14.946101] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:03.733 [2024-07-15 
17:29:14.946148] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:03.733 [2024-07-15 17:29:14.946167] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:03.733 pt1 00:17:03.733 17:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:17:03.733 17:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:03.733 17:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:03.733 17:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:03.733 17:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:03.733 17:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:03.733 17:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:03.733 17:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:03.733 17:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:03.733 17:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:03.733 17:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:03.733 17:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.994 17:29:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:03.994 "name": "raid_bdev1", 00:17:03.994 "uuid": "9f184cdf-4428-4fa4-84ba-07c67f3719f1", 00:17:03.994 "strip_size_kb": 64, 00:17:03.994 "state": "configuring", 00:17:03.994 "raid_level": "raid0", 00:17:03.994 "superblock": true, 00:17:03.994 "num_base_bdevs": 4, 00:17:03.994 "num_base_bdevs_discovered": 1, 00:17:03.994 "num_base_bdevs_operational": 4, 00:17:03.994 "base_bdevs_list": [ 00:17:03.994 { 00:17:03.994 "name": "pt1", 00:17:03.994 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:03.994 "is_configured": true, 00:17:03.994 "data_offset": 2048, 00:17:03.994 "data_size": 63488 00:17:03.994 }, 00:17:03.994 { 00:17:03.994 "name": null, 00:17:03.994 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:03.994 "is_configured": false, 00:17:03.994 "data_offset": 2048, 00:17:03.994 "data_size": 63488 00:17:03.994 }, 00:17:03.994 { 00:17:03.994 "name": null, 00:17:03.994 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:03.994 "is_configured": false, 00:17:03.994 "data_offset": 2048, 00:17:03.994 "data_size": 63488 00:17:03.994 }, 00:17:03.994 { 00:17:03.994 "name": null, 00:17:03.994 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:03.994 "is_configured": false, 00:17:03.994 "data_offset": 2048, 00:17:03.994 "data_size": 63488 00:17:03.994 } 00:17:03.994 ] 00:17:03.994 }' 00:17:03.994 17:29:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:03.994 17:29:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:04.564 17:29:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:17:04.564 17:29:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:04.825 [2024-07-15 17:29:15.879119] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:04.825 [2024-07-15 17:29:15.879145] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:04.825 [2024-07-15 17:29:15.879157] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1795b20 00:17:04.825 [2024-07-15 17:29:15.879163] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:04.825 [2024-07-15 17:29:15.879415] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:04.825 [2024-07-15 17:29:15.879425] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:04.825 [2024-07-15 17:29:15.879464] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:04.825 [2024-07-15 17:29:15.879477] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:04.825 pt2 00:17:04.825 17:29:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:04.825 [2024-07-15 17:29:16.067613] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:17:04.826 17:29:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:17:04.826 17:29:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:04.826 17:29:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:04.826 17:29:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:04.826 17:29:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:04.826 17:29:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:04.826 17:29:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:04.826 17:29:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:04.826 17:29:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:04.826 17:29:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:04.826 17:29:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:04.826 17:29:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:05.086 17:29:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:05.086 "name": "raid_bdev1", 00:17:05.086 "uuid": "9f184cdf-4428-4fa4-84ba-07c67f3719f1", 00:17:05.086 "strip_size_kb": 64, 00:17:05.086 "state": "configuring", 00:17:05.086 "raid_level": "raid0", 00:17:05.086 "superblock": true, 00:17:05.086 "num_base_bdevs": 4, 00:17:05.086 "num_base_bdevs_discovered": 1, 00:17:05.086 "num_base_bdevs_operational": 4, 00:17:05.086 "base_bdevs_list": [ 00:17:05.086 { 00:17:05.086 "name": "pt1", 00:17:05.086 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:05.086 "is_configured": true, 00:17:05.086 "data_offset": 2048, 00:17:05.086 "data_size": 63488 00:17:05.086 }, 00:17:05.086 { 
00:17:05.086 "name": null, 00:17:05.086 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:05.086 "is_configured": false, 00:17:05.086 "data_offset": 2048, 00:17:05.086 "data_size": 63488 00:17:05.086 }, 00:17:05.086 { 00:17:05.086 "name": null, 00:17:05.086 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:05.086 "is_configured": false, 00:17:05.086 "data_offset": 2048, 00:17:05.086 "data_size": 63488 00:17:05.086 }, 00:17:05.086 { 00:17:05.086 "name": null, 00:17:05.086 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:05.086 "is_configured": false, 00:17:05.086 "data_offset": 2048, 00:17:05.086 "data_size": 63488 00:17:05.086 } 00:17:05.086 ] 00:17:05.086 }' 00:17:05.086 17:29:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:05.086 17:29:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:05.658 17:29:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:17:05.658 17:29:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:05.658 17:29:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:05.920 [2024-07-15 17:29:16.993952] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:05.920 [2024-07-15 17:29:16.993990] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:05.920 [2024-07-15 17:29:16.994001] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1797570 00:17:05.920 [2024-07-15 17:29:16.994008] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:05.920 [2024-07-15 17:29:16.994271] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:05.920 [2024-07-15 17:29:16.994282] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:05.920 [2024-07-15 17:29:16.994327] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:05.920 [2024-07-15 17:29:16.994339] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:05.920 pt2 00:17:05.920 17:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:05.920 17:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:05.920 17:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:05.920 [2024-07-15 17:29:17.190445] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:05.920 [2024-07-15 17:29:17.190463] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:05.920 [2024-07-15 17:29:17.190471] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17920d0 00:17:05.920 [2024-07-15 17:29:17.190476] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:05.920 [2024-07-15 17:29:17.190692] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:05.920 [2024-07-15 17:29:17.190702] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:05.920 [2024-07-15 17:29:17.190741] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:05.920 [2024-07-15 17:29:17.190752] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:05.920 pt3 00:17:05.920 17:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:05.920 17:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:05.920 17:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:17:06.181 [2024-07-15 17:29:17.378925] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:17:06.181 [2024-07-15 17:29:17.378945] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:06.181 [2024-07-15 17:29:17.378953] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15e6f90 00:17:06.181 [2024-07-15 17:29:17.378959] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:06.181 [2024-07-15 17:29:17.379166] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:06.181 [2024-07-15 17:29:17.379176] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:17:06.181 [2024-07-15 17:29:17.379209] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:17:06.181 [2024-07-15 17:29:17.379220] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:17:06.181 [2024-07-15 17:29:17.379309] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1797960 00:17:06.181 [2024-07-15 17:29:17.379315] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:06.181 [2024-07-15 17:29:17.379458] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1792a70 00:17:06.181 [2024-07-15 17:29:17.379558] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1797960 00:17:06.181 [2024-07-15 17:29:17.379563] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1797960 00:17:06.181 [2024-07-15 17:29:17.379635] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:06.181 pt4 00:17:06.181 17:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:06.181 17:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:06.181 17:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:17:06.181 17:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:06.181 17:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:06.181 17:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:06.181 17:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:06.181 17:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:06.181 17:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:06.181 17:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:06.181 17:29:17 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:06.181 17:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:06.181 17:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:06.181 17:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:06.441 17:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:06.441 "name": "raid_bdev1", 00:17:06.441 "uuid": "9f184cdf-4428-4fa4-84ba-07c67f3719f1", 00:17:06.441 "strip_size_kb": 64, 00:17:06.441 "state": "online", 00:17:06.441 "raid_level": "raid0", 00:17:06.441 "superblock": true, 00:17:06.441 "num_base_bdevs": 4, 00:17:06.441 "num_base_bdevs_discovered": 4, 00:17:06.441 "num_base_bdevs_operational": 4, 00:17:06.441 "base_bdevs_list": [ 00:17:06.441 { 00:17:06.441 "name": "pt1", 00:17:06.441 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:06.441 "is_configured": true, 00:17:06.441 "data_offset": 2048, 00:17:06.442 "data_size": 63488 00:17:06.442 }, 00:17:06.442 { 00:17:06.442 "name": "pt2", 00:17:06.442 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:06.442 "is_configured": true, 00:17:06.442 "data_offset": 2048, 00:17:06.442 "data_size": 63488 00:17:06.442 }, 00:17:06.442 { 00:17:06.442 "name": "pt3", 00:17:06.442 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:06.442 "is_configured": true, 00:17:06.442 "data_offset": 2048, 00:17:06.442 "data_size": 63488 00:17:06.442 }, 00:17:06.442 { 00:17:06.442 "name": "pt4", 00:17:06.442 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:06.442 "is_configured": true, 00:17:06.442 "data_offset": 2048, 00:17:06.442 "data_size": 63488 00:17:06.442 } 00:17:06.442 ] 00:17:06.442 }' 00:17:06.442 17:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:06.442 17:29:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:07.011 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:17:07.011 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:07.011 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:07.011 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:07.011 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:07.011 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:07.011 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:07.011 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:07.272 [2024-07-15 17:29:18.329583] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:07.272 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:07.272 "name": "raid_bdev1", 00:17:07.272 "aliases": [ 00:17:07.272 "9f184cdf-4428-4fa4-84ba-07c67f3719f1" 00:17:07.272 ], 00:17:07.272 "product_name": "Raid Volume", 00:17:07.272 "block_size": 512, 00:17:07.272 "num_blocks": 253952, 00:17:07.272 "uuid": 
"9f184cdf-4428-4fa4-84ba-07c67f3719f1", 00:17:07.272 "assigned_rate_limits": { 00:17:07.272 "rw_ios_per_sec": 0, 00:17:07.272 "rw_mbytes_per_sec": 0, 00:17:07.272 "r_mbytes_per_sec": 0, 00:17:07.272 "w_mbytes_per_sec": 0 00:17:07.272 }, 00:17:07.272 "claimed": false, 00:17:07.272 "zoned": false, 00:17:07.272 "supported_io_types": { 00:17:07.272 "read": true, 00:17:07.272 "write": true, 00:17:07.272 "unmap": true, 00:17:07.272 "flush": true, 00:17:07.272 "reset": true, 00:17:07.272 "nvme_admin": false, 00:17:07.272 "nvme_io": false, 00:17:07.272 "nvme_io_md": false, 00:17:07.272 "write_zeroes": true, 00:17:07.272 "zcopy": false, 00:17:07.272 "get_zone_info": false, 00:17:07.272 "zone_management": false, 00:17:07.272 "zone_append": false, 00:17:07.272 "compare": false, 00:17:07.272 "compare_and_write": false, 00:17:07.272 "abort": false, 00:17:07.272 "seek_hole": false, 00:17:07.272 "seek_data": false, 00:17:07.272 "copy": false, 00:17:07.272 "nvme_iov_md": false 00:17:07.272 }, 00:17:07.272 "memory_domains": [ 00:17:07.272 { 00:17:07.272 "dma_device_id": "system", 00:17:07.272 "dma_device_type": 1 00:17:07.272 }, 00:17:07.272 { 00:17:07.272 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.272 "dma_device_type": 2 00:17:07.272 }, 00:17:07.272 { 00:17:07.272 "dma_device_id": "system", 00:17:07.272 "dma_device_type": 1 00:17:07.272 }, 00:17:07.272 { 00:17:07.272 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.272 "dma_device_type": 2 00:17:07.272 }, 00:17:07.272 { 00:17:07.272 "dma_device_id": "system", 00:17:07.272 "dma_device_type": 1 00:17:07.272 }, 00:17:07.272 { 00:17:07.272 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.272 "dma_device_type": 2 00:17:07.272 }, 00:17:07.272 { 00:17:07.272 "dma_device_id": "system", 00:17:07.272 "dma_device_type": 1 00:17:07.272 }, 00:17:07.272 { 00:17:07.272 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.272 "dma_device_type": 2 00:17:07.272 } 00:17:07.272 ], 00:17:07.272 "driver_specific": { 00:17:07.272 "raid": { 00:17:07.272 "uuid": "9f184cdf-4428-4fa4-84ba-07c67f3719f1", 00:17:07.272 "strip_size_kb": 64, 00:17:07.272 "state": "online", 00:17:07.272 "raid_level": "raid0", 00:17:07.272 "superblock": true, 00:17:07.272 "num_base_bdevs": 4, 00:17:07.272 "num_base_bdevs_discovered": 4, 00:17:07.272 "num_base_bdevs_operational": 4, 00:17:07.272 "base_bdevs_list": [ 00:17:07.272 { 00:17:07.272 "name": "pt1", 00:17:07.272 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:07.272 "is_configured": true, 00:17:07.272 "data_offset": 2048, 00:17:07.272 "data_size": 63488 00:17:07.272 }, 00:17:07.272 { 00:17:07.272 "name": "pt2", 00:17:07.272 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:07.272 "is_configured": true, 00:17:07.272 "data_offset": 2048, 00:17:07.272 "data_size": 63488 00:17:07.272 }, 00:17:07.272 { 00:17:07.272 "name": "pt3", 00:17:07.272 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:07.272 "is_configured": true, 00:17:07.272 "data_offset": 2048, 00:17:07.272 "data_size": 63488 00:17:07.272 }, 00:17:07.272 { 00:17:07.272 "name": "pt4", 00:17:07.272 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:07.272 "is_configured": true, 00:17:07.272 "data_offset": 2048, 00:17:07.272 "data_size": 63488 00:17:07.272 } 00:17:07.272 ] 00:17:07.272 } 00:17:07.272 } 00:17:07.272 }' 00:17:07.272 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:07.272 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='pt1 00:17:07.272 pt2 00:17:07.272 pt3 00:17:07.272 pt4' 00:17:07.272 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:07.272 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:07.272 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:07.532 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:07.532 "name": "pt1", 00:17:07.532 "aliases": [ 00:17:07.532 "00000000-0000-0000-0000-000000000001" 00:17:07.532 ], 00:17:07.532 "product_name": "passthru", 00:17:07.532 "block_size": 512, 00:17:07.532 "num_blocks": 65536, 00:17:07.532 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:07.532 "assigned_rate_limits": { 00:17:07.532 "rw_ios_per_sec": 0, 00:17:07.532 "rw_mbytes_per_sec": 0, 00:17:07.532 "r_mbytes_per_sec": 0, 00:17:07.532 "w_mbytes_per_sec": 0 00:17:07.532 }, 00:17:07.532 "claimed": true, 00:17:07.532 "claim_type": "exclusive_write", 00:17:07.532 "zoned": false, 00:17:07.532 "supported_io_types": { 00:17:07.532 "read": true, 00:17:07.532 "write": true, 00:17:07.532 "unmap": true, 00:17:07.532 "flush": true, 00:17:07.532 "reset": true, 00:17:07.532 "nvme_admin": false, 00:17:07.532 "nvme_io": false, 00:17:07.532 "nvme_io_md": false, 00:17:07.532 "write_zeroes": true, 00:17:07.532 "zcopy": true, 00:17:07.532 "get_zone_info": false, 00:17:07.532 "zone_management": false, 00:17:07.532 "zone_append": false, 00:17:07.532 "compare": false, 00:17:07.532 "compare_and_write": false, 00:17:07.532 "abort": true, 00:17:07.532 "seek_hole": false, 00:17:07.532 "seek_data": false, 00:17:07.532 "copy": true, 00:17:07.532 "nvme_iov_md": false 00:17:07.532 }, 00:17:07.532 "memory_domains": [ 00:17:07.532 { 00:17:07.532 "dma_device_id": "system", 00:17:07.532 "dma_device_type": 1 00:17:07.532 }, 00:17:07.532 { 00:17:07.532 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.532 "dma_device_type": 2 00:17:07.532 } 00:17:07.532 ], 00:17:07.532 "driver_specific": { 00:17:07.532 "passthru": { 00:17:07.532 "name": "pt1", 00:17:07.532 "base_bdev_name": "malloc1" 00:17:07.532 } 00:17:07.532 } 00:17:07.532 }' 00:17:07.532 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:07.532 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:07.532 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:07.532 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:07.532 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:07.532 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:07.532 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:07.532 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:07.532 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:07.532 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:07.793 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:07.793 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:07.793 17:29:18 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:07.793 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:07.793 17:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:07.793 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:07.793 "name": "pt2", 00:17:07.793 "aliases": [ 00:17:07.793 "00000000-0000-0000-0000-000000000002" 00:17:07.793 ], 00:17:07.793 "product_name": "passthru", 00:17:07.793 "block_size": 512, 00:17:07.793 "num_blocks": 65536, 00:17:07.793 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:07.793 "assigned_rate_limits": { 00:17:07.793 "rw_ios_per_sec": 0, 00:17:07.793 "rw_mbytes_per_sec": 0, 00:17:07.793 "r_mbytes_per_sec": 0, 00:17:07.793 "w_mbytes_per_sec": 0 00:17:07.793 }, 00:17:07.793 "claimed": true, 00:17:07.793 "claim_type": "exclusive_write", 00:17:07.793 "zoned": false, 00:17:07.793 "supported_io_types": { 00:17:07.793 "read": true, 00:17:07.793 "write": true, 00:17:07.793 "unmap": true, 00:17:07.793 "flush": true, 00:17:07.793 "reset": true, 00:17:07.793 "nvme_admin": false, 00:17:07.793 "nvme_io": false, 00:17:07.793 "nvme_io_md": false, 00:17:07.793 "write_zeroes": true, 00:17:07.793 "zcopy": true, 00:17:07.793 "get_zone_info": false, 00:17:07.793 "zone_management": false, 00:17:07.793 "zone_append": false, 00:17:07.793 "compare": false, 00:17:07.793 "compare_and_write": false, 00:17:07.793 "abort": true, 00:17:07.793 "seek_hole": false, 00:17:07.793 "seek_data": false, 00:17:07.793 "copy": true, 00:17:07.793 "nvme_iov_md": false 00:17:07.793 }, 00:17:07.793 "memory_domains": [ 00:17:07.793 { 00:17:07.793 "dma_device_id": "system", 00:17:07.793 "dma_device_type": 1 00:17:07.793 }, 00:17:07.793 { 00:17:07.793 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.793 "dma_device_type": 2 00:17:07.793 } 00:17:07.793 ], 00:17:07.793 "driver_specific": { 00:17:07.793 "passthru": { 00:17:07.793 "name": "pt2", 00:17:07.793 "base_bdev_name": "malloc2" 00:17:07.793 } 00:17:07.793 } 00:17:07.793 }' 00:17:07.793 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:08.053 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:08.053 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:08.053 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:08.053 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:08.053 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:08.053 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:08.053 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:08.053 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:08.053 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:08.313 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:08.313 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:08.313 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:08.313 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:08.313 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:08.313 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:08.313 "name": "pt3", 00:17:08.313 "aliases": [ 00:17:08.313 "00000000-0000-0000-0000-000000000003" 00:17:08.313 ], 00:17:08.313 "product_name": "passthru", 00:17:08.313 "block_size": 512, 00:17:08.313 "num_blocks": 65536, 00:17:08.313 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:08.313 "assigned_rate_limits": { 00:17:08.313 "rw_ios_per_sec": 0, 00:17:08.313 "rw_mbytes_per_sec": 0, 00:17:08.313 "r_mbytes_per_sec": 0, 00:17:08.313 "w_mbytes_per_sec": 0 00:17:08.313 }, 00:17:08.313 "claimed": true, 00:17:08.313 "claim_type": "exclusive_write", 00:17:08.313 "zoned": false, 00:17:08.313 "supported_io_types": { 00:17:08.313 "read": true, 00:17:08.313 "write": true, 00:17:08.313 "unmap": true, 00:17:08.313 "flush": true, 00:17:08.313 "reset": true, 00:17:08.313 "nvme_admin": false, 00:17:08.313 "nvme_io": false, 00:17:08.313 "nvme_io_md": false, 00:17:08.313 "write_zeroes": true, 00:17:08.313 "zcopy": true, 00:17:08.313 "get_zone_info": false, 00:17:08.313 "zone_management": false, 00:17:08.313 "zone_append": false, 00:17:08.313 "compare": false, 00:17:08.313 "compare_and_write": false, 00:17:08.313 "abort": true, 00:17:08.313 "seek_hole": false, 00:17:08.313 "seek_data": false, 00:17:08.313 "copy": true, 00:17:08.313 "nvme_iov_md": false 00:17:08.313 }, 00:17:08.313 "memory_domains": [ 00:17:08.313 { 00:17:08.313 "dma_device_id": "system", 00:17:08.313 "dma_device_type": 1 00:17:08.313 }, 00:17:08.313 { 00:17:08.313 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:08.313 "dma_device_type": 2 00:17:08.313 } 00:17:08.313 ], 00:17:08.313 "driver_specific": { 00:17:08.313 "passthru": { 00:17:08.313 "name": "pt3", 00:17:08.313 "base_bdev_name": "malloc3" 00:17:08.313 } 00:17:08.313 } 00:17:08.313 }' 00:17:08.313 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:08.574 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:08.574 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:08.574 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:08.574 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:08.574 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:08.574 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:08.574 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:08.574 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:08.574 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:08.833 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:08.833 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:08.833 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:08.833 17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:17:08.833 
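The @200 through @208 sequence repeated for pt1 through pt4 in this trace pulls the raid bdev's JSON, extracts the names of the configured base bdevs, and then checks a few fields of each passthru bdev. In the trace jq runs twice per field (two extractions are compared); the condensed sketch below instead hardcodes the values observed in this run, as seen in the [[ 512 == 512 ]] and [[ null == null ]] lines, and reuses the rpc.py path and socket from this job:

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    raid_info=$($rpc bdev_get_bdevs -b raid_bdev1 | jq '.[]')
    names=$(jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' <<< "$raid_info")
    for name in $names; do
        info=$($rpc bdev_get_bdevs -b "$name" | jq '.[]')
        [[ $(jq .block_size <<< "$info") == 512 ]]      # passthru bdevs sit on 512-byte malloc bdevs
        [[ $(jq .md_size <<< "$info") == null ]]        # no separate metadata
        [[ $(jq .md_interleave <<< "$info") == null ]]
        [[ $(jq .dif_type <<< "$info") == null ]]
    done
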
17:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:08.833 17:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:08.833 "name": "pt4", 00:17:08.833 "aliases": [ 00:17:08.833 "00000000-0000-0000-0000-000000000004" 00:17:08.833 ], 00:17:08.833 "product_name": "passthru", 00:17:08.833 "block_size": 512, 00:17:08.833 "num_blocks": 65536, 00:17:08.833 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:08.833 "assigned_rate_limits": { 00:17:08.833 "rw_ios_per_sec": 0, 00:17:08.833 "rw_mbytes_per_sec": 0, 00:17:08.833 "r_mbytes_per_sec": 0, 00:17:08.833 "w_mbytes_per_sec": 0 00:17:08.833 }, 00:17:08.833 "claimed": true, 00:17:08.833 "claim_type": "exclusive_write", 00:17:08.833 "zoned": false, 00:17:08.833 "supported_io_types": { 00:17:08.833 "read": true, 00:17:08.833 "write": true, 00:17:08.833 "unmap": true, 00:17:08.833 "flush": true, 00:17:08.833 "reset": true, 00:17:08.833 "nvme_admin": false, 00:17:08.833 "nvme_io": false, 00:17:08.833 "nvme_io_md": false, 00:17:08.833 "write_zeroes": true, 00:17:08.833 "zcopy": true, 00:17:08.833 "get_zone_info": false, 00:17:08.833 "zone_management": false, 00:17:08.833 "zone_append": false, 00:17:08.833 "compare": false, 00:17:08.833 "compare_and_write": false, 00:17:08.833 "abort": true, 00:17:08.833 "seek_hole": false, 00:17:08.833 "seek_data": false, 00:17:08.833 "copy": true, 00:17:08.833 "nvme_iov_md": false 00:17:08.833 }, 00:17:08.833 "memory_domains": [ 00:17:08.833 { 00:17:08.833 "dma_device_id": "system", 00:17:08.833 "dma_device_type": 1 00:17:08.833 }, 00:17:08.833 { 00:17:08.833 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:08.833 "dma_device_type": 2 00:17:08.833 } 00:17:08.833 ], 00:17:08.833 "driver_specific": { 00:17:08.833 "passthru": { 00:17:08.833 "name": "pt4", 00:17:08.833 "base_bdev_name": "malloc4" 00:17:08.833 } 00:17:08.833 } 00:17:08.833 }' 00:17:08.833 17:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:09.092 17:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:09.092 17:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:09.092 17:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:09.092 17:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:09.092 17:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:09.092 17:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:09.092 17:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:09.092 17:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:09.092 17:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:09.351 17:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:09.351 17:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:09.351 17:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:09.351 17:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:17:09.610 [2024-07-15 17:29:20.651452] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:09.610 17:29:20 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 9f184cdf-4428-4fa4-84ba-07c67f3719f1 '!=' 9f184cdf-4428-4fa4-84ba-07c67f3719f1 ']' 00:17:09.610 17:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:17:09.610 17:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:09.610 17:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:09.610 17:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2817586 00:17:09.610 17:29:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2817586 ']' 00:17:09.610 17:29:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2817586 00:17:09.610 17:29:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:17:09.610 17:29:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:09.610 17:29:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2817586 00:17:09.610 17:29:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:09.610 17:29:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:09.610 17:29:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2817586' 00:17:09.610 killing process with pid 2817586 00:17:09.610 17:29:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2817586 00:17:09.610 [2024-07-15 17:29:20.724727] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:09.610 [2024-07-15 17:29:20.724775] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:09.610 [2024-07-15 17:29:20.724824] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:09.610 [2024-07-15 17:29:20.724832] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1797960 name raid_bdev1, state offline 00:17:09.610 17:29:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2817586 00:17:09.610 [2024-07-15 17:29:20.745456] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:09.610 17:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:17:09.610 00:17:09.610 real 0m13.729s 00:17:09.610 user 0m25.316s 00:17:09.610 sys 0m1.959s 00:17:09.610 17:29:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:09.610 17:29:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:09.610 ************************************ 00:17:09.610 END TEST raid_superblock_test 00:17:09.610 ************************************ 00:17:09.610 17:29:20 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:09.870 17:29:20 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:17:09.870 17:29:20 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:09.870 17:29:20 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:09.870 17:29:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:09.870 ************************************ 00:17:09.870 START TEST raid_read_error_test 00:17:09.870 ************************************ 00:17:09.870 17:29:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- 
# raid_io_error_test raid0 4 read 00:17:09.870 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:17:09.870 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:17:09.870 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:17:09.870 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:09.870 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:09.870 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:09.870 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:09.870 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:09.870 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:09.870 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.GoWYC1O5gu 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2820091 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2820091 /var/tmp/spdk-raid.sock 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:09.871 17:29:20 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2820091 ']' 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:09.871 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:09.871 17:29:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:09.871 [2024-07-15 17:29:21.014109] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:17:09.871 [2024-07-15 17:29:21.014168] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2820091 ] 00:17:09.871 [2024-07-15 17:29:21.106487] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:10.131 [2024-07-15 17:29:21.183510] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:10.131 [2024-07-15 17:29:21.235233] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:10.131 [2024-07-15 17:29:21.235261] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:10.733 17:29:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:10.733 17:29:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:10.733 17:29:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:10.733 17:29:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:11.005 BaseBdev1_malloc 00:17:11.005 17:29:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:11.005 true 00:17:11.005 17:29:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:11.275 [2024-07-15 17:29:22.431069] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:11.275 [2024-07-15 17:29:22.431103] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:11.275 [2024-07-15 17:29:22.431115] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1facb50 00:17:11.275 [2024-07-15 17:29:22.431121] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:11.275 [2024-07-15 17:29:22.432407] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:11.275 [2024-07-15 17:29:22.432427] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:11.275 BaseBdev1 00:17:11.275 17:29:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # 
for bdev in "${base_bdevs[@]}" 00:17:11.275 17:29:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:11.534 BaseBdev2_malloc 00:17:11.534 17:29:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:11.534 true 00:17:11.793 17:29:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:11.793 [2024-07-15 17:29:23.010180] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:11.793 [2024-07-15 17:29:23.010211] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:11.793 [2024-07-15 17:29:23.010226] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f90ea0 00:17:11.793 [2024-07-15 17:29:23.010233] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:11.793 [2024-07-15 17:29:23.011390] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:11.793 [2024-07-15 17:29:23.011410] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:11.793 BaseBdev2 00:17:11.793 17:29:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:11.793 17:29:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:12.052 BaseBdev3_malloc 00:17:12.052 17:29:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:12.311 true 00:17:12.311 17:29:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:12.311 [2024-07-15 17:29:23.565346] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:12.311 [2024-07-15 17:29:23.565372] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:12.311 [2024-07-15 17:29:23.565382] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f94fb0 00:17:12.311 [2024-07-15 17:29:23.565389] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:12.311 [2024-07-15 17:29:23.566529] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:12.311 [2024-07-15 17:29:23.566548] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:12.311 BaseBdev3 00:17:12.311 17:29:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:12.311 17:29:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:17:12.570 BaseBdev4_malloc 00:17:12.570 17:29:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:17:12.830 true 00:17:12.830 17:29:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:17:13.090 [2024-07-15 17:29:24.128379] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:17:13.090 [2024-07-15 17:29:24.128407] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:13.090 [2024-07-15 17:29:24.128417] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f96980 00:17:13.090 [2024-07-15 17:29:24.128424] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:13.090 [2024-07-15 17:29:24.129581] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:13.090 [2024-07-15 17:29:24.129600] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:17:13.090 BaseBdev4 00:17:13.090 17:29:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:17:13.090 [2024-07-15 17:29:24.320885] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:13.090 [2024-07-15 17:29:24.321862] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:13.090 [2024-07-15 17:29:24.321913] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:13.090 [2024-07-15 17:29:24.321958] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:13.090 [2024-07-15 17:29:24.322133] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f964e0 00:17:13.090 [2024-07-15 17:29:24.322144] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:13.090 [2024-07-15 17:29:24.322284] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1df8210 00:17:13.090 [2024-07-15 17:29:24.322398] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f964e0 00:17:13.090 [2024-07-15 17:29:24.322403] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f964e0 00:17:13.090 [2024-07-15 17:29:24.322477] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:13.090 17:29:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:17:13.090 17:29:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:13.090 17:29:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:13.090 17:29:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:13.090 17:29:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:13.090 17:29:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:13.090 17:29:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:13.090 17:29:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:17:13.090 17:29:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:13.090 17:29:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:13.090 17:29:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:13.090 17:29:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:13.350 17:29:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:13.350 "name": "raid_bdev1", 00:17:13.350 "uuid": "a76a6997-56e1-448e-a5d6-6b987e3d6a9f", 00:17:13.350 "strip_size_kb": 64, 00:17:13.350 "state": "online", 00:17:13.350 "raid_level": "raid0", 00:17:13.350 "superblock": true, 00:17:13.350 "num_base_bdevs": 4, 00:17:13.350 "num_base_bdevs_discovered": 4, 00:17:13.350 "num_base_bdevs_operational": 4, 00:17:13.350 "base_bdevs_list": [ 00:17:13.350 { 00:17:13.350 "name": "BaseBdev1", 00:17:13.350 "uuid": "cc5c69b8-a54a-5492-98da-3d983adfacc7", 00:17:13.350 "is_configured": true, 00:17:13.350 "data_offset": 2048, 00:17:13.350 "data_size": 63488 00:17:13.350 }, 00:17:13.350 { 00:17:13.350 "name": "BaseBdev2", 00:17:13.350 "uuid": "988b3b65-3243-56f4-8b0f-bf1ab9126868", 00:17:13.350 "is_configured": true, 00:17:13.350 "data_offset": 2048, 00:17:13.350 "data_size": 63488 00:17:13.350 }, 00:17:13.350 { 00:17:13.350 "name": "BaseBdev3", 00:17:13.350 "uuid": "0d8339d0-6063-5b84-9052-350688505e2f", 00:17:13.350 "is_configured": true, 00:17:13.350 "data_offset": 2048, 00:17:13.350 "data_size": 63488 00:17:13.350 }, 00:17:13.350 { 00:17:13.350 "name": "BaseBdev4", 00:17:13.350 "uuid": "a90773a1-3634-59be-a21f-5aa36c80c893", 00:17:13.350 "is_configured": true, 00:17:13.350 "data_offset": 2048, 00:17:13.350 "data_size": 63488 00:17:13.350 } 00:17:13.350 ] 00:17:13.350 }' 00:17:13.350 17:29:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:13.350 17:29:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:13.918 17:29:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:13.918 17:29:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:13.918 [2024-07-15 17:29:25.139173] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1df8450 00:17:14.856 17:29:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:17:15.115 17:29:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:15.115 17:29:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:17:15.115 17:29:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:17:15.115 17:29:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:17:15.115 17:29:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:15.115 17:29:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:15.115 17:29:26 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:15.115 17:29:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:15.115 17:29:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:15.115 17:29:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:15.115 17:29:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:15.115 17:29:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:15.115 17:29:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:15.115 17:29:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:15.115 17:29:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:15.375 17:29:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:15.375 "name": "raid_bdev1", 00:17:15.375 "uuid": "a76a6997-56e1-448e-a5d6-6b987e3d6a9f", 00:17:15.375 "strip_size_kb": 64, 00:17:15.375 "state": "online", 00:17:15.375 "raid_level": "raid0", 00:17:15.375 "superblock": true, 00:17:15.375 "num_base_bdevs": 4, 00:17:15.375 "num_base_bdevs_discovered": 4, 00:17:15.375 "num_base_bdevs_operational": 4, 00:17:15.375 "base_bdevs_list": [ 00:17:15.375 { 00:17:15.375 "name": "BaseBdev1", 00:17:15.375 "uuid": "cc5c69b8-a54a-5492-98da-3d983adfacc7", 00:17:15.375 "is_configured": true, 00:17:15.375 "data_offset": 2048, 00:17:15.375 "data_size": 63488 00:17:15.375 }, 00:17:15.375 { 00:17:15.375 "name": "BaseBdev2", 00:17:15.375 "uuid": "988b3b65-3243-56f4-8b0f-bf1ab9126868", 00:17:15.375 "is_configured": true, 00:17:15.375 "data_offset": 2048, 00:17:15.375 "data_size": 63488 00:17:15.375 }, 00:17:15.375 { 00:17:15.375 "name": "BaseBdev3", 00:17:15.375 "uuid": "0d8339d0-6063-5b84-9052-350688505e2f", 00:17:15.375 "is_configured": true, 00:17:15.375 "data_offset": 2048, 00:17:15.375 "data_size": 63488 00:17:15.375 }, 00:17:15.375 { 00:17:15.375 "name": "BaseBdev4", 00:17:15.375 "uuid": "a90773a1-3634-59be-a21f-5aa36c80c893", 00:17:15.375 "is_configured": true, 00:17:15.375 "data_offset": 2048, 00:17:15.375 "data_size": 63488 00:17:15.375 } 00:17:15.375 ] 00:17:15.375 }' 00:17:15.375 17:29:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:15.375 17:29:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:15.945 17:29:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:15.945 [2024-07-15 17:29:27.151250] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:15.945 [2024-07-15 17:29:27.151283] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:15.945 [2024-07-15 17:29:27.153877] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:15.945 [2024-07-15 17:29:27.153907] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:15.945 [2024-07-15 17:29:27.153935] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:15.945 [2024-07-15 17:29:27.153941] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x1f964e0 name raid_bdev1, state offline 00:17:15.945 0 00:17:15.945 17:29:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2820091 00:17:15.945 17:29:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2820091 ']' 00:17:15.945 17:29:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2820091 00:17:15.945 17:29:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:17:15.945 17:29:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:15.945 17:29:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2820091 00:17:15.945 17:29:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:15.945 17:29:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:15.945 17:29:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2820091' 00:17:15.945 killing process with pid 2820091 00:17:15.945 17:29:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2820091 00:17:15.945 [2024-07-15 17:29:27.238507] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:15.945 17:29:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2820091 00:17:16.205 [2024-07-15 17:29:27.255441] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:16.205 17:29:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.GoWYC1O5gu 00:17:16.205 17:29:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:16.205 17:29:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:16.205 17:29:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.50 00:17:16.205 17:29:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:17:16.205 17:29:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:16.205 17:29:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:16.205 17:29:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.50 != \0\.\0\0 ]] 00:17:16.205 00:17:16.205 real 0m6.441s 00:17:16.205 user 0m10.354s 00:17:16.205 sys 0m0.921s 00:17:16.205 17:29:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:16.205 17:29:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:16.205 ************************************ 00:17:16.205 END TEST raid_read_error_test 00:17:16.205 ************************************ 00:17:16.205 17:29:27 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:16.206 17:29:27 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:17:16.206 17:29:27 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:16.206 17:29:27 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:16.206 17:29:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:16.206 ************************************ 00:17:16.206 START TEST raid_write_error_test 00:17:16.206 ************************************ 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 write 
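In outline, the raid_read_error_test setup traced above reduces to the RPC sequence below. This is a minimal sketch reconstructed from the logged commands, not an extra step of the run: only the BaseBdev1 stack is shown (BaseBdev2..BaseBdev4 are built the same way), and rpc.py stands for the full scripts/rpc.py path used in the log.

    # Per base device: malloc bdev, error bdev on top of it, passthru bdev on top of
    # the error bdev, so failures can later be injected underneath BaseBdev1.
    rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
    rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc
    rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
    # Assemble the four passthru bdevs into the raid0 volume under test
    # (-z 64 is the 64 KB strip size the script sets, -s enables the superblock,
    # matching "strip_size_kb": 64 and "superblock": true in the output above).
    rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 \
        -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s

bdevperf is started separately against /var/tmp/spdk-raid.sock with -T raid_bdev1, writing its results to the mktemp log under /raidtest shown in the trace.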
00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.k3Sp7T5qIn 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2821382 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2821382 /var/tmp/spdk-raid.sock 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:16.206 17:29:27 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2821382 ']' 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:16.206 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:16.206 17:29:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:16.465 [2024-07-15 17:29:27.532804] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:17:16.465 [2024-07-15 17:29:27.532858] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2821382 ] 00:17:16.465 [2024-07-15 17:29:27.603826] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:16.465 [2024-07-15 17:29:27.666029] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:16.465 [2024-07-15 17:29:27.717921] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:16.466 [2024-07-15 17:29:27.717951] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:16.466 17:29:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:16.466 17:29:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:16.466 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:16.466 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:16.725 BaseBdev1_malloc 00:17:16.725 17:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:16.984 true 00:17:16.984 17:29:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:17.245 [2024-07-15 17:29:28.331881] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:17.245 [2024-07-15 17:29:28.331913] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:17.245 [2024-07-15 17:29:28.331924] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18feb50 00:17:17.245 [2024-07-15 17:29:28.331935] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:17.245 [2024-07-15 17:29:28.333217] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:17.245 [2024-07-15 17:29:28.333236] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:17.245 BaseBdev1 00:17:17.245 17:29:28 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:17.245 17:29:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:17.814 BaseBdev2_malloc 00:17:17.814 17:29:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:17.814 true 00:17:17.814 17:29:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:18.073 [2024-07-15 17:29:29.268007] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:18.073 [2024-07-15 17:29:29.268037] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:18.073 [2024-07-15 17:29:29.268048] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18e2ea0 00:17:18.073 [2024-07-15 17:29:29.268054] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:18.073 [2024-07-15 17:29:29.269212] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:18.073 [2024-07-15 17:29:29.269232] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:18.073 BaseBdev2 00:17:18.073 17:29:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:18.073 17:29:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:18.333 BaseBdev3_malloc 00:17:18.333 17:29:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:18.593 true 00:17:18.593 17:29:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:18.593 [2024-07-15 17:29:29.831122] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:18.593 [2024-07-15 17:29:29.831147] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:18.593 [2024-07-15 17:29:29.831158] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18e6fb0 00:17:18.593 [2024-07-15 17:29:29.831165] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:18.593 [2024-07-15 17:29:29.832308] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:18.593 [2024-07-15 17:29:29.832326] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:18.593 BaseBdev3 00:17:18.593 17:29:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:18.593 17:29:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:17:18.853 BaseBdev4_malloc 00:17:18.853 17:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:17:19.113 true 00:17:19.113 17:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:17:19.113 [2024-07-15 17:29:30.402326] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:17:19.113 [2024-07-15 17:29:30.402359] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:19.113 [2024-07-15 17:29:30.402375] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18e8980 00:17:19.113 [2024-07-15 17:29:30.402381] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:19.113 [2024-07-15 17:29:30.403550] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:19.113 [2024-07-15 17:29:30.403569] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:17:19.113 BaseBdev4 00:17:19.373 17:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:17:19.373 [2024-07-15 17:29:30.594839] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:19.373 [2024-07-15 17:29:30.595846] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:19.374 [2024-07-15 17:29:30.595897] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:19.374 [2024-07-15 17:29:30.595942] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:19.374 [2024-07-15 17:29:30.596119] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18e84e0 00:17:19.374 [2024-07-15 17:29:30.596126] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:19.374 [2024-07-15 17:29:30.596263] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x174a210 00:17:19.374 [2024-07-15 17:29:30.596377] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18e84e0 00:17:19.374 [2024-07-15 17:29:30.596383] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18e84e0 00:17:19.374 [2024-07-15 17:29:30.596456] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:19.374 17:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:17:19.374 17:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:19.374 17:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:19.374 17:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:19.374 17:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:19.374 17:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:19.374 17:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:19.374 17:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:17:19.374 17:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:19.374 17:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:19.374 17:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:19.374 17:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:19.633 17:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:19.633 "name": "raid_bdev1", 00:17:19.633 "uuid": "fdf8ad9f-6577-445b-abad-f8a6e1fa1fa0", 00:17:19.633 "strip_size_kb": 64, 00:17:19.633 "state": "online", 00:17:19.633 "raid_level": "raid0", 00:17:19.633 "superblock": true, 00:17:19.633 "num_base_bdevs": 4, 00:17:19.633 "num_base_bdevs_discovered": 4, 00:17:19.633 "num_base_bdevs_operational": 4, 00:17:19.634 "base_bdevs_list": [ 00:17:19.634 { 00:17:19.634 "name": "BaseBdev1", 00:17:19.634 "uuid": "43cc5d58-6a33-5b43-b92a-71361edcdd0d", 00:17:19.634 "is_configured": true, 00:17:19.634 "data_offset": 2048, 00:17:19.634 "data_size": 63488 00:17:19.634 }, 00:17:19.634 { 00:17:19.634 "name": "BaseBdev2", 00:17:19.634 "uuid": "04b3d262-f9ca-522c-a931-effe887a4b07", 00:17:19.634 "is_configured": true, 00:17:19.634 "data_offset": 2048, 00:17:19.634 "data_size": 63488 00:17:19.634 }, 00:17:19.634 { 00:17:19.634 "name": "BaseBdev3", 00:17:19.634 "uuid": "50873720-c92f-5067-b94f-1148d371de48", 00:17:19.634 "is_configured": true, 00:17:19.634 "data_offset": 2048, 00:17:19.634 "data_size": 63488 00:17:19.634 }, 00:17:19.634 { 00:17:19.634 "name": "BaseBdev4", 00:17:19.634 "uuid": "245663c7-719c-5888-9123-124213e6cda5", 00:17:19.634 "is_configured": true, 00:17:19.634 "data_offset": 2048, 00:17:19.634 "data_size": 63488 00:17:19.634 } 00:17:19.634 ] 00:17:19.634 }' 00:17:19.634 17:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:19.634 17:29:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:20.203 17:29:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:20.203 17:29:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:20.203 [2024-07-15 17:29:31.417124] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x174a450 00:17:21.143 17:29:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:17:21.402 17:29:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:21.402 17:29:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:17:21.402 17:29:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:17:21.402 17:29:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:17:21.402 17:29:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:21.402 17:29:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:21.402 17:29:32 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:21.402 17:29:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:21.402 17:29:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:21.402 17:29:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:21.402 17:29:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:21.402 17:29:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:21.402 17:29:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:21.402 17:29:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:21.402 17:29:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:21.662 17:29:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:21.662 "name": "raid_bdev1", 00:17:21.662 "uuid": "fdf8ad9f-6577-445b-abad-f8a6e1fa1fa0", 00:17:21.662 "strip_size_kb": 64, 00:17:21.662 "state": "online", 00:17:21.662 "raid_level": "raid0", 00:17:21.662 "superblock": true, 00:17:21.662 "num_base_bdevs": 4, 00:17:21.662 "num_base_bdevs_discovered": 4, 00:17:21.662 "num_base_bdevs_operational": 4, 00:17:21.662 "base_bdevs_list": [ 00:17:21.662 { 00:17:21.662 "name": "BaseBdev1", 00:17:21.662 "uuid": "43cc5d58-6a33-5b43-b92a-71361edcdd0d", 00:17:21.662 "is_configured": true, 00:17:21.662 "data_offset": 2048, 00:17:21.662 "data_size": 63488 00:17:21.662 }, 00:17:21.662 { 00:17:21.662 "name": "BaseBdev2", 00:17:21.662 "uuid": "04b3d262-f9ca-522c-a931-effe887a4b07", 00:17:21.662 "is_configured": true, 00:17:21.662 "data_offset": 2048, 00:17:21.662 "data_size": 63488 00:17:21.662 }, 00:17:21.662 { 00:17:21.662 "name": "BaseBdev3", 00:17:21.662 "uuid": "50873720-c92f-5067-b94f-1148d371de48", 00:17:21.662 "is_configured": true, 00:17:21.662 "data_offset": 2048, 00:17:21.662 "data_size": 63488 00:17:21.662 }, 00:17:21.662 { 00:17:21.662 "name": "BaseBdev4", 00:17:21.662 "uuid": "245663c7-719c-5888-9123-124213e6cda5", 00:17:21.662 "is_configured": true, 00:17:21.662 "data_offset": 2048, 00:17:21.662 "data_size": 63488 00:17:21.662 } 00:17:21.662 ] 00:17:21.662 }' 00:17:21.662 17:29:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:21.662 17:29:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:22.233 17:29:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:22.233 [2024-07-15 17:29:33.452015] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:22.233 [2024-07-15 17:29:33.452047] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:22.233 [2024-07-15 17:29:33.454634] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:22.233 [2024-07-15 17:29:33.454664] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:22.233 [2024-07-15 17:29:33.454693] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:22.233 [2024-07-15 
17:29:33.454698] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18e84e0 name raid_bdev1, state offline 00:17:22.233 0 00:17:22.233 17:29:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2821382 00:17:22.233 17:29:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2821382 ']' 00:17:22.233 17:29:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2821382 00:17:22.233 17:29:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:17:22.233 17:29:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:22.233 17:29:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2821382 00:17:22.233 17:29:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:22.233 17:29:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:22.233 17:29:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2821382' 00:17:22.233 killing process with pid 2821382 00:17:22.233 17:29:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2821382 00:17:22.234 [2024-07-15 17:29:33.520701] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:22.234 17:29:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2821382 00:17:22.495 [2024-07-15 17:29:33.537943] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:22.495 17:29:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.k3Sp7T5qIn 00:17:22.495 17:29:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:22.495 17:29:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:22.495 17:29:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:17:22.495 17:29:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:17:22.495 17:29:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:22.495 17:29:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:22.495 17:29:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:17:22.495 00:17:22.495 real 0m6.209s 00:17:22.495 user 0m10.355s 00:17:22.495 sys 0m0.904s 00:17:22.495 17:29:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:22.495 17:29:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:22.495 ************************************ 00:17:22.495 END TEST raid_write_error_test 00:17:22.495 ************************************ 00:17:22.495 17:29:33 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:22.495 17:29:33 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:17:22.495 17:29:33 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 00:17:22.495 17:29:33 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:22.495 17:29:33 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:22.495 17:29:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:22.495 ************************************ 00:17:22.495 START TEST raid_state_function_test 
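Condensed from the read and write error tests above, the injection-and-check phase amounts to the sketch below. It is reconstructed from the logged commands (shown for the read case with its bdevperf log file; the write case swaps the last argument of bdev_error_inject_error and uses its own mktemp log), and the pipeline is written in one line where the xtrace prints its stages separately.

    # Make the error bdev under BaseBdev1 fail read I/O, then run the bdevperf job
    # that was started earlier with -T raid_bdev1 against the raid socket.
    rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure
    /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py \
        -s /var/tmp/spdk-raid.sock perform_tests
    # Column 6 of the raid_bdev1 row in the bdevperf log is the failure rate; the
    # test only requires it to differ from 0.00 (0.50 and 0.49 in the runs above).
    fail_per_s=$(grep -v Job /raidtest/tmp.GoWYC1O5gu | grep raid_bdev1 | awk '{print $6}')
    [[ $fail_per_s != "0.00" ]]

Because raid0 has no redundancy (has_redundancy returns 1), a non-zero failure rate is the expected outcome rather than an error, which is why both tests end with return 0.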
00:17:22.495 ************************************ 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 false 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2822421 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 
2822421' 00:17:22.495 Process raid pid: 2822421 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2822421 /var/tmp/spdk-raid.sock 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2822421 ']' 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:22.495 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:22.495 17:29:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:22.755 [2024-07-15 17:29:33.817549] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:17:22.755 [2024-07-15 17:29:33.817611] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:22.755 [2024-07-15 17:29:33.909979] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:22.755 [2024-07-15 17:29:33.979389] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:22.755 [2024-07-15 17:29:34.021789] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:22.755 [2024-07-15 17:29:34.021810] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:23.694 17:29:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:23.694 17:29:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:17:23.694 17:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:23.694 [2024-07-15 17:29:34.833618] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:23.694 [2024-07-15 17:29:34.833649] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:23.694 [2024-07-15 17:29:34.833657] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:23.694 [2024-07-15 17:29:34.833663] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:23.694 [2024-07-15 17:29:34.833668] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:23.694 [2024-07-15 17:29:34.833673] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:23.694 [2024-07-15 17:29:34.833678] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:23.694 [2024-07-15 17:29:34.833683] 
bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:23.694 17:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:23.694 17:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:23.694 17:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:23.694 17:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:23.694 17:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:23.694 17:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:23.694 17:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:23.694 17:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:23.694 17:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:23.694 17:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:23.694 17:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:23.694 17:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:23.954 17:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:23.954 "name": "Existed_Raid", 00:17:23.954 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:23.954 "strip_size_kb": 64, 00:17:23.954 "state": "configuring", 00:17:23.954 "raid_level": "concat", 00:17:23.954 "superblock": false, 00:17:23.954 "num_base_bdevs": 4, 00:17:23.954 "num_base_bdevs_discovered": 0, 00:17:23.954 "num_base_bdevs_operational": 4, 00:17:23.954 "base_bdevs_list": [ 00:17:23.954 { 00:17:23.954 "name": "BaseBdev1", 00:17:23.954 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:23.954 "is_configured": false, 00:17:23.954 "data_offset": 0, 00:17:23.954 "data_size": 0 00:17:23.954 }, 00:17:23.954 { 00:17:23.954 "name": "BaseBdev2", 00:17:23.954 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:23.954 "is_configured": false, 00:17:23.954 "data_offset": 0, 00:17:23.954 "data_size": 0 00:17:23.954 }, 00:17:23.954 { 00:17:23.955 "name": "BaseBdev3", 00:17:23.955 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:23.955 "is_configured": false, 00:17:23.955 "data_offset": 0, 00:17:23.955 "data_size": 0 00:17:23.955 }, 00:17:23.955 { 00:17:23.955 "name": "BaseBdev4", 00:17:23.955 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:23.955 "is_configured": false, 00:17:23.955 "data_offset": 0, 00:17:23.955 "data_size": 0 00:17:23.955 } 00:17:23.955 ] 00:17:23.955 }' 00:17:23.955 17:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:23.955 17:29:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:24.525 17:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:24.525 [2024-07-15 17:29:35.751833] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:24.525 
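The raid_state_function_test output above runs against a bare bdev_svc app (started with -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid) rather than bdevperf, and its first check condenses to the minimal sketch below, reconstructed from the logged RPCs: a concat volume declared over base bdevs that do not exist yet must sit in the "configuring" state.

    # None of BaseBdev1..BaseBdev4 exist yet; the create call is still accepted
    # (no -s here, matching "superblock": false in the output above).
    rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat \
        -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
    # verify_raid_bdev_state inspects the volume and expects
    # "state": "configuring" with "num_base_bdevs_discovered": 0.
    rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "Existed_Raid")'
    # The volume is then deleted and re-created before malloc-backed base bdevs
    # (BaseBdev1 in the output that follows) are added and claimed.
    rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid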
[2024-07-15 17:29:35.751850] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x195b6f0 name Existed_Raid, state configuring 00:17:24.525 17:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:24.785 [2024-07-15 17:29:35.936321] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:24.785 [2024-07-15 17:29:35.936339] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:24.785 [2024-07-15 17:29:35.936344] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:24.785 [2024-07-15 17:29:35.936350] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:24.785 [2024-07-15 17:29:35.936354] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:24.785 [2024-07-15 17:29:35.936360] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:24.785 [2024-07-15 17:29:35.936364] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:24.785 [2024-07-15 17:29:35.936370] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:24.785 17:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:25.046 [2024-07-15 17:29:36.139414] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:25.046 BaseBdev1 00:17:25.046 17:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:25.046 17:29:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:25.046 17:29:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:25.046 17:29:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:25.046 17:29:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:25.046 17:29:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:25.046 17:29:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:25.307 17:29:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:25.307 [ 00:17:25.307 { 00:17:25.307 "name": "BaseBdev1", 00:17:25.307 "aliases": [ 00:17:25.307 "ea191aac-7330-4b30-bb80-db303035717b" 00:17:25.307 ], 00:17:25.307 "product_name": "Malloc disk", 00:17:25.307 "block_size": 512, 00:17:25.307 "num_blocks": 65536, 00:17:25.307 "uuid": "ea191aac-7330-4b30-bb80-db303035717b", 00:17:25.307 "assigned_rate_limits": { 00:17:25.307 "rw_ios_per_sec": 0, 00:17:25.307 "rw_mbytes_per_sec": 0, 00:17:25.307 "r_mbytes_per_sec": 0, 00:17:25.307 "w_mbytes_per_sec": 0 00:17:25.307 }, 00:17:25.307 "claimed": true, 00:17:25.307 "claim_type": "exclusive_write", 00:17:25.307 "zoned": false, 
00:17:25.307 "supported_io_types": { 00:17:25.307 "read": true, 00:17:25.307 "write": true, 00:17:25.307 "unmap": true, 00:17:25.307 "flush": true, 00:17:25.307 "reset": true, 00:17:25.307 "nvme_admin": false, 00:17:25.307 "nvme_io": false, 00:17:25.307 "nvme_io_md": false, 00:17:25.307 "write_zeroes": true, 00:17:25.307 "zcopy": true, 00:17:25.307 "get_zone_info": false, 00:17:25.307 "zone_management": false, 00:17:25.307 "zone_append": false, 00:17:25.307 "compare": false, 00:17:25.307 "compare_and_write": false, 00:17:25.307 "abort": true, 00:17:25.307 "seek_hole": false, 00:17:25.307 "seek_data": false, 00:17:25.307 "copy": true, 00:17:25.307 "nvme_iov_md": false 00:17:25.307 }, 00:17:25.307 "memory_domains": [ 00:17:25.307 { 00:17:25.307 "dma_device_id": "system", 00:17:25.307 "dma_device_type": 1 00:17:25.307 }, 00:17:25.307 { 00:17:25.307 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:25.307 "dma_device_type": 2 00:17:25.307 } 00:17:25.307 ], 00:17:25.307 "driver_specific": {} 00:17:25.307 } 00:17:25.307 ] 00:17:25.307 17:29:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:25.307 17:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:25.307 17:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:25.307 17:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:25.307 17:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:25.307 17:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:25.307 17:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:25.307 17:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:25.307 17:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:25.307 17:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:25.307 17:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:25.307 17:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:25.307 17:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:25.567 17:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:25.567 "name": "Existed_Raid", 00:17:25.567 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:25.567 "strip_size_kb": 64, 00:17:25.567 "state": "configuring", 00:17:25.567 "raid_level": "concat", 00:17:25.567 "superblock": false, 00:17:25.567 "num_base_bdevs": 4, 00:17:25.567 "num_base_bdevs_discovered": 1, 00:17:25.567 "num_base_bdevs_operational": 4, 00:17:25.567 "base_bdevs_list": [ 00:17:25.567 { 00:17:25.567 "name": "BaseBdev1", 00:17:25.567 "uuid": "ea191aac-7330-4b30-bb80-db303035717b", 00:17:25.567 "is_configured": true, 00:17:25.568 "data_offset": 0, 00:17:25.568 "data_size": 65536 00:17:25.568 }, 00:17:25.568 { 00:17:25.568 "name": "BaseBdev2", 00:17:25.568 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:25.568 "is_configured": false, 00:17:25.568 "data_offset": 0, 00:17:25.568 
"data_size": 0 00:17:25.568 }, 00:17:25.568 { 00:17:25.568 "name": "BaseBdev3", 00:17:25.568 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:25.568 "is_configured": false, 00:17:25.568 "data_offset": 0, 00:17:25.568 "data_size": 0 00:17:25.568 }, 00:17:25.568 { 00:17:25.568 "name": "BaseBdev4", 00:17:25.568 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:25.568 "is_configured": false, 00:17:25.568 "data_offset": 0, 00:17:25.568 "data_size": 0 00:17:25.568 } 00:17:25.568 ] 00:17:25.568 }' 00:17:25.568 17:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:25.568 17:29:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:26.138 17:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:26.398 [2024-07-15 17:29:37.446725] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:26.398 [2024-07-15 17:29:37.446750] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x195af60 name Existed_Raid, state configuring 00:17:26.398 17:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:26.398 [2024-07-15 17:29:37.631216] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:26.398 [2024-07-15 17:29:37.632302] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:26.398 [2024-07-15 17:29:37.632325] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:26.398 [2024-07-15 17:29:37.632335] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:26.398 [2024-07-15 17:29:37.632341] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:26.398 [2024-07-15 17:29:37.632346] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:26.398 [2024-07-15 17:29:37.632351] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:26.398 17:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:26.398 17:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:26.398 17:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:26.398 17:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:26.398 17:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:26.398 17:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:26.398 17:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:26.398 17:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:26.398 17:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:26.398 17:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:26.398 17:29:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:26.398 17:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:26.398 17:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:26.398 17:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:26.659 17:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:26.659 "name": "Existed_Raid", 00:17:26.659 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:26.659 "strip_size_kb": 64, 00:17:26.659 "state": "configuring", 00:17:26.659 "raid_level": "concat", 00:17:26.659 "superblock": false, 00:17:26.659 "num_base_bdevs": 4, 00:17:26.659 "num_base_bdevs_discovered": 1, 00:17:26.659 "num_base_bdevs_operational": 4, 00:17:26.659 "base_bdevs_list": [ 00:17:26.659 { 00:17:26.659 "name": "BaseBdev1", 00:17:26.659 "uuid": "ea191aac-7330-4b30-bb80-db303035717b", 00:17:26.659 "is_configured": true, 00:17:26.659 "data_offset": 0, 00:17:26.659 "data_size": 65536 00:17:26.659 }, 00:17:26.659 { 00:17:26.659 "name": "BaseBdev2", 00:17:26.659 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:26.659 "is_configured": false, 00:17:26.659 "data_offset": 0, 00:17:26.659 "data_size": 0 00:17:26.659 }, 00:17:26.659 { 00:17:26.659 "name": "BaseBdev3", 00:17:26.659 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:26.659 "is_configured": false, 00:17:26.659 "data_offset": 0, 00:17:26.659 "data_size": 0 00:17:26.659 }, 00:17:26.659 { 00:17:26.659 "name": "BaseBdev4", 00:17:26.659 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:26.659 "is_configured": false, 00:17:26.659 "data_offset": 0, 00:17:26.659 "data_size": 0 00:17:26.659 } 00:17:26.659 ] 00:17:26.659 }' 00:17:26.659 17:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:26.659 17:29:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:27.229 17:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:27.518 [2024-07-15 17:29:38.538381] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:27.518 BaseBdev2 00:17:27.518 17:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:27.518 17:29:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:27.518 17:29:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:27.518 17:29:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:27.518 17:29:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:27.518 17:29:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:27.518 17:29:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:27.518 17:29:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:27.777 [ 00:17:27.777 { 00:17:27.777 "name": "BaseBdev2", 00:17:27.777 "aliases": [ 00:17:27.777 "82b08686-e701-4beb-992a-1f4ee7342da1" 00:17:27.777 ], 00:17:27.777 "product_name": "Malloc disk", 00:17:27.777 "block_size": 512, 00:17:27.777 "num_blocks": 65536, 00:17:27.777 "uuid": "82b08686-e701-4beb-992a-1f4ee7342da1", 00:17:27.777 "assigned_rate_limits": { 00:17:27.777 "rw_ios_per_sec": 0, 00:17:27.777 "rw_mbytes_per_sec": 0, 00:17:27.777 "r_mbytes_per_sec": 0, 00:17:27.777 "w_mbytes_per_sec": 0 00:17:27.777 }, 00:17:27.777 "claimed": true, 00:17:27.777 "claim_type": "exclusive_write", 00:17:27.777 "zoned": false, 00:17:27.777 "supported_io_types": { 00:17:27.777 "read": true, 00:17:27.777 "write": true, 00:17:27.777 "unmap": true, 00:17:27.777 "flush": true, 00:17:27.777 "reset": true, 00:17:27.777 "nvme_admin": false, 00:17:27.777 "nvme_io": false, 00:17:27.777 "nvme_io_md": false, 00:17:27.777 "write_zeroes": true, 00:17:27.777 "zcopy": true, 00:17:27.777 "get_zone_info": false, 00:17:27.777 "zone_management": false, 00:17:27.777 "zone_append": false, 00:17:27.777 "compare": false, 00:17:27.777 "compare_and_write": false, 00:17:27.777 "abort": true, 00:17:27.777 "seek_hole": false, 00:17:27.777 "seek_data": false, 00:17:27.777 "copy": true, 00:17:27.777 "nvme_iov_md": false 00:17:27.777 }, 00:17:27.777 "memory_domains": [ 00:17:27.777 { 00:17:27.777 "dma_device_id": "system", 00:17:27.777 "dma_device_type": 1 00:17:27.777 }, 00:17:27.777 { 00:17:27.777 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:27.777 "dma_device_type": 2 00:17:27.777 } 00:17:27.777 ], 00:17:27.777 "driver_specific": {} 00:17:27.777 } 00:17:27.777 ] 00:17:27.777 17:29:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:27.777 17:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:27.777 17:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:27.777 17:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:27.777 17:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:27.777 17:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:27.777 17:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:27.777 17:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:27.777 17:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:27.777 17:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:27.777 17:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:27.777 17:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:27.777 17:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:27.777 17:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:27.777 17:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:27.777 17:29:39 
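Each base bdev is brought in the same way as BaseBdev2 here: create a 32 MiB malloc bdev with a 512-byte block size under the name the raid is waiting for, let the examine path claim it for Existed_Raid, then poll for the bdev with the harness's timeout; a rough sketch of that waitforbdev sequence (same socket and names as above):
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
  # Prints the bdev's JSON (block_size 512, num_blocks 65536) once it is available
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000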
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:27.777 "name": "Existed_Raid", 00:17:27.777 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:27.777 "strip_size_kb": 64, 00:17:27.777 "state": "configuring", 00:17:27.777 "raid_level": "concat", 00:17:27.777 "superblock": false, 00:17:27.777 "num_base_bdevs": 4, 00:17:27.777 "num_base_bdevs_discovered": 2, 00:17:27.777 "num_base_bdevs_operational": 4, 00:17:27.777 "base_bdevs_list": [ 00:17:27.777 { 00:17:27.777 "name": "BaseBdev1", 00:17:27.777 "uuid": "ea191aac-7330-4b30-bb80-db303035717b", 00:17:27.777 "is_configured": true, 00:17:27.777 "data_offset": 0, 00:17:27.777 "data_size": 65536 00:17:27.777 }, 00:17:27.777 { 00:17:27.777 "name": "BaseBdev2", 00:17:27.777 "uuid": "82b08686-e701-4beb-992a-1f4ee7342da1", 00:17:27.777 "is_configured": true, 00:17:27.777 "data_offset": 0, 00:17:27.777 "data_size": 65536 00:17:27.777 }, 00:17:27.777 { 00:17:27.777 "name": "BaseBdev3", 00:17:27.777 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:27.777 "is_configured": false, 00:17:27.777 "data_offset": 0, 00:17:27.777 "data_size": 0 00:17:27.777 }, 00:17:27.777 { 00:17:27.777 "name": "BaseBdev4", 00:17:27.777 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:27.777 "is_configured": false, 00:17:27.777 "data_offset": 0, 00:17:27.777 "data_size": 0 00:17:27.777 } 00:17:27.777 ] 00:17:27.777 }' 00:17:27.777 17:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:27.777 17:29:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:28.416 17:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:28.416 [2024-07-15 17:29:39.702215] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:28.416 BaseBdev3 00:17:28.416 17:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:28.416 17:29:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:28.416 17:29:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:28.416 17:29:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:28.675 17:29:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:28.675 17:29:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:28.675 17:29:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:28.675 17:29:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:28.935 [ 00:17:28.935 { 00:17:28.935 "name": "BaseBdev3", 00:17:28.935 "aliases": [ 00:17:28.935 "870e3ea5-ffe6-4151-b5d0-23b95ae8adb7" 00:17:28.935 ], 00:17:28.935 "product_name": "Malloc disk", 00:17:28.935 "block_size": 512, 00:17:28.935 "num_blocks": 65536, 00:17:28.935 "uuid": "870e3ea5-ffe6-4151-b5d0-23b95ae8adb7", 00:17:28.935 "assigned_rate_limits": { 00:17:28.935 "rw_ios_per_sec": 0, 00:17:28.935 "rw_mbytes_per_sec": 0, 00:17:28.935 "r_mbytes_per_sec": 0, 
00:17:28.935 "w_mbytes_per_sec": 0 00:17:28.935 }, 00:17:28.935 "claimed": true, 00:17:28.935 "claim_type": "exclusive_write", 00:17:28.935 "zoned": false, 00:17:28.935 "supported_io_types": { 00:17:28.935 "read": true, 00:17:28.935 "write": true, 00:17:28.935 "unmap": true, 00:17:28.935 "flush": true, 00:17:28.935 "reset": true, 00:17:28.935 "nvme_admin": false, 00:17:28.935 "nvme_io": false, 00:17:28.935 "nvme_io_md": false, 00:17:28.935 "write_zeroes": true, 00:17:28.935 "zcopy": true, 00:17:28.935 "get_zone_info": false, 00:17:28.935 "zone_management": false, 00:17:28.935 "zone_append": false, 00:17:28.935 "compare": false, 00:17:28.935 "compare_and_write": false, 00:17:28.935 "abort": true, 00:17:28.935 "seek_hole": false, 00:17:28.935 "seek_data": false, 00:17:28.935 "copy": true, 00:17:28.935 "nvme_iov_md": false 00:17:28.935 }, 00:17:28.935 "memory_domains": [ 00:17:28.935 { 00:17:28.935 "dma_device_id": "system", 00:17:28.935 "dma_device_type": 1 00:17:28.935 }, 00:17:28.935 { 00:17:28.935 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:28.935 "dma_device_type": 2 00:17:28.935 } 00:17:28.935 ], 00:17:28.935 "driver_specific": {} 00:17:28.935 } 00:17:28.935 ] 00:17:28.935 17:29:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:28.935 17:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:28.935 17:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:28.935 17:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:28.935 17:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:28.935 17:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:28.935 17:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:28.935 17:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:28.935 17:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:28.935 17:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:28.935 17:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:28.935 17:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:28.935 17:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:28.935 17:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.935 17:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:29.196 17:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:29.196 "name": "Existed_Raid", 00:17:29.196 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:29.196 "strip_size_kb": 64, 00:17:29.196 "state": "configuring", 00:17:29.196 "raid_level": "concat", 00:17:29.196 "superblock": false, 00:17:29.196 "num_base_bdevs": 4, 00:17:29.196 "num_base_bdevs_discovered": 3, 00:17:29.196 "num_base_bdevs_operational": 4, 00:17:29.196 "base_bdevs_list": [ 00:17:29.196 { 00:17:29.196 "name": "BaseBdev1", 
00:17:29.196 "uuid": "ea191aac-7330-4b30-bb80-db303035717b", 00:17:29.196 "is_configured": true, 00:17:29.196 "data_offset": 0, 00:17:29.196 "data_size": 65536 00:17:29.196 }, 00:17:29.196 { 00:17:29.196 "name": "BaseBdev2", 00:17:29.196 "uuid": "82b08686-e701-4beb-992a-1f4ee7342da1", 00:17:29.196 "is_configured": true, 00:17:29.196 "data_offset": 0, 00:17:29.196 "data_size": 65536 00:17:29.196 }, 00:17:29.196 { 00:17:29.196 "name": "BaseBdev3", 00:17:29.196 "uuid": "870e3ea5-ffe6-4151-b5d0-23b95ae8adb7", 00:17:29.196 "is_configured": true, 00:17:29.196 "data_offset": 0, 00:17:29.196 "data_size": 65536 00:17:29.196 }, 00:17:29.196 { 00:17:29.196 "name": "BaseBdev4", 00:17:29.196 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:29.196 "is_configured": false, 00:17:29.196 "data_offset": 0, 00:17:29.196 "data_size": 0 00:17:29.196 } 00:17:29.196 ] 00:17:29.196 }' 00:17:29.196 17:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:29.196 17:29:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:29.767 17:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:29.767 [2024-07-15 17:29:40.994398] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:29.767 [2024-07-15 17:29:40.994425] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x195bfc0 00:17:29.767 [2024-07-15 17:29:40.994429] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:17:29.767 [2024-07-15 17:29:40.994594] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x195bc00 00:17:29.767 [2024-07-15 17:29:40.994687] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x195bfc0 00:17:29.767 [2024-07-15 17:29:40.994693] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x195bfc0 00:17:29.767 [2024-07-15 17:29:40.994822] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:29.767 BaseBdev4 00:17:29.767 17:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:17:29.767 17:29:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:29.767 17:29:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:29.767 17:29:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:29.767 17:29:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:29.767 17:29:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:29.767 17:29:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:30.028 17:29:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:30.028 [ 00:17:30.028 { 00:17:30.028 "name": "BaseBdev4", 00:17:30.028 "aliases": [ 00:17:30.028 "bfbf084d-a0bf-4087-92ac-b0e05257ccb5" 00:17:30.028 ], 00:17:30.028 "product_name": "Malloc disk", 00:17:30.028 "block_size": 512, 00:17:30.028 
"num_blocks": 65536, 00:17:30.028 "uuid": "bfbf084d-a0bf-4087-92ac-b0e05257ccb5", 00:17:30.028 "assigned_rate_limits": { 00:17:30.028 "rw_ios_per_sec": 0, 00:17:30.028 "rw_mbytes_per_sec": 0, 00:17:30.028 "r_mbytes_per_sec": 0, 00:17:30.028 "w_mbytes_per_sec": 0 00:17:30.028 }, 00:17:30.028 "claimed": true, 00:17:30.028 "claim_type": "exclusive_write", 00:17:30.028 "zoned": false, 00:17:30.028 "supported_io_types": { 00:17:30.028 "read": true, 00:17:30.028 "write": true, 00:17:30.028 "unmap": true, 00:17:30.028 "flush": true, 00:17:30.028 "reset": true, 00:17:30.028 "nvme_admin": false, 00:17:30.028 "nvme_io": false, 00:17:30.028 "nvme_io_md": false, 00:17:30.028 "write_zeroes": true, 00:17:30.028 "zcopy": true, 00:17:30.028 "get_zone_info": false, 00:17:30.028 "zone_management": false, 00:17:30.028 "zone_append": false, 00:17:30.028 "compare": false, 00:17:30.028 "compare_and_write": false, 00:17:30.028 "abort": true, 00:17:30.028 "seek_hole": false, 00:17:30.028 "seek_data": false, 00:17:30.028 "copy": true, 00:17:30.028 "nvme_iov_md": false 00:17:30.028 }, 00:17:30.028 "memory_domains": [ 00:17:30.028 { 00:17:30.028 "dma_device_id": "system", 00:17:30.028 "dma_device_type": 1 00:17:30.028 }, 00:17:30.028 { 00:17:30.028 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:30.028 "dma_device_type": 2 00:17:30.028 } 00:17:30.028 ], 00:17:30.028 "driver_specific": {} 00:17:30.028 } 00:17:30.028 ] 00:17:30.028 17:29:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:30.028 17:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:30.028 17:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:30.028 17:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:17:30.028 17:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:30.028 17:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:30.028 17:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:30.028 17:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:30.028 17:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:30.028 17:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:30.028 17:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:30.028 17:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:30.028 17:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:30.028 17:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:30.028 17:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:30.289 17:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:30.289 "name": "Existed_Raid", 00:17:30.289 "uuid": "8648f312-62cd-41c1-9fa5-0ce04db84672", 00:17:30.289 "strip_size_kb": 64, 00:17:30.289 "state": "online", 00:17:30.289 "raid_level": "concat", 00:17:30.289 "superblock": false, 
00:17:30.289 "num_base_bdevs": 4, 00:17:30.289 "num_base_bdevs_discovered": 4, 00:17:30.289 "num_base_bdevs_operational": 4, 00:17:30.289 "base_bdevs_list": [ 00:17:30.289 { 00:17:30.289 "name": "BaseBdev1", 00:17:30.289 "uuid": "ea191aac-7330-4b30-bb80-db303035717b", 00:17:30.289 "is_configured": true, 00:17:30.289 "data_offset": 0, 00:17:30.289 "data_size": 65536 00:17:30.289 }, 00:17:30.289 { 00:17:30.289 "name": "BaseBdev2", 00:17:30.289 "uuid": "82b08686-e701-4beb-992a-1f4ee7342da1", 00:17:30.289 "is_configured": true, 00:17:30.289 "data_offset": 0, 00:17:30.289 "data_size": 65536 00:17:30.289 }, 00:17:30.289 { 00:17:30.289 "name": "BaseBdev3", 00:17:30.289 "uuid": "870e3ea5-ffe6-4151-b5d0-23b95ae8adb7", 00:17:30.289 "is_configured": true, 00:17:30.289 "data_offset": 0, 00:17:30.289 "data_size": 65536 00:17:30.289 }, 00:17:30.289 { 00:17:30.289 "name": "BaseBdev4", 00:17:30.289 "uuid": "bfbf084d-a0bf-4087-92ac-b0e05257ccb5", 00:17:30.289 "is_configured": true, 00:17:30.289 "data_offset": 0, 00:17:30.289 "data_size": 65536 00:17:30.289 } 00:17:30.289 ] 00:17:30.289 }' 00:17:30.289 17:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:30.289 17:29:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:30.860 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:30.860 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:30.860 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:30.860 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:30.860 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:30.860 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:30.860 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:30.860 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:31.121 [2024-07-15 17:29:42.245825] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:31.121 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:31.121 "name": "Existed_Raid", 00:17:31.121 "aliases": [ 00:17:31.121 "8648f312-62cd-41c1-9fa5-0ce04db84672" 00:17:31.121 ], 00:17:31.121 "product_name": "Raid Volume", 00:17:31.121 "block_size": 512, 00:17:31.121 "num_blocks": 262144, 00:17:31.121 "uuid": "8648f312-62cd-41c1-9fa5-0ce04db84672", 00:17:31.121 "assigned_rate_limits": { 00:17:31.121 "rw_ios_per_sec": 0, 00:17:31.121 "rw_mbytes_per_sec": 0, 00:17:31.121 "r_mbytes_per_sec": 0, 00:17:31.121 "w_mbytes_per_sec": 0 00:17:31.121 }, 00:17:31.121 "claimed": false, 00:17:31.121 "zoned": false, 00:17:31.121 "supported_io_types": { 00:17:31.121 "read": true, 00:17:31.121 "write": true, 00:17:31.121 "unmap": true, 00:17:31.121 "flush": true, 00:17:31.121 "reset": true, 00:17:31.121 "nvme_admin": false, 00:17:31.121 "nvme_io": false, 00:17:31.121 "nvme_io_md": false, 00:17:31.121 "write_zeroes": true, 00:17:31.121 "zcopy": false, 00:17:31.121 "get_zone_info": false, 00:17:31.121 "zone_management": false, 00:17:31.121 "zone_append": false, 00:17:31.121 "compare": false, 00:17:31.121 
"compare_and_write": false, 00:17:31.121 "abort": false, 00:17:31.121 "seek_hole": false, 00:17:31.121 "seek_data": false, 00:17:31.121 "copy": false, 00:17:31.121 "nvme_iov_md": false 00:17:31.121 }, 00:17:31.121 "memory_domains": [ 00:17:31.121 { 00:17:31.121 "dma_device_id": "system", 00:17:31.121 "dma_device_type": 1 00:17:31.121 }, 00:17:31.121 { 00:17:31.121 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.121 "dma_device_type": 2 00:17:31.121 }, 00:17:31.121 { 00:17:31.121 "dma_device_id": "system", 00:17:31.121 "dma_device_type": 1 00:17:31.121 }, 00:17:31.121 { 00:17:31.121 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.121 "dma_device_type": 2 00:17:31.121 }, 00:17:31.121 { 00:17:31.121 "dma_device_id": "system", 00:17:31.121 "dma_device_type": 1 00:17:31.121 }, 00:17:31.121 { 00:17:31.121 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.121 "dma_device_type": 2 00:17:31.121 }, 00:17:31.121 { 00:17:31.121 "dma_device_id": "system", 00:17:31.121 "dma_device_type": 1 00:17:31.121 }, 00:17:31.121 { 00:17:31.121 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.121 "dma_device_type": 2 00:17:31.121 } 00:17:31.121 ], 00:17:31.121 "driver_specific": { 00:17:31.121 "raid": { 00:17:31.121 "uuid": "8648f312-62cd-41c1-9fa5-0ce04db84672", 00:17:31.121 "strip_size_kb": 64, 00:17:31.121 "state": "online", 00:17:31.121 "raid_level": "concat", 00:17:31.121 "superblock": false, 00:17:31.121 "num_base_bdevs": 4, 00:17:31.121 "num_base_bdevs_discovered": 4, 00:17:31.121 "num_base_bdevs_operational": 4, 00:17:31.121 "base_bdevs_list": [ 00:17:31.121 { 00:17:31.121 "name": "BaseBdev1", 00:17:31.121 "uuid": "ea191aac-7330-4b30-bb80-db303035717b", 00:17:31.121 "is_configured": true, 00:17:31.121 "data_offset": 0, 00:17:31.121 "data_size": 65536 00:17:31.121 }, 00:17:31.121 { 00:17:31.121 "name": "BaseBdev2", 00:17:31.121 "uuid": "82b08686-e701-4beb-992a-1f4ee7342da1", 00:17:31.121 "is_configured": true, 00:17:31.121 "data_offset": 0, 00:17:31.121 "data_size": 65536 00:17:31.121 }, 00:17:31.121 { 00:17:31.121 "name": "BaseBdev3", 00:17:31.121 "uuid": "870e3ea5-ffe6-4151-b5d0-23b95ae8adb7", 00:17:31.121 "is_configured": true, 00:17:31.121 "data_offset": 0, 00:17:31.121 "data_size": 65536 00:17:31.121 }, 00:17:31.121 { 00:17:31.121 "name": "BaseBdev4", 00:17:31.121 "uuid": "bfbf084d-a0bf-4087-92ac-b0e05257ccb5", 00:17:31.121 "is_configured": true, 00:17:31.121 "data_offset": 0, 00:17:31.121 "data_size": 65536 00:17:31.121 } 00:17:31.121 ] 00:17:31.121 } 00:17:31.121 } 00:17:31.121 }' 00:17:31.121 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:31.121 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:31.121 BaseBdev2 00:17:31.121 BaseBdev3 00:17:31.121 BaseBdev4' 00:17:31.121 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:31.121 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:31.121 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:31.381 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:31.381 "name": "BaseBdev1", 00:17:31.381 "aliases": [ 00:17:31.381 "ea191aac-7330-4b30-bb80-db303035717b" 00:17:31.381 ], 00:17:31.381 
"product_name": "Malloc disk", 00:17:31.381 "block_size": 512, 00:17:31.381 "num_blocks": 65536, 00:17:31.381 "uuid": "ea191aac-7330-4b30-bb80-db303035717b", 00:17:31.381 "assigned_rate_limits": { 00:17:31.381 "rw_ios_per_sec": 0, 00:17:31.381 "rw_mbytes_per_sec": 0, 00:17:31.381 "r_mbytes_per_sec": 0, 00:17:31.381 "w_mbytes_per_sec": 0 00:17:31.381 }, 00:17:31.381 "claimed": true, 00:17:31.381 "claim_type": "exclusive_write", 00:17:31.381 "zoned": false, 00:17:31.381 "supported_io_types": { 00:17:31.381 "read": true, 00:17:31.381 "write": true, 00:17:31.381 "unmap": true, 00:17:31.381 "flush": true, 00:17:31.381 "reset": true, 00:17:31.381 "nvme_admin": false, 00:17:31.381 "nvme_io": false, 00:17:31.381 "nvme_io_md": false, 00:17:31.381 "write_zeroes": true, 00:17:31.382 "zcopy": true, 00:17:31.382 "get_zone_info": false, 00:17:31.382 "zone_management": false, 00:17:31.382 "zone_append": false, 00:17:31.382 "compare": false, 00:17:31.382 "compare_and_write": false, 00:17:31.382 "abort": true, 00:17:31.382 "seek_hole": false, 00:17:31.382 "seek_data": false, 00:17:31.382 "copy": true, 00:17:31.382 "nvme_iov_md": false 00:17:31.382 }, 00:17:31.382 "memory_domains": [ 00:17:31.382 { 00:17:31.382 "dma_device_id": "system", 00:17:31.382 "dma_device_type": 1 00:17:31.382 }, 00:17:31.382 { 00:17:31.382 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.382 "dma_device_type": 2 00:17:31.382 } 00:17:31.382 ], 00:17:31.382 "driver_specific": {} 00:17:31.382 }' 00:17:31.382 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:31.382 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:31.382 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:31.382 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:31.382 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:31.382 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:31.382 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:31.642 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:31.642 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:31.642 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:31.642 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:31.642 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:31.642 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:31.642 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:31.642 17:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:31.902 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:31.902 "name": "BaseBdev2", 00:17:31.902 "aliases": [ 00:17:31.902 "82b08686-e701-4beb-992a-1f4ee7342da1" 00:17:31.902 ], 00:17:31.902 "product_name": "Malloc disk", 00:17:31.902 "block_size": 512, 00:17:31.902 "num_blocks": 65536, 00:17:31.902 "uuid": "82b08686-e701-4beb-992a-1f4ee7342da1", 00:17:31.902 
"assigned_rate_limits": { 00:17:31.902 "rw_ios_per_sec": 0, 00:17:31.902 "rw_mbytes_per_sec": 0, 00:17:31.902 "r_mbytes_per_sec": 0, 00:17:31.902 "w_mbytes_per_sec": 0 00:17:31.902 }, 00:17:31.902 "claimed": true, 00:17:31.902 "claim_type": "exclusive_write", 00:17:31.902 "zoned": false, 00:17:31.902 "supported_io_types": { 00:17:31.902 "read": true, 00:17:31.902 "write": true, 00:17:31.902 "unmap": true, 00:17:31.902 "flush": true, 00:17:31.902 "reset": true, 00:17:31.902 "nvme_admin": false, 00:17:31.902 "nvme_io": false, 00:17:31.902 "nvme_io_md": false, 00:17:31.902 "write_zeroes": true, 00:17:31.902 "zcopy": true, 00:17:31.902 "get_zone_info": false, 00:17:31.902 "zone_management": false, 00:17:31.902 "zone_append": false, 00:17:31.902 "compare": false, 00:17:31.902 "compare_and_write": false, 00:17:31.902 "abort": true, 00:17:31.902 "seek_hole": false, 00:17:31.902 "seek_data": false, 00:17:31.902 "copy": true, 00:17:31.902 "nvme_iov_md": false 00:17:31.902 }, 00:17:31.902 "memory_domains": [ 00:17:31.902 { 00:17:31.902 "dma_device_id": "system", 00:17:31.902 "dma_device_type": 1 00:17:31.902 }, 00:17:31.902 { 00:17:31.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.902 "dma_device_type": 2 00:17:31.902 } 00:17:31.902 ], 00:17:31.902 "driver_specific": {} 00:17:31.902 }' 00:17:31.902 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:31.902 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:31.902 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:31.902 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:31.902 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:32.162 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:32.162 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:32.162 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:32.162 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:32.162 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:32.162 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:32.162 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:32.162 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:32.162 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:32.162 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:32.421 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:32.421 "name": "BaseBdev3", 00:17:32.421 "aliases": [ 00:17:32.421 "870e3ea5-ffe6-4151-b5d0-23b95ae8adb7" 00:17:32.421 ], 00:17:32.421 "product_name": "Malloc disk", 00:17:32.421 "block_size": 512, 00:17:32.421 "num_blocks": 65536, 00:17:32.421 "uuid": "870e3ea5-ffe6-4151-b5d0-23b95ae8adb7", 00:17:32.421 "assigned_rate_limits": { 00:17:32.421 "rw_ios_per_sec": 0, 00:17:32.421 "rw_mbytes_per_sec": 0, 00:17:32.421 "r_mbytes_per_sec": 0, 00:17:32.421 "w_mbytes_per_sec": 0 00:17:32.421 
}, 00:17:32.421 "claimed": true, 00:17:32.421 "claim_type": "exclusive_write", 00:17:32.421 "zoned": false, 00:17:32.421 "supported_io_types": { 00:17:32.422 "read": true, 00:17:32.422 "write": true, 00:17:32.422 "unmap": true, 00:17:32.422 "flush": true, 00:17:32.422 "reset": true, 00:17:32.422 "nvme_admin": false, 00:17:32.422 "nvme_io": false, 00:17:32.422 "nvme_io_md": false, 00:17:32.422 "write_zeroes": true, 00:17:32.422 "zcopy": true, 00:17:32.422 "get_zone_info": false, 00:17:32.422 "zone_management": false, 00:17:32.422 "zone_append": false, 00:17:32.422 "compare": false, 00:17:32.422 "compare_and_write": false, 00:17:32.422 "abort": true, 00:17:32.422 "seek_hole": false, 00:17:32.422 "seek_data": false, 00:17:32.422 "copy": true, 00:17:32.422 "nvme_iov_md": false 00:17:32.422 }, 00:17:32.422 "memory_domains": [ 00:17:32.422 { 00:17:32.422 "dma_device_id": "system", 00:17:32.422 "dma_device_type": 1 00:17:32.422 }, 00:17:32.422 { 00:17:32.422 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:32.422 "dma_device_type": 2 00:17:32.422 } 00:17:32.422 ], 00:17:32.422 "driver_specific": {} 00:17:32.422 }' 00:17:32.422 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:32.422 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:32.422 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:32.422 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:32.681 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:32.681 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:32.681 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:32.681 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:32.681 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:32.681 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:32.681 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:32.941 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:32.941 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:32.941 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:32.941 17:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:32.941 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:32.941 "name": "BaseBdev4", 00:17:32.941 "aliases": [ 00:17:32.941 "bfbf084d-a0bf-4087-92ac-b0e05257ccb5" 00:17:32.941 ], 00:17:32.941 "product_name": "Malloc disk", 00:17:32.941 "block_size": 512, 00:17:32.941 "num_blocks": 65536, 00:17:32.941 "uuid": "bfbf084d-a0bf-4087-92ac-b0e05257ccb5", 00:17:32.941 "assigned_rate_limits": { 00:17:32.941 "rw_ios_per_sec": 0, 00:17:32.941 "rw_mbytes_per_sec": 0, 00:17:32.941 "r_mbytes_per_sec": 0, 00:17:32.941 "w_mbytes_per_sec": 0 00:17:32.941 }, 00:17:32.941 "claimed": true, 00:17:32.941 "claim_type": "exclusive_write", 00:17:32.941 "zoned": false, 00:17:32.941 "supported_io_types": { 00:17:32.941 "read": true, 
00:17:32.941 "write": true, 00:17:32.941 "unmap": true, 00:17:32.941 "flush": true, 00:17:32.941 "reset": true, 00:17:32.941 "nvme_admin": false, 00:17:32.941 "nvme_io": false, 00:17:32.941 "nvme_io_md": false, 00:17:32.941 "write_zeroes": true, 00:17:32.941 "zcopy": true, 00:17:32.941 "get_zone_info": false, 00:17:32.941 "zone_management": false, 00:17:32.941 "zone_append": false, 00:17:32.941 "compare": false, 00:17:32.941 "compare_and_write": false, 00:17:32.941 "abort": true, 00:17:32.941 "seek_hole": false, 00:17:32.941 "seek_data": false, 00:17:32.941 "copy": true, 00:17:32.941 "nvme_iov_md": false 00:17:32.941 }, 00:17:32.941 "memory_domains": [ 00:17:32.941 { 00:17:32.941 "dma_device_id": "system", 00:17:32.941 "dma_device_type": 1 00:17:32.941 }, 00:17:32.941 { 00:17:32.941 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:32.941 "dma_device_type": 2 00:17:32.941 } 00:17:32.941 ], 00:17:32.941 "driver_specific": {} 00:17:32.941 }' 00:17:32.941 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:32.941 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:33.201 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:33.201 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:33.201 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:33.201 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:33.201 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:33.201 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:33.201 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:33.201 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:33.201 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:33.461 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:33.461 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:33.461 [2024-07-15 17:29:44.675739] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:33.461 [2024-07-15 17:29:44.675757] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:33.461 [2024-07-15 17:29:44.675791] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:33.461 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:33.461 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:17:33.461 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:33.461 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:33.461 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:17:33.461 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:17:33.461 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:33.461 17:29:44 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:17:33.461 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:33.462 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:33.462 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:33.462 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:33.462 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:33.462 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:33.462 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:33.462 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:33.462 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:33.722 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:33.722 "name": "Existed_Raid", 00:17:33.722 "uuid": "8648f312-62cd-41c1-9fa5-0ce04db84672", 00:17:33.722 "strip_size_kb": 64, 00:17:33.722 "state": "offline", 00:17:33.722 "raid_level": "concat", 00:17:33.722 "superblock": false, 00:17:33.722 "num_base_bdevs": 4, 00:17:33.722 "num_base_bdevs_discovered": 3, 00:17:33.722 "num_base_bdevs_operational": 3, 00:17:33.722 "base_bdevs_list": [ 00:17:33.722 { 00:17:33.722 "name": null, 00:17:33.722 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:33.722 "is_configured": false, 00:17:33.722 "data_offset": 0, 00:17:33.722 "data_size": 65536 00:17:33.722 }, 00:17:33.722 { 00:17:33.722 "name": "BaseBdev2", 00:17:33.722 "uuid": "82b08686-e701-4beb-992a-1f4ee7342da1", 00:17:33.722 "is_configured": true, 00:17:33.722 "data_offset": 0, 00:17:33.722 "data_size": 65536 00:17:33.722 }, 00:17:33.722 { 00:17:33.722 "name": "BaseBdev3", 00:17:33.722 "uuid": "870e3ea5-ffe6-4151-b5d0-23b95ae8adb7", 00:17:33.722 "is_configured": true, 00:17:33.722 "data_offset": 0, 00:17:33.722 "data_size": 65536 00:17:33.722 }, 00:17:33.722 { 00:17:33.722 "name": "BaseBdev4", 00:17:33.722 "uuid": "bfbf084d-a0bf-4087-92ac-b0e05257ccb5", 00:17:33.722 "is_configured": true, 00:17:33.722 "data_offset": 0, 00:17:33.722 "data_size": 65536 00:17:33.722 } 00:17:33.722 ] 00:17:33.722 }' 00:17:33.722 17:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:33.723 17:29:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:34.293 17:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:34.293 17:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:34.293 17:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:34.294 17:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:34.554 17:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:34.554 17:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' 
Existed_Raid '!=' Existed_Raid ']' 00:17:34.554 17:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:34.554 [2024-07-15 17:29:45.822656] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:34.554 17:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:34.554 17:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:34.554 17:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:34.554 17:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:34.814 17:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:34.814 17:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:34.814 17:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:35.075 [2024-07-15 17:29:46.213471] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:35.075 17:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:35.075 17:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:35.075 17:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:35.075 17:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:35.334 17:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:35.334 17:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:35.334 17:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:17:35.334 [2024-07-15 17:29:46.596186] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:17:35.334 [2024-07-15 17:29:46.596213] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x195bfc0 name Existed_Raid, state offline 00:17:35.335 17:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:35.335 17:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:35.335 17:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:35.335 17:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:35.595 17:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:35.595 17:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:35.595 17:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:17:35.595 17:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i 
= 1 )) 00:17:35.595 17:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:35.595 17:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:35.855 BaseBdev2 00:17:35.855 17:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:35.855 17:29:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:35.855 17:29:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:35.855 17:29:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:35.855 17:29:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:35.855 17:29:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:35.855 17:29:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:36.115 17:29:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:36.115 [ 00:17:36.115 { 00:17:36.115 "name": "BaseBdev2", 00:17:36.115 "aliases": [ 00:17:36.115 "d06ca452-3355-4090-93e1-a1d0e8900477" 00:17:36.115 ], 00:17:36.115 "product_name": "Malloc disk", 00:17:36.115 "block_size": 512, 00:17:36.115 "num_blocks": 65536, 00:17:36.115 "uuid": "d06ca452-3355-4090-93e1-a1d0e8900477", 00:17:36.115 "assigned_rate_limits": { 00:17:36.115 "rw_ios_per_sec": 0, 00:17:36.115 "rw_mbytes_per_sec": 0, 00:17:36.115 "r_mbytes_per_sec": 0, 00:17:36.115 "w_mbytes_per_sec": 0 00:17:36.115 }, 00:17:36.115 "claimed": false, 00:17:36.115 "zoned": false, 00:17:36.115 "supported_io_types": { 00:17:36.115 "read": true, 00:17:36.115 "write": true, 00:17:36.115 "unmap": true, 00:17:36.115 "flush": true, 00:17:36.115 "reset": true, 00:17:36.115 "nvme_admin": false, 00:17:36.115 "nvme_io": false, 00:17:36.115 "nvme_io_md": false, 00:17:36.115 "write_zeroes": true, 00:17:36.115 "zcopy": true, 00:17:36.115 "get_zone_info": false, 00:17:36.115 "zone_management": false, 00:17:36.115 "zone_append": false, 00:17:36.115 "compare": false, 00:17:36.115 "compare_and_write": false, 00:17:36.115 "abort": true, 00:17:36.115 "seek_hole": false, 00:17:36.115 "seek_data": false, 00:17:36.115 "copy": true, 00:17:36.115 "nvme_iov_md": false 00:17:36.115 }, 00:17:36.115 "memory_domains": [ 00:17:36.115 { 00:17:36.115 "dma_device_id": "system", 00:17:36.115 "dma_device_type": 1 00:17:36.115 }, 00:17:36.115 { 00:17:36.115 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:36.115 "dma_device_type": 2 00:17:36.115 } 00:17:36.115 ], 00:17:36.115 "driver_specific": {} 00:17:36.115 } 00:17:36.115 ] 00:17:36.115 17:29:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:36.115 17:29:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:36.115 17:29:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:36.115 17:29:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:36.376 BaseBdev3 00:17:36.376 17:29:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:36.376 17:29:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:36.376 17:29:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:36.376 17:29:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:36.376 17:29:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:36.376 17:29:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:36.376 17:29:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:36.636 17:29:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:36.636 [ 00:17:36.636 { 00:17:36.636 "name": "BaseBdev3", 00:17:36.636 "aliases": [ 00:17:36.636 "c2988524-0be6-43b7-bdb2-2b377060027f" 00:17:36.636 ], 00:17:36.636 "product_name": "Malloc disk", 00:17:36.636 "block_size": 512, 00:17:36.636 "num_blocks": 65536, 00:17:36.636 "uuid": "c2988524-0be6-43b7-bdb2-2b377060027f", 00:17:36.636 "assigned_rate_limits": { 00:17:36.636 "rw_ios_per_sec": 0, 00:17:36.636 "rw_mbytes_per_sec": 0, 00:17:36.636 "r_mbytes_per_sec": 0, 00:17:36.636 "w_mbytes_per_sec": 0 00:17:36.636 }, 00:17:36.636 "claimed": false, 00:17:36.636 "zoned": false, 00:17:36.636 "supported_io_types": { 00:17:36.636 "read": true, 00:17:36.636 "write": true, 00:17:36.636 "unmap": true, 00:17:36.636 "flush": true, 00:17:36.636 "reset": true, 00:17:36.636 "nvme_admin": false, 00:17:36.636 "nvme_io": false, 00:17:36.636 "nvme_io_md": false, 00:17:36.636 "write_zeroes": true, 00:17:36.636 "zcopy": true, 00:17:36.636 "get_zone_info": false, 00:17:36.636 "zone_management": false, 00:17:36.636 "zone_append": false, 00:17:36.636 "compare": false, 00:17:36.636 "compare_and_write": false, 00:17:36.636 "abort": true, 00:17:36.636 "seek_hole": false, 00:17:36.636 "seek_data": false, 00:17:36.636 "copy": true, 00:17:36.636 "nvme_iov_md": false 00:17:36.636 }, 00:17:36.636 "memory_domains": [ 00:17:36.636 { 00:17:36.636 "dma_device_id": "system", 00:17:36.636 "dma_device_type": 1 00:17:36.636 }, 00:17:36.636 { 00:17:36.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:36.636 "dma_device_type": 2 00:17:36.636 } 00:17:36.636 ], 00:17:36.636 "driver_specific": {} 00:17:36.636 } 00:17:36.636 ] 00:17:36.897 17:29:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:36.897 17:29:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:36.897 17:29:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:36.897 17:29:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:36.897 BaseBdev4 00:17:36.897 17:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:17:36.897 17:29:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # 
local bdev_name=BaseBdev4 00:17:36.897 17:29:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:36.897 17:29:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:36.897 17:29:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:36.897 17:29:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:36.897 17:29:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:37.158 17:29:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:37.429 [ 00:17:37.429 { 00:17:37.429 "name": "BaseBdev4", 00:17:37.429 "aliases": [ 00:17:37.429 "14334876-7048-46ff-8d4d-03fbfd165d40" 00:17:37.429 ], 00:17:37.429 "product_name": "Malloc disk", 00:17:37.429 "block_size": 512, 00:17:37.429 "num_blocks": 65536, 00:17:37.429 "uuid": "14334876-7048-46ff-8d4d-03fbfd165d40", 00:17:37.429 "assigned_rate_limits": { 00:17:37.429 "rw_ios_per_sec": 0, 00:17:37.429 "rw_mbytes_per_sec": 0, 00:17:37.429 "r_mbytes_per_sec": 0, 00:17:37.429 "w_mbytes_per_sec": 0 00:17:37.429 }, 00:17:37.429 "claimed": false, 00:17:37.429 "zoned": false, 00:17:37.429 "supported_io_types": { 00:17:37.429 "read": true, 00:17:37.429 "write": true, 00:17:37.429 "unmap": true, 00:17:37.429 "flush": true, 00:17:37.429 "reset": true, 00:17:37.429 "nvme_admin": false, 00:17:37.429 "nvme_io": false, 00:17:37.429 "nvme_io_md": false, 00:17:37.429 "write_zeroes": true, 00:17:37.429 "zcopy": true, 00:17:37.429 "get_zone_info": false, 00:17:37.429 "zone_management": false, 00:17:37.429 "zone_append": false, 00:17:37.429 "compare": false, 00:17:37.429 "compare_and_write": false, 00:17:37.429 "abort": true, 00:17:37.429 "seek_hole": false, 00:17:37.429 "seek_data": false, 00:17:37.429 "copy": true, 00:17:37.429 "nvme_iov_md": false 00:17:37.429 }, 00:17:37.429 "memory_domains": [ 00:17:37.429 { 00:17:37.429 "dma_device_id": "system", 00:17:37.429 "dma_device_type": 1 00:17:37.429 }, 00:17:37.429 { 00:17:37.429 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:37.429 "dma_device_type": 2 00:17:37.429 } 00:17:37.429 ], 00:17:37.429 "driver_specific": {} 00:17:37.429 } 00:17:37.429 ] 00:17:37.429 17:29:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:37.429 17:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:37.429 17:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:37.429 17:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:37.429 [2024-07-15 17:29:48.691437] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:37.429 [2024-07-15 17:29:48.691468] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:37.429 [2024-07-15 17:29:48.691480] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:37.429 [2024-07-15 17:29:48.692515] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:37.429 [2024-07-15 17:29:48.692548] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:37.429 17:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:37.429 17:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:37.429 17:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:37.429 17:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:37.429 17:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:37.429 17:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:37.429 17:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:37.429 17:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:37.429 17:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:37.429 17:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:37.429 17:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:37.429 17:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:37.689 17:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:37.689 "name": "Existed_Raid", 00:17:37.689 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:37.689 "strip_size_kb": 64, 00:17:37.689 "state": "configuring", 00:17:37.689 "raid_level": "concat", 00:17:37.689 "superblock": false, 00:17:37.689 "num_base_bdevs": 4, 00:17:37.689 "num_base_bdevs_discovered": 3, 00:17:37.689 "num_base_bdevs_operational": 4, 00:17:37.689 "base_bdevs_list": [ 00:17:37.689 { 00:17:37.689 "name": "BaseBdev1", 00:17:37.689 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:37.689 "is_configured": false, 00:17:37.689 "data_offset": 0, 00:17:37.689 "data_size": 0 00:17:37.689 }, 00:17:37.689 { 00:17:37.689 "name": "BaseBdev2", 00:17:37.689 "uuid": "d06ca452-3355-4090-93e1-a1d0e8900477", 00:17:37.689 "is_configured": true, 00:17:37.689 "data_offset": 0, 00:17:37.689 "data_size": 65536 00:17:37.689 }, 00:17:37.689 { 00:17:37.689 "name": "BaseBdev3", 00:17:37.689 "uuid": "c2988524-0be6-43b7-bdb2-2b377060027f", 00:17:37.689 "is_configured": true, 00:17:37.689 "data_offset": 0, 00:17:37.689 "data_size": 65536 00:17:37.689 }, 00:17:37.689 { 00:17:37.689 "name": "BaseBdev4", 00:17:37.689 "uuid": "14334876-7048-46ff-8d4d-03fbfd165d40", 00:17:37.689 "is_configured": true, 00:17:37.689 "data_offset": 0, 00:17:37.689 "data_size": 65536 00:17:37.689 } 00:17:37.689 ] 00:17:37.689 }' 00:17:37.689 17:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:37.689 17:29:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:38.261 17:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 
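A minimal sketch, assuming the same /var/tmp/spdk-raid.sock RPC socket used throughout this run, of the remove-then-verify pattern the trace repeats from here on (rpc.py abbreviates the full scripts/rpc.py path shown above; the sub-commands and the second jq filter are the ones already exercised in this log, while the .state extraction is only an illustrative shorthand):

    # drop one base bdev out of the concat raid and confirm the raid falls back to "configuring"
    rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2
    rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "Existed_Raid") | .state'      # expect: configuring
    rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
        | jq '.[0].base_bdevs_list[1].is_configured'                   # expect: false
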
00:17:38.522 [2024-07-15 17:29:49.625827] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:38.522 17:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:38.522 17:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:38.522 17:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:38.522 17:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:38.522 17:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:38.522 17:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:38.522 17:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:38.522 17:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:38.522 17:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:38.522 17:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:38.522 17:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:38.522 17:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:38.782 17:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:38.782 "name": "Existed_Raid", 00:17:38.782 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:38.782 "strip_size_kb": 64, 00:17:38.782 "state": "configuring", 00:17:38.782 "raid_level": "concat", 00:17:38.782 "superblock": false, 00:17:38.782 "num_base_bdevs": 4, 00:17:38.782 "num_base_bdevs_discovered": 2, 00:17:38.782 "num_base_bdevs_operational": 4, 00:17:38.782 "base_bdevs_list": [ 00:17:38.782 { 00:17:38.782 "name": "BaseBdev1", 00:17:38.782 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:38.782 "is_configured": false, 00:17:38.782 "data_offset": 0, 00:17:38.782 "data_size": 0 00:17:38.782 }, 00:17:38.782 { 00:17:38.782 "name": null, 00:17:38.782 "uuid": "d06ca452-3355-4090-93e1-a1d0e8900477", 00:17:38.782 "is_configured": false, 00:17:38.782 "data_offset": 0, 00:17:38.782 "data_size": 65536 00:17:38.782 }, 00:17:38.782 { 00:17:38.782 "name": "BaseBdev3", 00:17:38.782 "uuid": "c2988524-0be6-43b7-bdb2-2b377060027f", 00:17:38.782 "is_configured": true, 00:17:38.782 "data_offset": 0, 00:17:38.782 "data_size": 65536 00:17:38.782 }, 00:17:38.782 { 00:17:38.782 "name": "BaseBdev4", 00:17:38.782 "uuid": "14334876-7048-46ff-8d4d-03fbfd165d40", 00:17:38.782 "is_configured": true, 00:17:38.782 "data_offset": 0, 00:17:38.782 "data_size": 65536 00:17:38.782 } 00:17:38.782 ] 00:17:38.782 }' 00:17:38.782 17:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:38.783 17:29:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:39.354 17:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:39.354 17:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:17:39.354 17:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:39.354 17:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:39.615 [2024-07-15 17:29:50.753645] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:39.615 BaseBdev1 00:17:39.615 17:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:39.615 17:29:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:39.615 17:29:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:39.615 17:29:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:39.615 17:29:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:39.615 17:29:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:39.615 17:29:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:39.876 17:29:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:39.876 [ 00:17:39.876 { 00:17:39.876 "name": "BaseBdev1", 00:17:39.876 "aliases": [ 00:17:39.876 "fee8d97e-650c-4102-a610-fc052c44b197" 00:17:39.876 ], 00:17:39.876 "product_name": "Malloc disk", 00:17:39.876 "block_size": 512, 00:17:39.876 "num_blocks": 65536, 00:17:39.876 "uuid": "fee8d97e-650c-4102-a610-fc052c44b197", 00:17:39.876 "assigned_rate_limits": { 00:17:39.876 "rw_ios_per_sec": 0, 00:17:39.876 "rw_mbytes_per_sec": 0, 00:17:39.876 "r_mbytes_per_sec": 0, 00:17:39.876 "w_mbytes_per_sec": 0 00:17:39.876 }, 00:17:39.876 "claimed": true, 00:17:39.876 "claim_type": "exclusive_write", 00:17:39.876 "zoned": false, 00:17:39.876 "supported_io_types": { 00:17:39.876 "read": true, 00:17:39.876 "write": true, 00:17:39.876 "unmap": true, 00:17:39.876 "flush": true, 00:17:39.876 "reset": true, 00:17:39.876 "nvme_admin": false, 00:17:39.876 "nvme_io": false, 00:17:39.876 "nvme_io_md": false, 00:17:39.876 "write_zeroes": true, 00:17:39.876 "zcopy": true, 00:17:39.876 "get_zone_info": false, 00:17:39.876 "zone_management": false, 00:17:39.876 "zone_append": false, 00:17:39.876 "compare": false, 00:17:39.876 "compare_and_write": false, 00:17:39.876 "abort": true, 00:17:39.876 "seek_hole": false, 00:17:39.876 "seek_data": false, 00:17:39.876 "copy": true, 00:17:39.876 "nvme_iov_md": false 00:17:39.876 }, 00:17:39.876 "memory_domains": [ 00:17:39.876 { 00:17:39.876 "dma_device_id": "system", 00:17:39.876 "dma_device_type": 1 00:17:39.876 }, 00:17:39.876 { 00:17:39.876 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:39.876 "dma_device_type": 2 00:17:39.876 } 00:17:39.876 ], 00:17:39.876 "driver_specific": {} 00:17:39.876 } 00:17:39.876 ] 00:17:39.876 17:29:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:39.876 17:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:39.876 17:29:51 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:39.876 17:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:39.876 17:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:39.876 17:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:39.876 17:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:39.876 17:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:39.876 17:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:39.876 17:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:39.876 17:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:39.876 17:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:39.876 17:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:40.138 17:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:40.138 "name": "Existed_Raid", 00:17:40.138 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:40.138 "strip_size_kb": 64, 00:17:40.138 "state": "configuring", 00:17:40.138 "raid_level": "concat", 00:17:40.138 "superblock": false, 00:17:40.138 "num_base_bdevs": 4, 00:17:40.138 "num_base_bdevs_discovered": 3, 00:17:40.138 "num_base_bdevs_operational": 4, 00:17:40.138 "base_bdevs_list": [ 00:17:40.138 { 00:17:40.138 "name": "BaseBdev1", 00:17:40.138 "uuid": "fee8d97e-650c-4102-a610-fc052c44b197", 00:17:40.138 "is_configured": true, 00:17:40.138 "data_offset": 0, 00:17:40.138 "data_size": 65536 00:17:40.138 }, 00:17:40.138 { 00:17:40.138 "name": null, 00:17:40.138 "uuid": "d06ca452-3355-4090-93e1-a1d0e8900477", 00:17:40.138 "is_configured": false, 00:17:40.138 "data_offset": 0, 00:17:40.138 "data_size": 65536 00:17:40.138 }, 00:17:40.138 { 00:17:40.138 "name": "BaseBdev3", 00:17:40.138 "uuid": "c2988524-0be6-43b7-bdb2-2b377060027f", 00:17:40.138 "is_configured": true, 00:17:40.138 "data_offset": 0, 00:17:40.138 "data_size": 65536 00:17:40.138 }, 00:17:40.138 { 00:17:40.138 "name": "BaseBdev4", 00:17:40.138 "uuid": "14334876-7048-46ff-8d4d-03fbfd165d40", 00:17:40.138 "is_configured": true, 00:17:40.138 "data_offset": 0, 00:17:40.138 "data_size": 65536 00:17:40.138 } 00:17:40.138 ] 00:17:40.138 }' 00:17:40.138 17:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:40.138 17:29:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:40.707 17:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:40.707 17:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:40.967 17:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:40.967 17:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:41.227 [2024-07-15 17:29:52.265498] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:41.227 17:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:41.227 17:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:41.227 17:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:41.227 17:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:41.227 17:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:41.227 17:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:41.227 17:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:41.227 17:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:41.227 17:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:41.227 17:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:41.227 17:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:41.227 17:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:41.227 17:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:41.227 "name": "Existed_Raid", 00:17:41.227 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:41.227 "strip_size_kb": 64, 00:17:41.227 "state": "configuring", 00:17:41.227 "raid_level": "concat", 00:17:41.227 "superblock": false, 00:17:41.227 "num_base_bdevs": 4, 00:17:41.227 "num_base_bdevs_discovered": 2, 00:17:41.227 "num_base_bdevs_operational": 4, 00:17:41.227 "base_bdevs_list": [ 00:17:41.227 { 00:17:41.227 "name": "BaseBdev1", 00:17:41.227 "uuid": "fee8d97e-650c-4102-a610-fc052c44b197", 00:17:41.227 "is_configured": true, 00:17:41.227 "data_offset": 0, 00:17:41.227 "data_size": 65536 00:17:41.227 }, 00:17:41.227 { 00:17:41.227 "name": null, 00:17:41.227 "uuid": "d06ca452-3355-4090-93e1-a1d0e8900477", 00:17:41.227 "is_configured": false, 00:17:41.227 "data_offset": 0, 00:17:41.227 "data_size": 65536 00:17:41.227 }, 00:17:41.227 { 00:17:41.227 "name": null, 00:17:41.227 "uuid": "c2988524-0be6-43b7-bdb2-2b377060027f", 00:17:41.227 "is_configured": false, 00:17:41.227 "data_offset": 0, 00:17:41.227 "data_size": 65536 00:17:41.227 }, 00:17:41.227 { 00:17:41.227 "name": "BaseBdev4", 00:17:41.227 "uuid": "14334876-7048-46ff-8d4d-03fbfd165d40", 00:17:41.227 "is_configured": true, 00:17:41.227 "data_offset": 0, 00:17:41.227 "data_size": 65536 00:17:41.227 } 00:17:41.227 ] 00:17:41.227 }' 00:17:41.227 17:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:41.227 17:29:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:41.798 17:29:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:41.798 17:29:53 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:42.063 17:29:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:42.063 17:29:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:42.326 [2024-07-15 17:29:53.404422] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:42.326 17:29:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:42.326 17:29:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:42.326 17:29:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:42.326 17:29:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:42.326 17:29:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:42.326 17:29:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:42.326 17:29:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:42.326 17:29:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:42.326 17:29:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:42.326 17:29:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:42.326 17:29:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:42.326 17:29:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:42.326 17:29:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:42.326 "name": "Existed_Raid", 00:17:42.326 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:42.326 "strip_size_kb": 64, 00:17:42.326 "state": "configuring", 00:17:42.326 "raid_level": "concat", 00:17:42.326 "superblock": false, 00:17:42.326 "num_base_bdevs": 4, 00:17:42.326 "num_base_bdevs_discovered": 3, 00:17:42.326 "num_base_bdevs_operational": 4, 00:17:42.326 "base_bdevs_list": [ 00:17:42.326 { 00:17:42.326 "name": "BaseBdev1", 00:17:42.326 "uuid": "fee8d97e-650c-4102-a610-fc052c44b197", 00:17:42.326 "is_configured": true, 00:17:42.326 "data_offset": 0, 00:17:42.326 "data_size": 65536 00:17:42.326 }, 00:17:42.326 { 00:17:42.326 "name": null, 00:17:42.326 "uuid": "d06ca452-3355-4090-93e1-a1d0e8900477", 00:17:42.326 "is_configured": false, 00:17:42.326 "data_offset": 0, 00:17:42.326 "data_size": 65536 00:17:42.326 }, 00:17:42.326 { 00:17:42.326 "name": "BaseBdev3", 00:17:42.326 "uuid": "c2988524-0be6-43b7-bdb2-2b377060027f", 00:17:42.326 "is_configured": true, 00:17:42.326 "data_offset": 0, 00:17:42.326 "data_size": 65536 00:17:42.326 }, 00:17:42.326 { 00:17:42.326 "name": "BaseBdev4", 00:17:42.326 "uuid": "14334876-7048-46ff-8d4d-03fbfd165d40", 00:17:42.326 "is_configured": true, 00:17:42.326 "data_offset": 0, 00:17:42.326 "data_size": 65536 00:17:42.326 } 00:17:42.326 ] 00:17:42.326 }' 00:17:42.326 17:29:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:17:42.326 17:29:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:42.895 17:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:42.895 17:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:43.154 17:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:43.154 17:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:43.724 [2024-07-15 17:29:54.852114] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:43.724 17:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:43.724 17:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:43.724 17:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:43.724 17:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:43.724 17:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:43.724 17:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:43.724 17:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:43.724 17:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:43.724 17:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:43.724 17:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:43.724 17:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:43.724 17:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:43.984 17:29:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:43.984 "name": "Existed_Raid", 00:17:43.984 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:43.984 "strip_size_kb": 64, 00:17:43.984 "state": "configuring", 00:17:43.984 "raid_level": "concat", 00:17:43.984 "superblock": false, 00:17:43.984 "num_base_bdevs": 4, 00:17:43.984 "num_base_bdevs_discovered": 2, 00:17:43.984 "num_base_bdevs_operational": 4, 00:17:43.984 "base_bdevs_list": [ 00:17:43.984 { 00:17:43.984 "name": null, 00:17:43.984 "uuid": "fee8d97e-650c-4102-a610-fc052c44b197", 00:17:43.984 "is_configured": false, 00:17:43.984 "data_offset": 0, 00:17:43.984 "data_size": 65536 00:17:43.984 }, 00:17:43.984 { 00:17:43.984 "name": null, 00:17:43.984 "uuid": "d06ca452-3355-4090-93e1-a1d0e8900477", 00:17:43.984 "is_configured": false, 00:17:43.984 "data_offset": 0, 00:17:43.984 "data_size": 65536 00:17:43.984 }, 00:17:43.984 { 00:17:43.984 "name": "BaseBdev3", 00:17:43.984 "uuid": "c2988524-0be6-43b7-bdb2-2b377060027f", 00:17:43.984 "is_configured": true, 00:17:43.984 "data_offset": 0, 00:17:43.984 "data_size": 65536 00:17:43.984 }, 00:17:43.984 { 
00:17:43.984 "name": "BaseBdev4", 00:17:43.984 "uuid": "14334876-7048-46ff-8d4d-03fbfd165d40", 00:17:43.984 "is_configured": true, 00:17:43.984 "data_offset": 0, 00:17:43.984 "data_size": 65536 00:17:43.984 } 00:17:43.984 ] 00:17:43.984 }' 00:17:43.984 17:29:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:43.984 17:29:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:44.553 17:29:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:44.553 17:29:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:44.553 17:29:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:44.553 17:29:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:44.859 [2024-07-15 17:29:55.992772] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:44.859 17:29:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:44.859 17:29:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:44.859 17:29:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:44.859 17:29:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:44.859 17:29:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:44.859 17:29:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:44.859 17:29:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:44.859 17:29:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:44.859 17:29:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:44.859 17:29:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:44.859 17:29:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:44.859 17:29:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:45.125 17:29:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:45.125 "name": "Existed_Raid", 00:17:45.125 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:45.125 "strip_size_kb": 64, 00:17:45.125 "state": "configuring", 00:17:45.125 "raid_level": "concat", 00:17:45.125 "superblock": false, 00:17:45.125 "num_base_bdevs": 4, 00:17:45.125 "num_base_bdevs_discovered": 3, 00:17:45.125 "num_base_bdevs_operational": 4, 00:17:45.125 "base_bdevs_list": [ 00:17:45.125 { 00:17:45.125 "name": null, 00:17:45.125 "uuid": "fee8d97e-650c-4102-a610-fc052c44b197", 00:17:45.125 "is_configured": false, 00:17:45.125 "data_offset": 0, 00:17:45.125 "data_size": 65536 00:17:45.125 }, 00:17:45.125 { 00:17:45.125 "name": "BaseBdev2", 00:17:45.125 "uuid": 
"d06ca452-3355-4090-93e1-a1d0e8900477", 00:17:45.125 "is_configured": true, 00:17:45.125 "data_offset": 0, 00:17:45.125 "data_size": 65536 00:17:45.125 }, 00:17:45.125 { 00:17:45.125 "name": "BaseBdev3", 00:17:45.125 "uuid": "c2988524-0be6-43b7-bdb2-2b377060027f", 00:17:45.125 "is_configured": true, 00:17:45.125 "data_offset": 0, 00:17:45.125 "data_size": 65536 00:17:45.125 }, 00:17:45.125 { 00:17:45.125 "name": "BaseBdev4", 00:17:45.125 "uuid": "14334876-7048-46ff-8d4d-03fbfd165d40", 00:17:45.125 "is_configured": true, 00:17:45.125 "data_offset": 0, 00:17:45.125 "data_size": 65536 00:17:45.125 } 00:17:45.125 ] 00:17:45.125 }' 00:17:45.125 17:29:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:45.125 17:29:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:45.696 17:29:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:45.696 17:29:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:45.696 17:29:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:45.696 17:29:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:45.696 17:29:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:45.955 17:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u fee8d97e-650c-4102-a610-fc052c44b197 00:17:46.215 [2024-07-15 17:29:57.345212] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:46.215 [2024-07-15 17:29:57.345238] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x195bba0 00:17:46.215 [2024-07-15 17:29:57.345242] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:17:46.215 [2024-07-15 17:29:57.345391] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1960bd0 00:17:46.215 [2024-07-15 17:29:57.345483] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x195bba0 00:17:46.215 [2024-07-15 17:29:57.345488] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x195bba0 00:17:46.215 [2024-07-15 17:29:57.345607] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:46.215 NewBaseBdev 00:17:46.215 17:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:46.215 17:29:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:17:46.215 17:29:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:46.215 17:29:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:46.215 17:29:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:46.215 17:29:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:46.215 17:29:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:46.475 17:29:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:46.475 [ 00:17:46.475 { 00:17:46.475 "name": "NewBaseBdev", 00:17:46.475 "aliases": [ 00:17:46.475 "fee8d97e-650c-4102-a610-fc052c44b197" 00:17:46.475 ], 00:17:46.475 "product_name": "Malloc disk", 00:17:46.475 "block_size": 512, 00:17:46.475 "num_blocks": 65536, 00:17:46.475 "uuid": "fee8d97e-650c-4102-a610-fc052c44b197", 00:17:46.475 "assigned_rate_limits": { 00:17:46.475 "rw_ios_per_sec": 0, 00:17:46.475 "rw_mbytes_per_sec": 0, 00:17:46.475 "r_mbytes_per_sec": 0, 00:17:46.475 "w_mbytes_per_sec": 0 00:17:46.475 }, 00:17:46.475 "claimed": true, 00:17:46.475 "claim_type": "exclusive_write", 00:17:46.475 "zoned": false, 00:17:46.475 "supported_io_types": { 00:17:46.475 "read": true, 00:17:46.475 "write": true, 00:17:46.475 "unmap": true, 00:17:46.475 "flush": true, 00:17:46.475 "reset": true, 00:17:46.475 "nvme_admin": false, 00:17:46.475 "nvme_io": false, 00:17:46.475 "nvme_io_md": false, 00:17:46.475 "write_zeroes": true, 00:17:46.475 "zcopy": true, 00:17:46.475 "get_zone_info": false, 00:17:46.475 "zone_management": false, 00:17:46.475 "zone_append": false, 00:17:46.475 "compare": false, 00:17:46.475 "compare_and_write": false, 00:17:46.475 "abort": true, 00:17:46.475 "seek_hole": false, 00:17:46.475 "seek_data": false, 00:17:46.475 "copy": true, 00:17:46.475 "nvme_iov_md": false 00:17:46.475 }, 00:17:46.475 "memory_domains": [ 00:17:46.475 { 00:17:46.475 "dma_device_id": "system", 00:17:46.475 "dma_device_type": 1 00:17:46.475 }, 00:17:46.475 { 00:17:46.475 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.475 "dma_device_type": 2 00:17:46.475 } 00:17:46.475 ], 00:17:46.475 "driver_specific": {} 00:17:46.475 } 00:17:46.475 ] 00:17:46.475 17:29:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:46.475 17:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:17:46.475 17:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:46.475 17:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:46.475 17:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:46.475 17:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:46.475 17:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:46.475 17:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:46.475 17:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:46.475 17:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:46.475 17:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:46.475 17:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:46.475 17:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:17:46.735 17:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:46.735 "name": "Existed_Raid", 00:17:46.735 "uuid": "f6d8cf56-84e6-4f4c-95e9-f0d5a303d30d", 00:17:46.735 "strip_size_kb": 64, 00:17:46.735 "state": "online", 00:17:46.735 "raid_level": "concat", 00:17:46.735 "superblock": false, 00:17:46.735 "num_base_bdevs": 4, 00:17:46.735 "num_base_bdevs_discovered": 4, 00:17:46.735 "num_base_bdevs_operational": 4, 00:17:46.735 "base_bdevs_list": [ 00:17:46.735 { 00:17:46.735 "name": "NewBaseBdev", 00:17:46.735 "uuid": "fee8d97e-650c-4102-a610-fc052c44b197", 00:17:46.735 "is_configured": true, 00:17:46.735 "data_offset": 0, 00:17:46.735 "data_size": 65536 00:17:46.735 }, 00:17:46.735 { 00:17:46.735 "name": "BaseBdev2", 00:17:46.735 "uuid": "d06ca452-3355-4090-93e1-a1d0e8900477", 00:17:46.735 "is_configured": true, 00:17:46.735 "data_offset": 0, 00:17:46.735 "data_size": 65536 00:17:46.735 }, 00:17:46.735 { 00:17:46.735 "name": "BaseBdev3", 00:17:46.735 "uuid": "c2988524-0be6-43b7-bdb2-2b377060027f", 00:17:46.735 "is_configured": true, 00:17:46.735 "data_offset": 0, 00:17:46.735 "data_size": 65536 00:17:46.735 }, 00:17:46.735 { 00:17:46.735 "name": "BaseBdev4", 00:17:46.735 "uuid": "14334876-7048-46ff-8d4d-03fbfd165d40", 00:17:46.735 "is_configured": true, 00:17:46.735 "data_offset": 0, 00:17:46.735 "data_size": 65536 00:17:46.735 } 00:17:46.735 ] 00:17:46.735 }' 00:17:46.735 17:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:46.735 17:29:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:47.306 17:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:47.306 17:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:47.306 17:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:47.306 17:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:47.306 17:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:47.306 17:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:47.306 17:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:47.306 17:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:47.630 [2024-07-15 17:29:58.668822] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:47.630 17:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:47.630 "name": "Existed_Raid", 00:17:47.630 "aliases": [ 00:17:47.630 "f6d8cf56-84e6-4f4c-95e9-f0d5a303d30d" 00:17:47.630 ], 00:17:47.630 "product_name": "Raid Volume", 00:17:47.630 "block_size": 512, 00:17:47.630 "num_blocks": 262144, 00:17:47.630 "uuid": "f6d8cf56-84e6-4f4c-95e9-f0d5a303d30d", 00:17:47.630 "assigned_rate_limits": { 00:17:47.630 "rw_ios_per_sec": 0, 00:17:47.630 "rw_mbytes_per_sec": 0, 00:17:47.630 "r_mbytes_per_sec": 0, 00:17:47.630 "w_mbytes_per_sec": 0 00:17:47.630 }, 00:17:47.630 "claimed": false, 00:17:47.630 "zoned": false, 00:17:47.630 "supported_io_types": { 00:17:47.630 "read": true, 00:17:47.630 "write": true, 00:17:47.630 "unmap": true, 
00:17:47.630 "flush": true, 00:17:47.630 "reset": true, 00:17:47.630 "nvme_admin": false, 00:17:47.630 "nvme_io": false, 00:17:47.630 "nvme_io_md": false, 00:17:47.630 "write_zeroes": true, 00:17:47.630 "zcopy": false, 00:17:47.630 "get_zone_info": false, 00:17:47.630 "zone_management": false, 00:17:47.630 "zone_append": false, 00:17:47.630 "compare": false, 00:17:47.630 "compare_and_write": false, 00:17:47.630 "abort": false, 00:17:47.630 "seek_hole": false, 00:17:47.630 "seek_data": false, 00:17:47.630 "copy": false, 00:17:47.630 "nvme_iov_md": false 00:17:47.630 }, 00:17:47.630 "memory_domains": [ 00:17:47.630 { 00:17:47.630 "dma_device_id": "system", 00:17:47.630 "dma_device_type": 1 00:17:47.630 }, 00:17:47.630 { 00:17:47.630 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.630 "dma_device_type": 2 00:17:47.630 }, 00:17:47.630 { 00:17:47.630 "dma_device_id": "system", 00:17:47.630 "dma_device_type": 1 00:17:47.630 }, 00:17:47.630 { 00:17:47.630 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.630 "dma_device_type": 2 00:17:47.630 }, 00:17:47.630 { 00:17:47.630 "dma_device_id": "system", 00:17:47.631 "dma_device_type": 1 00:17:47.631 }, 00:17:47.631 { 00:17:47.631 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.631 "dma_device_type": 2 00:17:47.631 }, 00:17:47.631 { 00:17:47.631 "dma_device_id": "system", 00:17:47.631 "dma_device_type": 1 00:17:47.631 }, 00:17:47.631 { 00:17:47.631 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.631 "dma_device_type": 2 00:17:47.631 } 00:17:47.631 ], 00:17:47.631 "driver_specific": { 00:17:47.631 "raid": { 00:17:47.631 "uuid": "f6d8cf56-84e6-4f4c-95e9-f0d5a303d30d", 00:17:47.631 "strip_size_kb": 64, 00:17:47.631 "state": "online", 00:17:47.631 "raid_level": "concat", 00:17:47.631 "superblock": false, 00:17:47.631 "num_base_bdevs": 4, 00:17:47.631 "num_base_bdevs_discovered": 4, 00:17:47.631 "num_base_bdevs_operational": 4, 00:17:47.631 "base_bdevs_list": [ 00:17:47.631 { 00:17:47.631 "name": "NewBaseBdev", 00:17:47.631 "uuid": "fee8d97e-650c-4102-a610-fc052c44b197", 00:17:47.631 "is_configured": true, 00:17:47.631 "data_offset": 0, 00:17:47.631 "data_size": 65536 00:17:47.631 }, 00:17:47.631 { 00:17:47.631 "name": "BaseBdev2", 00:17:47.631 "uuid": "d06ca452-3355-4090-93e1-a1d0e8900477", 00:17:47.631 "is_configured": true, 00:17:47.631 "data_offset": 0, 00:17:47.631 "data_size": 65536 00:17:47.631 }, 00:17:47.631 { 00:17:47.631 "name": "BaseBdev3", 00:17:47.631 "uuid": "c2988524-0be6-43b7-bdb2-2b377060027f", 00:17:47.631 "is_configured": true, 00:17:47.631 "data_offset": 0, 00:17:47.631 "data_size": 65536 00:17:47.631 }, 00:17:47.631 { 00:17:47.631 "name": "BaseBdev4", 00:17:47.631 "uuid": "14334876-7048-46ff-8d4d-03fbfd165d40", 00:17:47.631 "is_configured": true, 00:17:47.631 "data_offset": 0, 00:17:47.631 "data_size": 65536 00:17:47.631 } 00:17:47.631 ] 00:17:47.631 } 00:17:47.631 } 00:17:47.631 }' 00:17:47.631 17:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:47.631 17:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:47.631 BaseBdev2 00:17:47.631 BaseBdev3 00:17:47.631 BaseBdev4' 00:17:47.631 17:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:47.631 17:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b NewBaseBdev 00:17:47.631 17:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:47.631 17:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:47.631 "name": "NewBaseBdev", 00:17:47.631 "aliases": [ 00:17:47.631 "fee8d97e-650c-4102-a610-fc052c44b197" 00:17:47.631 ], 00:17:47.631 "product_name": "Malloc disk", 00:17:47.631 "block_size": 512, 00:17:47.631 "num_blocks": 65536, 00:17:47.631 "uuid": "fee8d97e-650c-4102-a610-fc052c44b197", 00:17:47.631 "assigned_rate_limits": { 00:17:47.631 "rw_ios_per_sec": 0, 00:17:47.631 "rw_mbytes_per_sec": 0, 00:17:47.631 "r_mbytes_per_sec": 0, 00:17:47.631 "w_mbytes_per_sec": 0 00:17:47.631 }, 00:17:47.631 "claimed": true, 00:17:47.631 "claim_type": "exclusive_write", 00:17:47.631 "zoned": false, 00:17:47.631 "supported_io_types": { 00:17:47.631 "read": true, 00:17:47.631 "write": true, 00:17:47.631 "unmap": true, 00:17:47.631 "flush": true, 00:17:47.631 "reset": true, 00:17:47.631 "nvme_admin": false, 00:17:47.631 "nvme_io": false, 00:17:47.631 "nvme_io_md": false, 00:17:47.631 "write_zeroes": true, 00:17:47.631 "zcopy": true, 00:17:47.631 "get_zone_info": false, 00:17:47.631 "zone_management": false, 00:17:47.631 "zone_append": false, 00:17:47.631 "compare": false, 00:17:47.631 "compare_and_write": false, 00:17:47.631 "abort": true, 00:17:47.631 "seek_hole": false, 00:17:47.631 "seek_data": false, 00:17:47.631 "copy": true, 00:17:47.631 "nvme_iov_md": false 00:17:47.631 }, 00:17:47.631 "memory_domains": [ 00:17:47.631 { 00:17:47.631 "dma_device_id": "system", 00:17:47.631 "dma_device_type": 1 00:17:47.631 }, 00:17:47.631 { 00:17:47.631 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.631 "dma_device_type": 2 00:17:47.631 } 00:17:47.631 ], 00:17:47.631 "driver_specific": {} 00:17:47.631 }' 00:17:47.631 17:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:47.890 17:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:47.890 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:47.890 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:47.890 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:47.890 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:47.890 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:47.890 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:47.890 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:47.890 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:48.149 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:48.149 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:48.149 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:48.149 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:48.149 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:48.409 17:29:59 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:48.409 "name": "BaseBdev2", 00:17:48.409 "aliases": [ 00:17:48.409 "d06ca452-3355-4090-93e1-a1d0e8900477" 00:17:48.409 ], 00:17:48.409 "product_name": "Malloc disk", 00:17:48.409 "block_size": 512, 00:17:48.409 "num_blocks": 65536, 00:17:48.409 "uuid": "d06ca452-3355-4090-93e1-a1d0e8900477", 00:17:48.409 "assigned_rate_limits": { 00:17:48.409 "rw_ios_per_sec": 0, 00:17:48.409 "rw_mbytes_per_sec": 0, 00:17:48.409 "r_mbytes_per_sec": 0, 00:17:48.409 "w_mbytes_per_sec": 0 00:17:48.409 }, 00:17:48.409 "claimed": true, 00:17:48.409 "claim_type": "exclusive_write", 00:17:48.409 "zoned": false, 00:17:48.409 "supported_io_types": { 00:17:48.409 "read": true, 00:17:48.409 "write": true, 00:17:48.409 "unmap": true, 00:17:48.409 "flush": true, 00:17:48.409 "reset": true, 00:17:48.409 "nvme_admin": false, 00:17:48.409 "nvme_io": false, 00:17:48.409 "nvme_io_md": false, 00:17:48.409 "write_zeroes": true, 00:17:48.409 "zcopy": true, 00:17:48.409 "get_zone_info": false, 00:17:48.409 "zone_management": false, 00:17:48.409 "zone_append": false, 00:17:48.409 "compare": false, 00:17:48.409 "compare_and_write": false, 00:17:48.409 "abort": true, 00:17:48.409 "seek_hole": false, 00:17:48.409 "seek_data": false, 00:17:48.409 "copy": true, 00:17:48.409 "nvme_iov_md": false 00:17:48.409 }, 00:17:48.409 "memory_domains": [ 00:17:48.409 { 00:17:48.409 "dma_device_id": "system", 00:17:48.409 "dma_device_type": 1 00:17:48.409 }, 00:17:48.409 { 00:17:48.409 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:48.409 "dma_device_type": 2 00:17:48.409 } 00:17:48.409 ], 00:17:48.409 "driver_specific": {} 00:17:48.409 }' 00:17:48.409 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:48.409 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:48.409 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:48.409 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:48.409 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:48.409 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:48.409 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:48.409 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:48.409 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:48.668 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:48.668 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:48.668 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:48.668 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:48.668 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:48.668 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:48.927 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:48.927 "name": "BaseBdev3", 00:17:48.927 "aliases": [ 00:17:48.927 
"c2988524-0be6-43b7-bdb2-2b377060027f" 00:17:48.927 ], 00:17:48.927 "product_name": "Malloc disk", 00:17:48.927 "block_size": 512, 00:17:48.927 "num_blocks": 65536, 00:17:48.927 "uuid": "c2988524-0be6-43b7-bdb2-2b377060027f", 00:17:48.927 "assigned_rate_limits": { 00:17:48.927 "rw_ios_per_sec": 0, 00:17:48.927 "rw_mbytes_per_sec": 0, 00:17:48.927 "r_mbytes_per_sec": 0, 00:17:48.927 "w_mbytes_per_sec": 0 00:17:48.927 }, 00:17:48.927 "claimed": true, 00:17:48.927 "claim_type": "exclusive_write", 00:17:48.927 "zoned": false, 00:17:48.927 "supported_io_types": { 00:17:48.927 "read": true, 00:17:48.927 "write": true, 00:17:48.927 "unmap": true, 00:17:48.927 "flush": true, 00:17:48.927 "reset": true, 00:17:48.927 "nvme_admin": false, 00:17:48.927 "nvme_io": false, 00:17:48.927 "nvme_io_md": false, 00:17:48.927 "write_zeroes": true, 00:17:48.927 "zcopy": true, 00:17:48.927 "get_zone_info": false, 00:17:48.927 "zone_management": false, 00:17:48.927 "zone_append": false, 00:17:48.927 "compare": false, 00:17:48.927 "compare_and_write": false, 00:17:48.927 "abort": true, 00:17:48.927 "seek_hole": false, 00:17:48.927 "seek_data": false, 00:17:48.927 "copy": true, 00:17:48.927 "nvme_iov_md": false 00:17:48.927 }, 00:17:48.927 "memory_domains": [ 00:17:48.927 { 00:17:48.927 "dma_device_id": "system", 00:17:48.927 "dma_device_type": 1 00:17:48.927 }, 00:17:48.927 { 00:17:48.927 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:48.927 "dma_device_type": 2 00:17:48.927 } 00:17:48.927 ], 00:17:48.927 "driver_specific": {} 00:17:48.927 }' 00:17:48.927 17:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:48.927 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:48.927 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:48.927 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:48.927 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:48.927 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:48.927 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:48.927 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:49.186 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:49.186 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:49.186 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:49.186 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:49.186 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:49.186 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:49.186 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:49.444 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:49.444 "name": "BaseBdev4", 00:17:49.444 "aliases": [ 00:17:49.444 "14334876-7048-46ff-8d4d-03fbfd165d40" 00:17:49.444 ], 00:17:49.444 "product_name": "Malloc disk", 00:17:49.444 "block_size": 512, 00:17:49.444 "num_blocks": 65536, 00:17:49.444 
"uuid": "14334876-7048-46ff-8d4d-03fbfd165d40", 00:17:49.444 "assigned_rate_limits": { 00:17:49.444 "rw_ios_per_sec": 0, 00:17:49.444 "rw_mbytes_per_sec": 0, 00:17:49.444 "r_mbytes_per_sec": 0, 00:17:49.444 "w_mbytes_per_sec": 0 00:17:49.444 }, 00:17:49.444 "claimed": true, 00:17:49.444 "claim_type": "exclusive_write", 00:17:49.444 "zoned": false, 00:17:49.444 "supported_io_types": { 00:17:49.444 "read": true, 00:17:49.444 "write": true, 00:17:49.444 "unmap": true, 00:17:49.444 "flush": true, 00:17:49.444 "reset": true, 00:17:49.444 "nvme_admin": false, 00:17:49.444 "nvme_io": false, 00:17:49.444 "nvme_io_md": false, 00:17:49.444 "write_zeroes": true, 00:17:49.444 "zcopy": true, 00:17:49.444 "get_zone_info": false, 00:17:49.444 "zone_management": false, 00:17:49.444 "zone_append": false, 00:17:49.444 "compare": false, 00:17:49.444 "compare_and_write": false, 00:17:49.444 "abort": true, 00:17:49.444 "seek_hole": false, 00:17:49.444 "seek_data": false, 00:17:49.444 "copy": true, 00:17:49.444 "nvme_iov_md": false 00:17:49.444 }, 00:17:49.444 "memory_domains": [ 00:17:49.444 { 00:17:49.444 "dma_device_id": "system", 00:17:49.444 "dma_device_type": 1 00:17:49.444 }, 00:17:49.444 { 00:17:49.444 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:49.444 "dma_device_type": 2 00:17:49.444 } 00:17:49.444 ], 00:17:49.444 "driver_specific": {} 00:17:49.444 }' 00:17:49.444 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:49.444 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:49.444 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:49.444 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:49.444 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:49.444 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:49.445 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:49.704 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:49.704 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:49.704 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:49.704 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:49.704 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:49.704 17:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:49.963 [2024-07-15 17:30:01.058621] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:49.963 [2024-07-15 17:30:01.058640] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:49.963 [2024-07-15 17:30:01.058675] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:49.963 [2024-07-15 17:30:01.058722] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:49.963 [2024-07-15 17:30:01.058733] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x195bba0 name Existed_Raid, state offline 00:17:49.963 17:30:01 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@341 -- # killprocess 2822421 00:17:49.963 17:30:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2822421 ']' 00:17:49.963 17:30:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2822421 00:17:49.963 17:30:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:17:49.963 17:30:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:49.963 17:30:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2822421 00:17:49.963 17:30:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:49.963 17:30:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:49.963 17:30:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2822421' 00:17:49.963 killing process with pid 2822421 00:17:49.963 17:30:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2822421 00:17:49.963 [2024-07-15 17:30:01.130839] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:49.963 17:30:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2822421 00:17:49.963 [2024-07-15 17:30:01.151234] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:17:50.223 00:17:50.223 real 0m27.520s 00:17:50.223 user 0m51.615s 00:17:50.223 sys 0m4.065s 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:50.223 ************************************ 00:17:50.223 END TEST raid_state_function_test 00:17:50.223 ************************************ 00:17:50.223 17:30:01 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:50.223 17:30:01 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:17:50.223 17:30:01 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:50.223 17:30:01 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:50.223 17:30:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:50.223 ************************************ 00:17:50.223 START TEST raid_state_function_test_sb 00:17:50.223 ************************************ 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 true 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:50.223 
17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:50.223 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:17:50.224 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:17:50.224 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:17:50.224 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:17:50.224 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:17:50.224 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2827762 00:17:50.224 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2827762' 00:17:50.224 Process raid pid: 2827762 00:17:50.224 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2827762 /var/tmp/spdk-raid.sock 00:17:50.224 17:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:50.224 17:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2827762 ']' 00:17:50.224 17:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:50.224 17:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:50.224 17:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 
'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:50.224 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:50.224 17:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:50.224 17:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:50.224 [2024-07-15 17:30:01.427666] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:17:50.224 [2024-07-15 17:30:01.427748] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:50.224 [2024-07-15 17:30:01.518303] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:50.482 [2024-07-15 17:30:01.588000] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:50.482 [2024-07-15 17:30:01.641354] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:50.482 [2024-07-15 17:30:01.641379] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:51.050 17:30:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:51.050 17:30:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:17:51.050 17:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:51.309 [2024-07-15 17:30:02.429575] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:51.309 [2024-07-15 17:30:02.429606] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:51.309 [2024-07-15 17:30:02.429613] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:51.309 [2024-07-15 17:30:02.429619] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:51.309 [2024-07-15 17:30:02.429623] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:51.309 [2024-07-15 17:30:02.429629] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:51.309 [2024-07-15 17:30:02.429633] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:51.309 [2024-07-15 17:30:02.429638] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:51.309 17:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:51.309 17:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:51.309 17:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:51.309 17:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:51.309 17:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:51.309 17:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
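For reference while reading the trace above: the create step it records reduces to the RPC call sketched below. This is a minimal manual replay, not part of the test run; it reuses only the rpc.py path, socket, strip size, raid level and bdev names that appear in this log, and assumes the test's bdev_svc app is still listening on that socket.
    # Hypothetical manual replay of the bdev_raid_create step traced above.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    # Create a 4-disk concat array with a 64 KiB strip size and an on-disk
    # superblock (-s). None of the base bdevs exist yet, so the array is
    # registered in the "configuring" state, as the verify step below checks.
    $rpc -s $sock bdev_raid_create -z 64 -s -r concat \
        -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid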
00:17:51.309 17:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:51.309 17:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:51.309 17:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:51.309 17:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:51.309 17:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:51.309 17:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:51.568 17:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:51.568 "name": "Existed_Raid", 00:17:51.568 "uuid": "569e82a6-1251-4c76-b001-053874fc57f9", 00:17:51.568 "strip_size_kb": 64, 00:17:51.568 "state": "configuring", 00:17:51.568 "raid_level": "concat", 00:17:51.568 "superblock": true, 00:17:51.568 "num_base_bdevs": 4, 00:17:51.568 "num_base_bdevs_discovered": 0, 00:17:51.568 "num_base_bdevs_operational": 4, 00:17:51.568 "base_bdevs_list": [ 00:17:51.568 { 00:17:51.568 "name": "BaseBdev1", 00:17:51.568 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:51.568 "is_configured": false, 00:17:51.568 "data_offset": 0, 00:17:51.568 "data_size": 0 00:17:51.568 }, 00:17:51.568 { 00:17:51.568 "name": "BaseBdev2", 00:17:51.568 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:51.568 "is_configured": false, 00:17:51.569 "data_offset": 0, 00:17:51.569 "data_size": 0 00:17:51.569 }, 00:17:51.569 { 00:17:51.569 "name": "BaseBdev3", 00:17:51.569 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:51.569 "is_configured": false, 00:17:51.569 "data_offset": 0, 00:17:51.569 "data_size": 0 00:17:51.569 }, 00:17:51.569 { 00:17:51.569 "name": "BaseBdev4", 00:17:51.569 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:51.569 "is_configured": false, 00:17:51.569 "data_offset": 0, 00:17:51.569 "data_size": 0 00:17:51.569 } 00:17:51.569 ] 00:17:51.569 }' 00:17:51.569 17:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:51.569 17:30:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:52.138 17:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:52.138 [2024-07-15 17:30:03.419962] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:52.138 [2024-07-15 17:30:03.419981] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26c46f0 name Existed_Raid, state configuring 00:17:52.398 17:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:52.398 [2024-07-15 17:30:03.616476] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:52.398 [2024-07-15 17:30:03.616491] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:52.398 [2024-07-15 17:30:03.616496] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:17:52.398 [2024-07-15 17:30:03.616501] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:52.398 [2024-07-15 17:30:03.616506] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:52.398 [2024-07-15 17:30:03.616512] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:52.398 [2024-07-15 17:30:03.616516] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:52.398 [2024-07-15 17:30:03.616521] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:52.398 17:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:52.658 [2024-07-15 17:30:03.807538] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:52.658 BaseBdev1 00:17:52.658 17:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:52.658 17:30:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:52.658 17:30:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:52.658 17:30:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:52.658 17:30:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:52.658 17:30:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:52.658 17:30:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:52.918 17:30:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:52.918 [ 00:17:52.918 { 00:17:52.918 "name": "BaseBdev1", 00:17:52.918 "aliases": [ 00:17:52.918 "6011b881-e17e-4f89-9245-aec10eb86d2b" 00:17:52.918 ], 00:17:52.918 "product_name": "Malloc disk", 00:17:52.918 "block_size": 512, 00:17:52.918 "num_blocks": 65536, 00:17:52.918 "uuid": "6011b881-e17e-4f89-9245-aec10eb86d2b", 00:17:52.918 "assigned_rate_limits": { 00:17:52.918 "rw_ios_per_sec": 0, 00:17:52.918 "rw_mbytes_per_sec": 0, 00:17:52.918 "r_mbytes_per_sec": 0, 00:17:52.918 "w_mbytes_per_sec": 0 00:17:52.918 }, 00:17:52.919 "claimed": true, 00:17:52.919 "claim_type": "exclusive_write", 00:17:52.919 "zoned": false, 00:17:52.919 "supported_io_types": { 00:17:52.919 "read": true, 00:17:52.919 "write": true, 00:17:52.919 "unmap": true, 00:17:52.919 "flush": true, 00:17:52.919 "reset": true, 00:17:52.919 "nvme_admin": false, 00:17:52.919 "nvme_io": false, 00:17:52.919 "nvme_io_md": false, 00:17:52.919 "write_zeroes": true, 00:17:52.919 "zcopy": true, 00:17:52.919 "get_zone_info": false, 00:17:52.919 "zone_management": false, 00:17:52.919 "zone_append": false, 00:17:52.919 "compare": false, 00:17:52.919 "compare_and_write": false, 00:17:52.919 "abort": true, 00:17:52.919 "seek_hole": false, 00:17:52.919 "seek_data": false, 00:17:52.919 "copy": true, 00:17:52.919 "nvme_iov_md": false 00:17:52.919 }, 00:17:52.919 "memory_domains": [ 00:17:52.919 { 00:17:52.919 
"dma_device_id": "system", 00:17:52.919 "dma_device_type": 1 00:17:52.919 }, 00:17:52.919 { 00:17:52.919 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:52.919 "dma_device_type": 2 00:17:52.919 } 00:17:52.919 ], 00:17:52.919 "driver_specific": {} 00:17:52.919 } 00:17:52.919 ] 00:17:52.919 17:30:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:52.919 17:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:52.919 17:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:52.919 17:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:52.919 17:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:52.919 17:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:52.919 17:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:52.919 17:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:52.919 17:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:52.919 17:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:52.919 17:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:52.919 17:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:52.919 17:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:53.178 17:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:53.178 "name": "Existed_Raid", 00:17:53.178 "uuid": "aee629fe-fae9-4f15-8052-94e702687f02", 00:17:53.178 "strip_size_kb": 64, 00:17:53.178 "state": "configuring", 00:17:53.178 "raid_level": "concat", 00:17:53.178 "superblock": true, 00:17:53.178 "num_base_bdevs": 4, 00:17:53.178 "num_base_bdevs_discovered": 1, 00:17:53.178 "num_base_bdevs_operational": 4, 00:17:53.178 "base_bdevs_list": [ 00:17:53.178 { 00:17:53.178 "name": "BaseBdev1", 00:17:53.178 "uuid": "6011b881-e17e-4f89-9245-aec10eb86d2b", 00:17:53.178 "is_configured": true, 00:17:53.178 "data_offset": 2048, 00:17:53.178 "data_size": 63488 00:17:53.178 }, 00:17:53.178 { 00:17:53.178 "name": "BaseBdev2", 00:17:53.178 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.178 "is_configured": false, 00:17:53.178 "data_offset": 0, 00:17:53.178 "data_size": 0 00:17:53.178 }, 00:17:53.178 { 00:17:53.178 "name": "BaseBdev3", 00:17:53.178 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.178 "is_configured": false, 00:17:53.178 "data_offset": 0, 00:17:53.178 "data_size": 0 00:17:53.178 }, 00:17:53.178 { 00:17:53.178 "name": "BaseBdev4", 00:17:53.178 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.178 "is_configured": false, 00:17:53.178 "data_offset": 0, 00:17:53.178 "data_size": 0 00:17:53.178 } 00:17:53.178 ] 00:17:53.178 }' 00:17:53.178 17:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:53.178 17:30:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 
00:17:53.748 17:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:54.007 [2024-07-15 17:30:05.122861] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:54.007 [2024-07-15 17:30:05.122886] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26c3f60 name Existed_Raid, state configuring 00:17:54.007 17:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:54.267 [2024-07-15 17:30:05.307363] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:54.267 [2024-07-15 17:30:05.308451] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:54.267 [2024-07-15 17:30:05.308473] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:54.267 [2024-07-15 17:30:05.308479] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:54.267 [2024-07-15 17:30:05.308485] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:54.267 [2024-07-15 17:30:05.308490] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:54.267 [2024-07-15 17:30:05.308495] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:54.267 17:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:54.267 17:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:54.267 17:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:54.267 17:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:54.267 17:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:54.267 17:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:54.267 17:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:54.267 17:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:54.267 17:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:54.267 17:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:54.267 17:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:54.267 17:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:54.267 17:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:54.267 17:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:54.267 17:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:54.267 "name": 
"Existed_Raid", 00:17:54.267 "uuid": "8a7fae25-5859-44e6-9abc-0deea6928e9e", 00:17:54.267 "strip_size_kb": 64, 00:17:54.267 "state": "configuring", 00:17:54.267 "raid_level": "concat", 00:17:54.267 "superblock": true, 00:17:54.267 "num_base_bdevs": 4, 00:17:54.267 "num_base_bdevs_discovered": 1, 00:17:54.267 "num_base_bdevs_operational": 4, 00:17:54.267 "base_bdevs_list": [ 00:17:54.267 { 00:17:54.267 "name": "BaseBdev1", 00:17:54.267 "uuid": "6011b881-e17e-4f89-9245-aec10eb86d2b", 00:17:54.267 "is_configured": true, 00:17:54.267 "data_offset": 2048, 00:17:54.267 "data_size": 63488 00:17:54.267 }, 00:17:54.267 { 00:17:54.267 "name": "BaseBdev2", 00:17:54.267 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.267 "is_configured": false, 00:17:54.267 "data_offset": 0, 00:17:54.267 "data_size": 0 00:17:54.267 }, 00:17:54.267 { 00:17:54.267 "name": "BaseBdev3", 00:17:54.267 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.267 "is_configured": false, 00:17:54.267 "data_offset": 0, 00:17:54.267 "data_size": 0 00:17:54.267 }, 00:17:54.267 { 00:17:54.267 "name": "BaseBdev4", 00:17:54.267 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.267 "is_configured": false, 00:17:54.267 "data_offset": 0, 00:17:54.267 "data_size": 0 00:17:54.267 } 00:17:54.267 ] 00:17:54.267 }' 00:17:54.267 17:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:54.267 17:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:54.837 17:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:55.098 [2024-07-15 17:30:06.242441] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:55.098 BaseBdev2 00:17:55.098 17:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:55.098 17:30:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:55.098 17:30:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:55.098 17:30:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:55.098 17:30:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:55.098 17:30:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:55.098 17:30:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:55.359 17:30:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:55.359 [ 00:17:55.359 { 00:17:55.359 "name": "BaseBdev2", 00:17:55.359 "aliases": [ 00:17:55.359 "7d21829f-be90-453a-883f-121ab343cc75" 00:17:55.359 ], 00:17:55.359 "product_name": "Malloc disk", 00:17:55.359 "block_size": 512, 00:17:55.359 "num_blocks": 65536, 00:17:55.359 "uuid": "7d21829f-be90-453a-883f-121ab343cc75", 00:17:55.359 "assigned_rate_limits": { 00:17:55.359 "rw_ios_per_sec": 0, 00:17:55.359 "rw_mbytes_per_sec": 0, 00:17:55.359 "r_mbytes_per_sec": 0, 00:17:55.359 "w_mbytes_per_sec": 0 00:17:55.359 }, 00:17:55.359 "claimed": true, 
00:17:55.359 "claim_type": "exclusive_write", 00:17:55.359 "zoned": false, 00:17:55.359 "supported_io_types": { 00:17:55.359 "read": true, 00:17:55.359 "write": true, 00:17:55.359 "unmap": true, 00:17:55.359 "flush": true, 00:17:55.359 "reset": true, 00:17:55.359 "nvme_admin": false, 00:17:55.359 "nvme_io": false, 00:17:55.359 "nvme_io_md": false, 00:17:55.359 "write_zeroes": true, 00:17:55.359 "zcopy": true, 00:17:55.359 "get_zone_info": false, 00:17:55.359 "zone_management": false, 00:17:55.359 "zone_append": false, 00:17:55.359 "compare": false, 00:17:55.359 "compare_and_write": false, 00:17:55.359 "abort": true, 00:17:55.359 "seek_hole": false, 00:17:55.359 "seek_data": false, 00:17:55.359 "copy": true, 00:17:55.359 "nvme_iov_md": false 00:17:55.359 }, 00:17:55.359 "memory_domains": [ 00:17:55.359 { 00:17:55.359 "dma_device_id": "system", 00:17:55.359 "dma_device_type": 1 00:17:55.359 }, 00:17:55.359 { 00:17:55.359 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:55.359 "dma_device_type": 2 00:17:55.359 } 00:17:55.359 ], 00:17:55.359 "driver_specific": {} 00:17:55.359 } 00:17:55.359 ] 00:17:55.359 17:30:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:55.359 17:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:55.359 17:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:55.359 17:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:55.359 17:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:55.359 17:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:55.359 17:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:55.359 17:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:55.359 17:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:55.359 17:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:55.359 17:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:55.359 17:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:55.359 17:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:55.359 17:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:55.359 17:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:55.619 17:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:55.619 "name": "Existed_Raid", 00:17:55.619 "uuid": "8a7fae25-5859-44e6-9abc-0deea6928e9e", 00:17:55.619 "strip_size_kb": 64, 00:17:55.619 "state": "configuring", 00:17:55.619 "raid_level": "concat", 00:17:55.619 "superblock": true, 00:17:55.619 "num_base_bdevs": 4, 00:17:55.619 "num_base_bdevs_discovered": 2, 00:17:55.619 "num_base_bdevs_operational": 4, 00:17:55.619 "base_bdevs_list": [ 00:17:55.619 { 00:17:55.619 "name": "BaseBdev1", 00:17:55.619 "uuid": 
"6011b881-e17e-4f89-9245-aec10eb86d2b", 00:17:55.619 "is_configured": true, 00:17:55.619 "data_offset": 2048, 00:17:55.619 "data_size": 63488 00:17:55.619 }, 00:17:55.619 { 00:17:55.619 "name": "BaseBdev2", 00:17:55.619 "uuid": "7d21829f-be90-453a-883f-121ab343cc75", 00:17:55.619 "is_configured": true, 00:17:55.619 "data_offset": 2048, 00:17:55.619 "data_size": 63488 00:17:55.619 }, 00:17:55.619 { 00:17:55.619 "name": "BaseBdev3", 00:17:55.619 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:55.619 "is_configured": false, 00:17:55.619 "data_offset": 0, 00:17:55.619 "data_size": 0 00:17:55.619 }, 00:17:55.619 { 00:17:55.619 "name": "BaseBdev4", 00:17:55.619 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:55.619 "is_configured": false, 00:17:55.619 "data_offset": 0, 00:17:55.619 "data_size": 0 00:17:55.619 } 00:17:55.619 ] 00:17:55.619 }' 00:17:55.619 17:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:55.619 17:30:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:56.189 17:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:56.449 [2024-07-15 17:30:07.566650] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:56.449 BaseBdev3 00:17:56.449 17:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:56.449 17:30:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:56.449 17:30:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:56.449 17:30:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:56.449 17:30:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:56.449 17:30:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:56.449 17:30:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:56.710 17:30:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:56.710 [ 00:17:56.710 { 00:17:56.710 "name": "BaseBdev3", 00:17:56.710 "aliases": [ 00:17:56.710 "26ba2e77-b13a-461c-b2c5-e4a2784f3063" 00:17:56.710 ], 00:17:56.710 "product_name": "Malloc disk", 00:17:56.710 "block_size": 512, 00:17:56.710 "num_blocks": 65536, 00:17:56.710 "uuid": "26ba2e77-b13a-461c-b2c5-e4a2784f3063", 00:17:56.710 "assigned_rate_limits": { 00:17:56.710 "rw_ios_per_sec": 0, 00:17:56.710 "rw_mbytes_per_sec": 0, 00:17:56.710 "r_mbytes_per_sec": 0, 00:17:56.710 "w_mbytes_per_sec": 0 00:17:56.710 }, 00:17:56.710 "claimed": true, 00:17:56.710 "claim_type": "exclusive_write", 00:17:56.710 "zoned": false, 00:17:56.710 "supported_io_types": { 00:17:56.710 "read": true, 00:17:56.710 "write": true, 00:17:56.710 "unmap": true, 00:17:56.710 "flush": true, 00:17:56.710 "reset": true, 00:17:56.710 "nvme_admin": false, 00:17:56.710 "nvme_io": false, 00:17:56.710 "nvme_io_md": false, 00:17:56.710 "write_zeroes": true, 00:17:56.710 "zcopy": true, 00:17:56.710 "get_zone_info": 
false, 00:17:56.710 "zone_management": false, 00:17:56.710 "zone_append": false, 00:17:56.710 "compare": false, 00:17:56.710 "compare_and_write": false, 00:17:56.710 "abort": true, 00:17:56.710 "seek_hole": false, 00:17:56.710 "seek_data": false, 00:17:56.710 "copy": true, 00:17:56.710 "nvme_iov_md": false 00:17:56.710 }, 00:17:56.710 "memory_domains": [ 00:17:56.710 { 00:17:56.710 "dma_device_id": "system", 00:17:56.710 "dma_device_type": 1 00:17:56.710 }, 00:17:56.710 { 00:17:56.710 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:56.710 "dma_device_type": 2 00:17:56.710 } 00:17:56.710 ], 00:17:56.710 "driver_specific": {} 00:17:56.710 } 00:17:56.710 ] 00:17:56.710 17:30:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:56.710 17:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:56.710 17:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:56.710 17:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:56.710 17:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:56.710 17:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:56.710 17:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:56.710 17:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:56.710 17:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:56.710 17:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:56.710 17:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:56.710 17:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:56.710 17:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:56.710 17:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.710 17:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:56.970 17:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:56.970 "name": "Existed_Raid", 00:17:56.970 "uuid": "8a7fae25-5859-44e6-9abc-0deea6928e9e", 00:17:56.970 "strip_size_kb": 64, 00:17:56.970 "state": "configuring", 00:17:56.970 "raid_level": "concat", 00:17:56.970 "superblock": true, 00:17:56.970 "num_base_bdevs": 4, 00:17:56.970 "num_base_bdevs_discovered": 3, 00:17:56.970 "num_base_bdevs_operational": 4, 00:17:56.970 "base_bdevs_list": [ 00:17:56.970 { 00:17:56.970 "name": "BaseBdev1", 00:17:56.970 "uuid": "6011b881-e17e-4f89-9245-aec10eb86d2b", 00:17:56.970 "is_configured": true, 00:17:56.970 "data_offset": 2048, 00:17:56.970 "data_size": 63488 00:17:56.970 }, 00:17:56.970 { 00:17:56.970 "name": "BaseBdev2", 00:17:56.970 "uuid": "7d21829f-be90-453a-883f-121ab343cc75", 00:17:56.971 "is_configured": true, 00:17:56.971 "data_offset": 2048, 00:17:56.971 "data_size": 63488 00:17:56.971 }, 00:17:56.971 { 00:17:56.971 "name": "BaseBdev3", 00:17:56.971 "uuid": 
"26ba2e77-b13a-461c-b2c5-e4a2784f3063", 00:17:56.971 "is_configured": true, 00:17:56.971 "data_offset": 2048, 00:17:56.971 "data_size": 63488 00:17:56.971 }, 00:17:56.971 { 00:17:56.971 "name": "BaseBdev4", 00:17:56.971 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:56.971 "is_configured": false, 00:17:56.971 "data_offset": 0, 00:17:56.971 "data_size": 0 00:17:56.971 } 00:17:56.971 ] 00:17:56.971 }' 00:17:56.971 17:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:56.971 17:30:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:57.541 17:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:57.801 [2024-07-15 17:30:08.886729] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:57.801 [2024-07-15 17:30:08.886855] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26c4fc0 00:17:57.801 [2024-07-15 17:30:08.886863] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:57.801 [2024-07-15 17:30:08.887000] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26c4c00 00:17:57.801 [2024-07-15 17:30:08.887093] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26c4fc0 00:17:57.801 [2024-07-15 17:30:08.887099] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x26c4fc0 00:17:57.801 [2024-07-15 17:30:08.887167] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:57.801 BaseBdev4 00:17:57.801 17:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:17:57.801 17:30:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:57.801 17:30:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:57.801 17:30:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:57.801 17:30:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:57.801 17:30:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:57.801 17:30:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:57.801 17:30:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:58.061 [ 00:17:58.061 { 00:17:58.061 "name": "BaseBdev4", 00:17:58.061 "aliases": [ 00:17:58.061 "48b009b9-0751-4109-9cdc-9f00a7e6909c" 00:17:58.061 ], 00:17:58.061 "product_name": "Malloc disk", 00:17:58.061 "block_size": 512, 00:17:58.061 "num_blocks": 65536, 00:17:58.061 "uuid": "48b009b9-0751-4109-9cdc-9f00a7e6909c", 00:17:58.061 "assigned_rate_limits": { 00:17:58.061 "rw_ios_per_sec": 0, 00:17:58.061 "rw_mbytes_per_sec": 0, 00:17:58.061 "r_mbytes_per_sec": 0, 00:17:58.061 "w_mbytes_per_sec": 0 00:17:58.061 }, 00:17:58.061 "claimed": true, 00:17:58.061 "claim_type": "exclusive_write", 00:17:58.061 "zoned": false, 00:17:58.061 "supported_io_types": { 00:17:58.061 "read": 
true, 00:17:58.061 "write": true, 00:17:58.061 "unmap": true, 00:17:58.061 "flush": true, 00:17:58.061 "reset": true, 00:17:58.061 "nvme_admin": false, 00:17:58.061 "nvme_io": false, 00:17:58.061 "nvme_io_md": false, 00:17:58.061 "write_zeroes": true, 00:17:58.061 "zcopy": true, 00:17:58.061 "get_zone_info": false, 00:17:58.061 "zone_management": false, 00:17:58.061 "zone_append": false, 00:17:58.061 "compare": false, 00:17:58.061 "compare_and_write": false, 00:17:58.061 "abort": true, 00:17:58.061 "seek_hole": false, 00:17:58.061 "seek_data": false, 00:17:58.061 "copy": true, 00:17:58.061 "nvme_iov_md": false 00:17:58.061 }, 00:17:58.061 "memory_domains": [ 00:17:58.061 { 00:17:58.061 "dma_device_id": "system", 00:17:58.061 "dma_device_type": 1 00:17:58.061 }, 00:17:58.061 { 00:17:58.061 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:58.061 "dma_device_type": 2 00:17:58.061 } 00:17:58.061 ], 00:17:58.061 "driver_specific": {} 00:17:58.061 } 00:17:58.061 ] 00:17:58.061 17:30:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:58.061 17:30:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:58.061 17:30:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:58.061 17:30:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:17:58.061 17:30:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:58.061 17:30:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:58.061 17:30:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:58.061 17:30:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:58.061 17:30:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:58.061 17:30:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:58.061 17:30:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:58.061 17:30:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:58.061 17:30:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:58.061 17:30:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:58.061 17:30:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:58.322 17:30:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:58.322 "name": "Existed_Raid", 00:17:58.322 "uuid": "8a7fae25-5859-44e6-9abc-0deea6928e9e", 00:17:58.322 "strip_size_kb": 64, 00:17:58.322 "state": "online", 00:17:58.322 "raid_level": "concat", 00:17:58.322 "superblock": true, 00:17:58.322 "num_base_bdevs": 4, 00:17:58.322 "num_base_bdevs_discovered": 4, 00:17:58.322 "num_base_bdevs_operational": 4, 00:17:58.322 "base_bdevs_list": [ 00:17:58.322 { 00:17:58.322 "name": "BaseBdev1", 00:17:58.322 "uuid": "6011b881-e17e-4f89-9245-aec10eb86d2b", 00:17:58.322 "is_configured": true, 00:17:58.322 "data_offset": 2048, 00:17:58.322 "data_size": 63488 00:17:58.322 }, 
00:17:58.322 { 00:17:58.322 "name": "BaseBdev2", 00:17:58.322 "uuid": "7d21829f-be90-453a-883f-121ab343cc75", 00:17:58.322 "is_configured": true, 00:17:58.322 "data_offset": 2048, 00:17:58.322 "data_size": 63488 00:17:58.322 }, 00:17:58.322 { 00:17:58.322 "name": "BaseBdev3", 00:17:58.322 "uuid": "26ba2e77-b13a-461c-b2c5-e4a2784f3063", 00:17:58.322 "is_configured": true, 00:17:58.322 "data_offset": 2048, 00:17:58.322 "data_size": 63488 00:17:58.322 }, 00:17:58.322 { 00:17:58.322 "name": "BaseBdev4", 00:17:58.322 "uuid": "48b009b9-0751-4109-9cdc-9f00a7e6909c", 00:17:58.322 "is_configured": true, 00:17:58.322 "data_offset": 2048, 00:17:58.322 "data_size": 63488 00:17:58.322 } 00:17:58.322 ] 00:17:58.322 }' 00:17:58.322 17:30:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:58.323 17:30:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:58.893 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:58.893 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:58.893 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:58.893 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:58.893 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:58.893 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:58.893 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:58.893 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:59.153 [2024-07-15 17:30:10.222363] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:59.153 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:59.153 "name": "Existed_Raid", 00:17:59.153 "aliases": [ 00:17:59.153 "8a7fae25-5859-44e6-9abc-0deea6928e9e" 00:17:59.153 ], 00:17:59.153 "product_name": "Raid Volume", 00:17:59.153 "block_size": 512, 00:17:59.153 "num_blocks": 253952, 00:17:59.153 "uuid": "8a7fae25-5859-44e6-9abc-0deea6928e9e", 00:17:59.153 "assigned_rate_limits": { 00:17:59.153 "rw_ios_per_sec": 0, 00:17:59.153 "rw_mbytes_per_sec": 0, 00:17:59.153 "r_mbytes_per_sec": 0, 00:17:59.153 "w_mbytes_per_sec": 0 00:17:59.153 }, 00:17:59.153 "claimed": false, 00:17:59.153 "zoned": false, 00:17:59.153 "supported_io_types": { 00:17:59.153 "read": true, 00:17:59.153 "write": true, 00:17:59.153 "unmap": true, 00:17:59.153 "flush": true, 00:17:59.153 "reset": true, 00:17:59.153 "nvme_admin": false, 00:17:59.153 "nvme_io": false, 00:17:59.153 "nvme_io_md": false, 00:17:59.153 "write_zeroes": true, 00:17:59.153 "zcopy": false, 00:17:59.153 "get_zone_info": false, 00:17:59.153 "zone_management": false, 00:17:59.153 "zone_append": false, 00:17:59.153 "compare": false, 00:17:59.153 "compare_and_write": false, 00:17:59.153 "abort": false, 00:17:59.153 "seek_hole": false, 00:17:59.153 "seek_data": false, 00:17:59.153 "copy": false, 00:17:59.153 "nvme_iov_md": false 00:17:59.153 }, 00:17:59.153 "memory_domains": [ 00:17:59.153 { 00:17:59.153 "dma_device_id": "system", 00:17:59.153 "dma_device_type": 1 00:17:59.153 }, 
00:17:59.153 { 00:17:59.153 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:59.153 "dma_device_type": 2 00:17:59.153 }, 00:17:59.153 { 00:17:59.153 "dma_device_id": "system", 00:17:59.153 "dma_device_type": 1 00:17:59.153 }, 00:17:59.153 { 00:17:59.153 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:59.153 "dma_device_type": 2 00:17:59.153 }, 00:17:59.153 { 00:17:59.153 "dma_device_id": "system", 00:17:59.153 "dma_device_type": 1 00:17:59.153 }, 00:17:59.153 { 00:17:59.153 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:59.153 "dma_device_type": 2 00:17:59.153 }, 00:17:59.153 { 00:17:59.153 "dma_device_id": "system", 00:17:59.153 "dma_device_type": 1 00:17:59.153 }, 00:17:59.153 { 00:17:59.153 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:59.153 "dma_device_type": 2 00:17:59.153 } 00:17:59.153 ], 00:17:59.153 "driver_specific": { 00:17:59.153 "raid": { 00:17:59.153 "uuid": "8a7fae25-5859-44e6-9abc-0deea6928e9e", 00:17:59.153 "strip_size_kb": 64, 00:17:59.153 "state": "online", 00:17:59.153 "raid_level": "concat", 00:17:59.153 "superblock": true, 00:17:59.153 "num_base_bdevs": 4, 00:17:59.153 "num_base_bdevs_discovered": 4, 00:17:59.153 "num_base_bdevs_operational": 4, 00:17:59.153 "base_bdevs_list": [ 00:17:59.153 { 00:17:59.153 "name": "BaseBdev1", 00:17:59.153 "uuid": "6011b881-e17e-4f89-9245-aec10eb86d2b", 00:17:59.153 "is_configured": true, 00:17:59.153 "data_offset": 2048, 00:17:59.153 "data_size": 63488 00:17:59.153 }, 00:17:59.153 { 00:17:59.153 "name": "BaseBdev2", 00:17:59.153 "uuid": "7d21829f-be90-453a-883f-121ab343cc75", 00:17:59.153 "is_configured": true, 00:17:59.153 "data_offset": 2048, 00:17:59.153 "data_size": 63488 00:17:59.153 }, 00:17:59.153 { 00:17:59.153 "name": "BaseBdev3", 00:17:59.153 "uuid": "26ba2e77-b13a-461c-b2c5-e4a2784f3063", 00:17:59.153 "is_configured": true, 00:17:59.153 "data_offset": 2048, 00:17:59.153 "data_size": 63488 00:17:59.153 }, 00:17:59.153 { 00:17:59.153 "name": "BaseBdev4", 00:17:59.153 "uuid": "48b009b9-0751-4109-9cdc-9f00a7e6909c", 00:17:59.153 "is_configured": true, 00:17:59.153 "data_offset": 2048, 00:17:59.153 "data_size": 63488 00:17:59.153 } 00:17:59.153 ] 00:17:59.153 } 00:17:59.153 } 00:17:59.153 }' 00:17:59.153 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:59.153 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:59.153 BaseBdev2 00:17:59.153 BaseBdev3 00:17:59.153 BaseBdev4' 00:17:59.153 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:59.153 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:59.153 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:59.413 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:59.413 "name": "BaseBdev1", 00:17:59.413 "aliases": [ 00:17:59.413 "6011b881-e17e-4f89-9245-aec10eb86d2b" 00:17:59.413 ], 00:17:59.413 "product_name": "Malloc disk", 00:17:59.413 "block_size": 512, 00:17:59.413 "num_blocks": 65536, 00:17:59.413 "uuid": "6011b881-e17e-4f89-9245-aec10eb86d2b", 00:17:59.413 "assigned_rate_limits": { 00:17:59.413 "rw_ios_per_sec": 0, 00:17:59.413 "rw_mbytes_per_sec": 0, 00:17:59.413 "r_mbytes_per_sec": 0, 00:17:59.413 
"w_mbytes_per_sec": 0 00:17:59.413 }, 00:17:59.413 "claimed": true, 00:17:59.413 "claim_type": "exclusive_write", 00:17:59.413 "zoned": false, 00:17:59.413 "supported_io_types": { 00:17:59.413 "read": true, 00:17:59.413 "write": true, 00:17:59.413 "unmap": true, 00:17:59.413 "flush": true, 00:17:59.413 "reset": true, 00:17:59.413 "nvme_admin": false, 00:17:59.413 "nvme_io": false, 00:17:59.413 "nvme_io_md": false, 00:17:59.413 "write_zeroes": true, 00:17:59.413 "zcopy": true, 00:17:59.413 "get_zone_info": false, 00:17:59.413 "zone_management": false, 00:17:59.413 "zone_append": false, 00:17:59.413 "compare": false, 00:17:59.413 "compare_and_write": false, 00:17:59.413 "abort": true, 00:17:59.413 "seek_hole": false, 00:17:59.413 "seek_data": false, 00:17:59.413 "copy": true, 00:17:59.413 "nvme_iov_md": false 00:17:59.413 }, 00:17:59.413 "memory_domains": [ 00:17:59.413 { 00:17:59.413 "dma_device_id": "system", 00:17:59.413 "dma_device_type": 1 00:17:59.413 }, 00:17:59.413 { 00:17:59.413 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:59.413 "dma_device_type": 2 00:17:59.413 } 00:17:59.413 ], 00:17:59.413 "driver_specific": {} 00:17:59.413 }' 00:17:59.413 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:59.413 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:59.413 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:59.413 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:59.413 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:59.413 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:59.413 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:59.673 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:59.673 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:59.673 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:59.673 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:59.673 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:59.673 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:59.673 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:59.673 17:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:59.934 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:59.934 "name": "BaseBdev2", 00:17:59.934 "aliases": [ 00:17:59.934 "7d21829f-be90-453a-883f-121ab343cc75" 00:17:59.934 ], 00:17:59.934 "product_name": "Malloc disk", 00:17:59.934 "block_size": 512, 00:17:59.934 "num_blocks": 65536, 00:17:59.934 "uuid": "7d21829f-be90-453a-883f-121ab343cc75", 00:17:59.934 "assigned_rate_limits": { 00:17:59.934 "rw_ios_per_sec": 0, 00:17:59.934 "rw_mbytes_per_sec": 0, 00:17:59.934 "r_mbytes_per_sec": 0, 00:17:59.934 "w_mbytes_per_sec": 0 00:17:59.934 }, 00:17:59.934 "claimed": true, 00:17:59.934 "claim_type": "exclusive_write", 00:17:59.934 
"zoned": false, 00:17:59.934 "supported_io_types": { 00:17:59.934 "read": true, 00:17:59.934 "write": true, 00:17:59.934 "unmap": true, 00:17:59.934 "flush": true, 00:17:59.934 "reset": true, 00:17:59.934 "nvme_admin": false, 00:17:59.934 "nvme_io": false, 00:17:59.934 "nvme_io_md": false, 00:17:59.934 "write_zeroes": true, 00:17:59.934 "zcopy": true, 00:17:59.934 "get_zone_info": false, 00:17:59.934 "zone_management": false, 00:17:59.934 "zone_append": false, 00:17:59.934 "compare": false, 00:17:59.934 "compare_and_write": false, 00:17:59.934 "abort": true, 00:17:59.934 "seek_hole": false, 00:17:59.934 "seek_data": false, 00:17:59.934 "copy": true, 00:17:59.934 "nvme_iov_md": false 00:17:59.934 }, 00:17:59.934 "memory_domains": [ 00:17:59.934 { 00:17:59.934 "dma_device_id": "system", 00:17:59.934 "dma_device_type": 1 00:17:59.934 }, 00:17:59.934 { 00:17:59.934 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:59.934 "dma_device_type": 2 00:17:59.934 } 00:17:59.934 ], 00:17:59.934 "driver_specific": {} 00:17:59.934 }' 00:17:59.935 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:59.935 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:59.935 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:59.935 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:59.935 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:00.195 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:00.195 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:00.195 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:00.195 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:00.195 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:00.195 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:00.195 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:00.195 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:00.195 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:00.195 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:00.455 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:00.455 "name": "BaseBdev3", 00:18:00.455 "aliases": [ 00:18:00.455 "26ba2e77-b13a-461c-b2c5-e4a2784f3063" 00:18:00.455 ], 00:18:00.455 "product_name": "Malloc disk", 00:18:00.455 "block_size": 512, 00:18:00.455 "num_blocks": 65536, 00:18:00.455 "uuid": "26ba2e77-b13a-461c-b2c5-e4a2784f3063", 00:18:00.455 "assigned_rate_limits": { 00:18:00.455 "rw_ios_per_sec": 0, 00:18:00.455 "rw_mbytes_per_sec": 0, 00:18:00.455 "r_mbytes_per_sec": 0, 00:18:00.455 "w_mbytes_per_sec": 0 00:18:00.455 }, 00:18:00.455 "claimed": true, 00:18:00.455 "claim_type": "exclusive_write", 00:18:00.455 "zoned": false, 00:18:00.455 "supported_io_types": { 00:18:00.455 "read": true, 00:18:00.455 "write": true, 00:18:00.455 "unmap": 
true, 00:18:00.455 "flush": true, 00:18:00.455 "reset": true, 00:18:00.455 "nvme_admin": false, 00:18:00.455 "nvme_io": false, 00:18:00.455 "nvme_io_md": false, 00:18:00.455 "write_zeroes": true, 00:18:00.455 "zcopy": true, 00:18:00.455 "get_zone_info": false, 00:18:00.455 "zone_management": false, 00:18:00.455 "zone_append": false, 00:18:00.455 "compare": false, 00:18:00.455 "compare_and_write": false, 00:18:00.455 "abort": true, 00:18:00.455 "seek_hole": false, 00:18:00.455 "seek_data": false, 00:18:00.455 "copy": true, 00:18:00.455 "nvme_iov_md": false 00:18:00.455 }, 00:18:00.455 "memory_domains": [ 00:18:00.455 { 00:18:00.455 "dma_device_id": "system", 00:18:00.455 "dma_device_type": 1 00:18:00.455 }, 00:18:00.455 { 00:18:00.455 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:00.455 "dma_device_type": 2 00:18:00.455 } 00:18:00.455 ], 00:18:00.455 "driver_specific": {} 00:18:00.455 }' 00:18:00.455 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:00.455 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:00.455 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:00.456 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:00.456 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:00.715 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:00.715 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:00.715 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:00.715 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:00.715 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:00.715 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:00.715 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:00.715 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:00.715 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:00.715 17:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:00.976 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:00.976 "name": "BaseBdev4", 00:18:00.976 "aliases": [ 00:18:00.976 "48b009b9-0751-4109-9cdc-9f00a7e6909c" 00:18:00.976 ], 00:18:00.976 "product_name": "Malloc disk", 00:18:00.976 "block_size": 512, 00:18:00.976 "num_blocks": 65536, 00:18:00.976 "uuid": "48b009b9-0751-4109-9cdc-9f00a7e6909c", 00:18:00.976 "assigned_rate_limits": { 00:18:00.976 "rw_ios_per_sec": 0, 00:18:00.976 "rw_mbytes_per_sec": 0, 00:18:00.976 "r_mbytes_per_sec": 0, 00:18:00.976 "w_mbytes_per_sec": 0 00:18:00.976 }, 00:18:00.976 "claimed": true, 00:18:00.976 "claim_type": "exclusive_write", 00:18:00.976 "zoned": false, 00:18:00.976 "supported_io_types": { 00:18:00.976 "read": true, 00:18:00.976 "write": true, 00:18:00.976 "unmap": true, 00:18:00.976 "flush": true, 00:18:00.976 "reset": true, 00:18:00.976 "nvme_admin": false, 00:18:00.976 "nvme_io": false, 
00:18:00.976 "nvme_io_md": false, 00:18:00.976 "write_zeroes": true, 00:18:00.976 "zcopy": true, 00:18:00.976 "get_zone_info": false, 00:18:00.976 "zone_management": false, 00:18:00.976 "zone_append": false, 00:18:00.976 "compare": false, 00:18:00.976 "compare_and_write": false, 00:18:00.976 "abort": true, 00:18:00.976 "seek_hole": false, 00:18:00.976 "seek_data": false, 00:18:00.976 "copy": true, 00:18:00.976 "nvme_iov_md": false 00:18:00.976 }, 00:18:00.976 "memory_domains": [ 00:18:00.976 { 00:18:00.976 "dma_device_id": "system", 00:18:00.976 "dma_device_type": 1 00:18:00.976 }, 00:18:00.976 { 00:18:00.976 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:00.976 "dma_device_type": 2 00:18:00.976 } 00:18:00.976 ], 00:18:00.976 "driver_specific": {} 00:18:00.976 }' 00:18:00.976 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:00.976 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:00.976 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:00.976 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:01.235 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:01.235 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:01.235 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:01.235 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:01.235 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:01.235 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:01.235 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:01.235 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:01.235 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:01.509 [2024-07-15 17:30:12.668310] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:01.509 [2024-07-15 17:30:12.668328] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:01.509 [2024-07-15 17:30:12.668363] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:01.509 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:01.509 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:18:01.509 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:01.509 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:18:01.509 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:18:01.509 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:18:01.509 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:01.509 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:18:01.509 17:30:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:01.509 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:01.509 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:01.509 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:01.509 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:01.509 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:01.509 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:01.509 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:01.509 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:01.797 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:01.797 "name": "Existed_Raid", 00:18:01.797 "uuid": "8a7fae25-5859-44e6-9abc-0deea6928e9e", 00:18:01.797 "strip_size_kb": 64, 00:18:01.797 "state": "offline", 00:18:01.797 "raid_level": "concat", 00:18:01.797 "superblock": true, 00:18:01.797 "num_base_bdevs": 4, 00:18:01.797 "num_base_bdevs_discovered": 3, 00:18:01.797 "num_base_bdevs_operational": 3, 00:18:01.797 "base_bdevs_list": [ 00:18:01.797 { 00:18:01.797 "name": null, 00:18:01.797 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:01.797 "is_configured": false, 00:18:01.797 "data_offset": 2048, 00:18:01.797 "data_size": 63488 00:18:01.797 }, 00:18:01.797 { 00:18:01.797 "name": "BaseBdev2", 00:18:01.797 "uuid": "7d21829f-be90-453a-883f-121ab343cc75", 00:18:01.797 "is_configured": true, 00:18:01.797 "data_offset": 2048, 00:18:01.797 "data_size": 63488 00:18:01.797 }, 00:18:01.797 { 00:18:01.797 "name": "BaseBdev3", 00:18:01.797 "uuid": "26ba2e77-b13a-461c-b2c5-e4a2784f3063", 00:18:01.797 "is_configured": true, 00:18:01.797 "data_offset": 2048, 00:18:01.797 "data_size": 63488 00:18:01.797 }, 00:18:01.797 { 00:18:01.797 "name": "BaseBdev4", 00:18:01.797 "uuid": "48b009b9-0751-4109-9cdc-9f00a7e6909c", 00:18:01.797 "is_configured": true, 00:18:01.797 "data_offset": 2048, 00:18:01.797 "data_size": 63488 00:18:01.797 } 00:18:01.797 ] 00:18:01.797 }' 00:18:01.797 17:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:01.797 17:30:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:02.365 17:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:02.365 17:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:02.365 17:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:02.365 17:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:02.365 17:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:02.365 17:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:02.365 17:30:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:02.625 [2024-07-15 17:30:13.771115] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:02.625 17:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:02.625 17:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:02.625 17:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:02.625 17:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:02.886 17:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:02.886 17:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:02.887 17:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:02.887 [2024-07-15 17:30:14.153843] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:02.887 17:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:02.887 17:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:02.887 17:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:02.887 17:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:03.148 17:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:03.148 17:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:03.148 17:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:03.409 [2024-07-15 17:30:14.524478] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:03.409 [2024-07-15 17:30:14.524506] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26c4fc0 name Existed_Raid, state offline 00:18:03.409 17:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:03.409 17:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:03.409 17:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:03.409 17:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:03.670 17:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:03.670 17:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:03.670 17:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:18:03.670 17:30:14 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:03.670 17:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:03.670 17:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:03.670 BaseBdev2 00:18:03.670 17:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:03.670 17:30:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:03.670 17:30:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:03.670 17:30:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:03.670 17:30:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:03.670 17:30:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:03.670 17:30:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:03.931 17:30:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:04.191 [ 00:18:04.191 { 00:18:04.191 "name": "BaseBdev2", 00:18:04.191 "aliases": [ 00:18:04.191 "a73d1240-a1fc-4df0-a881-74482809a3e4" 00:18:04.191 ], 00:18:04.191 "product_name": "Malloc disk", 00:18:04.191 "block_size": 512, 00:18:04.191 "num_blocks": 65536, 00:18:04.191 "uuid": "a73d1240-a1fc-4df0-a881-74482809a3e4", 00:18:04.191 "assigned_rate_limits": { 00:18:04.191 "rw_ios_per_sec": 0, 00:18:04.191 "rw_mbytes_per_sec": 0, 00:18:04.191 "r_mbytes_per_sec": 0, 00:18:04.191 "w_mbytes_per_sec": 0 00:18:04.191 }, 00:18:04.191 "claimed": false, 00:18:04.191 "zoned": false, 00:18:04.191 "supported_io_types": { 00:18:04.191 "read": true, 00:18:04.191 "write": true, 00:18:04.191 "unmap": true, 00:18:04.191 "flush": true, 00:18:04.191 "reset": true, 00:18:04.191 "nvme_admin": false, 00:18:04.191 "nvme_io": false, 00:18:04.191 "nvme_io_md": false, 00:18:04.191 "write_zeroes": true, 00:18:04.191 "zcopy": true, 00:18:04.191 "get_zone_info": false, 00:18:04.191 "zone_management": false, 00:18:04.191 "zone_append": false, 00:18:04.191 "compare": false, 00:18:04.191 "compare_and_write": false, 00:18:04.191 "abort": true, 00:18:04.191 "seek_hole": false, 00:18:04.191 "seek_data": false, 00:18:04.191 "copy": true, 00:18:04.191 "nvme_iov_md": false 00:18:04.191 }, 00:18:04.191 "memory_domains": [ 00:18:04.191 { 00:18:04.191 "dma_device_id": "system", 00:18:04.191 "dma_device_type": 1 00:18:04.191 }, 00:18:04.191 { 00:18:04.191 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:04.191 "dma_device_type": 2 00:18:04.191 } 00:18:04.191 ], 00:18:04.191 "driver_specific": {} 00:18:04.191 } 00:18:04.191 ] 00:18:04.191 17:30:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:04.191 17:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:04.191 17:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:04.191 17:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:04.191 BaseBdev3 00:18:04.451 17:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:04.451 17:30:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:04.451 17:30:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:04.451 17:30:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:04.451 17:30:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:04.451 17:30:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:04.451 17:30:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:04.451 17:30:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:04.712 [ 00:18:04.712 { 00:18:04.712 "name": "BaseBdev3", 00:18:04.712 "aliases": [ 00:18:04.712 "9b1764cf-2466-404f-b97f-6c319fc51994" 00:18:04.712 ], 00:18:04.712 "product_name": "Malloc disk", 00:18:04.712 "block_size": 512, 00:18:04.712 "num_blocks": 65536, 00:18:04.712 "uuid": "9b1764cf-2466-404f-b97f-6c319fc51994", 00:18:04.712 "assigned_rate_limits": { 00:18:04.712 "rw_ios_per_sec": 0, 00:18:04.712 "rw_mbytes_per_sec": 0, 00:18:04.712 "r_mbytes_per_sec": 0, 00:18:04.712 "w_mbytes_per_sec": 0 00:18:04.712 }, 00:18:04.712 "claimed": false, 00:18:04.712 "zoned": false, 00:18:04.712 "supported_io_types": { 00:18:04.712 "read": true, 00:18:04.712 "write": true, 00:18:04.712 "unmap": true, 00:18:04.712 "flush": true, 00:18:04.712 "reset": true, 00:18:04.712 "nvme_admin": false, 00:18:04.712 "nvme_io": false, 00:18:04.712 "nvme_io_md": false, 00:18:04.712 "write_zeroes": true, 00:18:04.712 "zcopy": true, 00:18:04.712 "get_zone_info": false, 00:18:04.712 "zone_management": false, 00:18:04.712 "zone_append": false, 00:18:04.712 "compare": false, 00:18:04.712 "compare_and_write": false, 00:18:04.712 "abort": true, 00:18:04.712 "seek_hole": false, 00:18:04.712 "seek_data": false, 00:18:04.712 "copy": true, 00:18:04.712 "nvme_iov_md": false 00:18:04.712 }, 00:18:04.712 "memory_domains": [ 00:18:04.712 { 00:18:04.712 "dma_device_id": "system", 00:18:04.712 "dma_device_type": 1 00:18:04.712 }, 00:18:04.712 { 00:18:04.712 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:04.712 "dma_device_type": 2 00:18:04.712 } 00:18:04.712 ], 00:18:04.712 "driver_specific": {} 00:18:04.712 } 00:18:04.712 ] 00:18:04.712 17:30:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:04.712 17:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:04.712 17:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:04.712 17:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:04.972 BaseBdev4 00:18:04.972 17:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev 
BaseBdev4 00:18:04.972 17:30:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:04.972 17:30:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:04.972 17:30:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:04.972 17:30:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:04.972 17:30:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:04.972 17:30:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:04.972 17:30:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:05.232 [ 00:18:05.232 { 00:18:05.232 "name": "BaseBdev4", 00:18:05.232 "aliases": [ 00:18:05.232 "19096961-7587-4893-bfa1-af7f7e7738c1" 00:18:05.232 ], 00:18:05.232 "product_name": "Malloc disk", 00:18:05.232 "block_size": 512, 00:18:05.232 "num_blocks": 65536, 00:18:05.232 "uuid": "19096961-7587-4893-bfa1-af7f7e7738c1", 00:18:05.232 "assigned_rate_limits": { 00:18:05.232 "rw_ios_per_sec": 0, 00:18:05.232 "rw_mbytes_per_sec": 0, 00:18:05.232 "r_mbytes_per_sec": 0, 00:18:05.232 "w_mbytes_per_sec": 0 00:18:05.232 }, 00:18:05.232 "claimed": false, 00:18:05.232 "zoned": false, 00:18:05.232 "supported_io_types": { 00:18:05.232 "read": true, 00:18:05.232 "write": true, 00:18:05.232 "unmap": true, 00:18:05.232 "flush": true, 00:18:05.232 "reset": true, 00:18:05.232 "nvme_admin": false, 00:18:05.232 "nvme_io": false, 00:18:05.232 "nvme_io_md": false, 00:18:05.232 "write_zeroes": true, 00:18:05.232 "zcopy": true, 00:18:05.232 "get_zone_info": false, 00:18:05.232 "zone_management": false, 00:18:05.232 "zone_append": false, 00:18:05.232 "compare": false, 00:18:05.232 "compare_and_write": false, 00:18:05.232 "abort": true, 00:18:05.232 "seek_hole": false, 00:18:05.232 "seek_data": false, 00:18:05.232 "copy": true, 00:18:05.232 "nvme_iov_md": false 00:18:05.232 }, 00:18:05.232 "memory_domains": [ 00:18:05.232 { 00:18:05.232 "dma_device_id": "system", 00:18:05.232 "dma_device_type": 1 00:18:05.232 }, 00:18:05.232 { 00:18:05.232 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:05.232 "dma_device_type": 2 00:18:05.232 } 00:18:05.232 ], 00:18:05.232 "driver_specific": {} 00:18:05.232 } 00:18:05.232 ] 00:18:05.232 17:30:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:05.232 17:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:05.232 17:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:05.232 17:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:05.491 [2024-07-15 17:30:16.607621] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:05.491 [2024-07-15 17:30:16.607649] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:05.491 [2024-07-15 17:30:16.607661] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:05.491 [2024-07-15 17:30:16.608689] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:05.491 [2024-07-15 17:30:16.608725] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:05.491 17:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:05.491 17:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:05.491 17:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:05.491 17:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:05.491 17:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:05.491 17:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:05.491 17:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:05.491 17:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:05.491 17:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:05.491 17:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:05.491 17:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:05.491 17:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:05.751 17:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:05.751 "name": "Existed_Raid", 00:18:05.751 "uuid": "10f6b2d1-3753-41b1-b0a0-e19636cd0294", 00:18:05.751 "strip_size_kb": 64, 00:18:05.751 "state": "configuring", 00:18:05.751 "raid_level": "concat", 00:18:05.751 "superblock": true, 00:18:05.751 "num_base_bdevs": 4, 00:18:05.751 "num_base_bdevs_discovered": 3, 00:18:05.751 "num_base_bdevs_operational": 4, 00:18:05.751 "base_bdevs_list": [ 00:18:05.751 { 00:18:05.751 "name": "BaseBdev1", 00:18:05.751 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:05.751 "is_configured": false, 00:18:05.751 "data_offset": 0, 00:18:05.751 "data_size": 0 00:18:05.751 }, 00:18:05.751 { 00:18:05.751 "name": "BaseBdev2", 00:18:05.751 "uuid": "a73d1240-a1fc-4df0-a881-74482809a3e4", 00:18:05.751 "is_configured": true, 00:18:05.751 "data_offset": 2048, 00:18:05.751 "data_size": 63488 00:18:05.751 }, 00:18:05.751 { 00:18:05.751 "name": "BaseBdev3", 00:18:05.751 "uuid": "9b1764cf-2466-404f-b97f-6c319fc51994", 00:18:05.751 "is_configured": true, 00:18:05.751 "data_offset": 2048, 00:18:05.751 "data_size": 63488 00:18:05.751 }, 00:18:05.751 { 00:18:05.751 "name": "BaseBdev4", 00:18:05.751 "uuid": "19096961-7587-4893-bfa1-af7f7e7738c1", 00:18:05.751 "is_configured": true, 00:18:05.751 "data_offset": 2048, 00:18:05.751 "data_size": 63488 00:18:05.751 } 00:18:05.751 ] 00:18:05.751 }' 00:18:05.751 17:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:05.751 17:30:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:06.319 17:30:17 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:06.319 [2024-07-15 17:30:17.545961] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:06.319 17:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:06.319 17:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:06.319 17:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:06.319 17:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:06.319 17:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:06.319 17:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:06.319 17:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:06.319 17:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:06.319 17:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:06.319 17:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:06.319 17:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:06.319 17:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:06.580 17:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:06.580 "name": "Existed_Raid", 00:18:06.580 "uuid": "10f6b2d1-3753-41b1-b0a0-e19636cd0294", 00:18:06.580 "strip_size_kb": 64, 00:18:06.580 "state": "configuring", 00:18:06.580 "raid_level": "concat", 00:18:06.580 "superblock": true, 00:18:06.580 "num_base_bdevs": 4, 00:18:06.580 "num_base_bdevs_discovered": 2, 00:18:06.580 "num_base_bdevs_operational": 4, 00:18:06.580 "base_bdevs_list": [ 00:18:06.580 { 00:18:06.580 "name": "BaseBdev1", 00:18:06.580 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:06.580 "is_configured": false, 00:18:06.580 "data_offset": 0, 00:18:06.580 "data_size": 0 00:18:06.580 }, 00:18:06.580 { 00:18:06.580 "name": null, 00:18:06.580 "uuid": "a73d1240-a1fc-4df0-a881-74482809a3e4", 00:18:06.580 "is_configured": false, 00:18:06.580 "data_offset": 2048, 00:18:06.580 "data_size": 63488 00:18:06.580 }, 00:18:06.580 { 00:18:06.580 "name": "BaseBdev3", 00:18:06.580 "uuid": "9b1764cf-2466-404f-b97f-6c319fc51994", 00:18:06.580 "is_configured": true, 00:18:06.580 "data_offset": 2048, 00:18:06.580 "data_size": 63488 00:18:06.580 }, 00:18:06.580 { 00:18:06.580 "name": "BaseBdev4", 00:18:06.580 "uuid": "19096961-7587-4893-bfa1-af7f7e7738c1", 00:18:06.580 "is_configured": true, 00:18:06.580 "data_offset": 2048, 00:18:06.580 "data_size": 63488 00:18:06.580 } 00:18:06.580 ] 00:18:06.580 }' 00:18:06.580 17:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:06.580 17:30:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:07.148 17:30:18 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:07.148 17:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:07.407 17:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:07.407 17:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:07.407 [2024-07-15 17:30:18.661815] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:07.407 BaseBdev1 00:18:07.407 17:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:07.407 17:30:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:07.407 17:30:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:07.407 17:30:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:07.407 17:30:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:07.407 17:30:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:07.407 17:30:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:07.666 17:30:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:07.925 [ 00:18:07.925 { 00:18:07.925 "name": "BaseBdev1", 00:18:07.925 "aliases": [ 00:18:07.925 "b49f59a8-c87f-4337-95cf-045655adc569" 00:18:07.925 ], 00:18:07.925 "product_name": "Malloc disk", 00:18:07.925 "block_size": 512, 00:18:07.925 "num_blocks": 65536, 00:18:07.925 "uuid": "b49f59a8-c87f-4337-95cf-045655adc569", 00:18:07.925 "assigned_rate_limits": { 00:18:07.925 "rw_ios_per_sec": 0, 00:18:07.925 "rw_mbytes_per_sec": 0, 00:18:07.925 "r_mbytes_per_sec": 0, 00:18:07.925 "w_mbytes_per_sec": 0 00:18:07.925 }, 00:18:07.925 "claimed": true, 00:18:07.925 "claim_type": "exclusive_write", 00:18:07.925 "zoned": false, 00:18:07.925 "supported_io_types": { 00:18:07.925 "read": true, 00:18:07.925 "write": true, 00:18:07.925 "unmap": true, 00:18:07.925 "flush": true, 00:18:07.925 "reset": true, 00:18:07.925 "nvme_admin": false, 00:18:07.925 "nvme_io": false, 00:18:07.925 "nvme_io_md": false, 00:18:07.926 "write_zeroes": true, 00:18:07.926 "zcopy": true, 00:18:07.926 "get_zone_info": false, 00:18:07.926 "zone_management": false, 00:18:07.926 "zone_append": false, 00:18:07.926 "compare": false, 00:18:07.926 "compare_and_write": false, 00:18:07.926 "abort": true, 00:18:07.926 "seek_hole": false, 00:18:07.926 "seek_data": false, 00:18:07.926 "copy": true, 00:18:07.926 "nvme_iov_md": false 00:18:07.926 }, 00:18:07.926 "memory_domains": [ 00:18:07.926 { 00:18:07.926 "dma_device_id": "system", 00:18:07.926 "dma_device_type": 1 00:18:07.926 }, 00:18:07.926 { 00:18:07.926 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:07.926 "dma_device_type": 2 00:18:07.926 } 00:18:07.926 ], 00:18:07.926 "driver_specific": {} 00:18:07.926 } 00:18:07.926 ] 00:18:07.926 
17:30:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:07.926 17:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:07.926 17:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:07.926 17:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:07.926 17:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:07.926 17:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:07.926 17:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:07.926 17:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:07.926 17:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:07.926 17:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:07.926 17:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:07.926 17:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:07.926 17:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:08.184 17:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:08.184 "name": "Existed_Raid", 00:18:08.184 "uuid": "10f6b2d1-3753-41b1-b0a0-e19636cd0294", 00:18:08.184 "strip_size_kb": 64, 00:18:08.184 "state": "configuring", 00:18:08.184 "raid_level": "concat", 00:18:08.184 "superblock": true, 00:18:08.184 "num_base_bdevs": 4, 00:18:08.184 "num_base_bdevs_discovered": 3, 00:18:08.184 "num_base_bdevs_operational": 4, 00:18:08.184 "base_bdevs_list": [ 00:18:08.184 { 00:18:08.184 "name": "BaseBdev1", 00:18:08.184 "uuid": "b49f59a8-c87f-4337-95cf-045655adc569", 00:18:08.184 "is_configured": true, 00:18:08.184 "data_offset": 2048, 00:18:08.184 "data_size": 63488 00:18:08.184 }, 00:18:08.184 { 00:18:08.184 "name": null, 00:18:08.184 "uuid": "a73d1240-a1fc-4df0-a881-74482809a3e4", 00:18:08.184 "is_configured": false, 00:18:08.184 "data_offset": 2048, 00:18:08.184 "data_size": 63488 00:18:08.184 }, 00:18:08.184 { 00:18:08.185 "name": "BaseBdev3", 00:18:08.185 "uuid": "9b1764cf-2466-404f-b97f-6c319fc51994", 00:18:08.185 "is_configured": true, 00:18:08.185 "data_offset": 2048, 00:18:08.185 "data_size": 63488 00:18:08.185 }, 00:18:08.185 { 00:18:08.185 "name": "BaseBdev4", 00:18:08.185 "uuid": "19096961-7587-4893-bfa1-af7f7e7738c1", 00:18:08.185 "is_configured": true, 00:18:08.185 "data_offset": 2048, 00:18:08.185 "data_size": 63488 00:18:08.185 } 00:18:08.185 ] 00:18:08.185 }' 00:18:08.185 17:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:08.185 17:30:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:08.754 17:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:08.754 17:30:19 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:08.754 17:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:08.754 17:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:09.015 [2024-07-15 17:30:20.117524] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:09.015 17:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:09.015 17:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:09.015 17:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:09.015 17:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:09.015 17:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:09.015 17:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:09.015 17:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:09.015 17:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:09.015 17:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:09.015 17:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:09.015 17:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:09.015 17:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:09.275 17:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:09.275 "name": "Existed_Raid", 00:18:09.275 "uuid": "10f6b2d1-3753-41b1-b0a0-e19636cd0294", 00:18:09.275 "strip_size_kb": 64, 00:18:09.275 "state": "configuring", 00:18:09.275 "raid_level": "concat", 00:18:09.275 "superblock": true, 00:18:09.275 "num_base_bdevs": 4, 00:18:09.275 "num_base_bdevs_discovered": 2, 00:18:09.275 "num_base_bdevs_operational": 4, 00:18:09.275 "base_bdevs_list": [ 00:18:09.275 { 00:18:09.275 "name": "BaseBdev1", 00:18:09.275 "uuid": "b49f59a8-c87f-4337-95cf-045655adc569", 00:18:09.275 "is_configured": true, 00:18:09.275 "data_offset": 2048, 00:18:09.275 "data_size": 63488 00:18:09.275 }, 00:18:09.275 { 00:18:09.275 "name": null, 00:18:09.275 "uuid": "a73d1240-a1fc-4df0-a881-74482809a3e4", 00:18:09.275 "is_configured": false, 00:18:09.275 "data_offset": 2048, 00:18:09.275 "data_size": 63488 00:18:09.275 }, 00:18:09.275 { 00:18:09.275 "name": null, 00:18:09.275 "uuid": "9b1764cf-2466-404f-b97f-6c319fc51994", 00:18:09.275 "is_configured": false, 00:18:09.275 "data_offset": 2048, 00:18:09.275 "data_size": 63488 00:18:09.275 }, 00:18:09.275 { 00:18:09.275 "name": "BaseBdev4", 00:18:09.275 "uuid": "19096961-7587-4893-bfa1-af7f7e7738c1", 00:18:09.275 "is_configured": true, 00:18:09.275 "data_offset": 2048, 00:18:09.275 "data_size": 63488 00:18:09.275 } 00:18:09.275 ] 00:18:09.275 }' 00:18:09.275 17:30:20 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:09.275 17:30:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:09.846 17:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:09.846 17:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:09.846 17:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:09.846 17:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:10.107 [2024-07-15 17:30:21.216316] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:10.107 17:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:10.107 17:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:10.107 17:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:10.107 17:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:10.107 17:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:10.107 17:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:10.107 17:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:10.107 17:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:10.107 17:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:10.107 17:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:10.107 17:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:10.107 17:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:10.367 17:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:10.367 "name": "Existed_Raid", 00:18:10.367 "uuid": "10f6b2d1-3753-41b1-b0a0-e19636cd0294", 00:18:10.367 "strip_size_kb": 64, 00:18:10.367 "state": "configuring", 00:18:10.367 "raid_level": "concat", 00:18:10.367 "superblock": true, 00:18:10.367 "num_base_bdevs": 4, 00:18:10.367 "num_base_bdevs_discovered": 3, 00:18:10.367 "num_base_bdevs_operational": 4, 00:18:10.367 "base_bdevs_list": [ 00:18:10.367 { 00:18:10.367 "name": "BaseBdev1", 00:18:10.367 "uuid": "b49f59a8-c87f-4337-95cf-045655adc569", 00:18:10.367 "is_configured": true, 00:18:10.367 "data_offset": 2048, 00:18:10.367 "data_size": 63488 00:18:10.367 }, 00:18:10.367 { 00:18:10.367 "name": null, 00:18:10.367 "uuid": "a73d1240-a1fc-4df0-a881-74482809a3e4", 00:18:10.367 "is_configured": false, 00:18:10.367 "data_offset": 2048, 00:18:10.367 "data_size": 63488 00:18:10.367 }, 00:18:10.367 { 00:18:10.367 "name": "BaseBdev3", 00:18:10.367 "uuid": "9b1764cf-2466-404f-b97f-6c319fc51994", 
00:18:10.367 "is_configured": true, 00:18:10.367 "data_offset": 2048, 00:18:10.367 "data_size": 63488 00:18:10.367 }, 00:18:10.367 { 00:18:10.367 "name": "BaseBdev4", 00:18:10.367 "uuid": "19096961-7587-4893-bfa1-af7f7e7738c1", 00:18:10.367 "is_configured": true, 00:18:10.367 "data_offset": 2048, 00:18:10.367 "data_size": 63488 00:18:10.367 } 00:18:10.367 ] 00:18:10.367 }' 00:18:10.367 17:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:10.367 17:30:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:10.939 17:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:10.939 17:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:10.939 17:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:10.939 17:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:11.199 [2024-07-15 17:30:22.335163] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:11.199 17:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:11.199 17:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:11.199 17:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:11.199 17:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:11.199 17:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:11.199 17:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:11.199 17:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:11.199 17:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:11.199 17:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:11.199 17:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:11.199 17:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:11.199 17:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:11.461 17:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:11.461 "name": "Existed_Raid", 00:18:11.461 "uuid": "10f6b2d1-3753-41b1-b0a0-e19636cd0294", 00:18:11.461 "strip_size_kb": 64, 00:18:11.461 "state": "configuring", 00:18:11.461 "raid_level": "concat", 00:18:11.461 "superblock": true, 00:18:11.461 "num_base_bdevs": 4, 00:18:11.461 "num_base_bdevs_discovered": 2, 00:18:11.461 "num_base_bdevs_operational": 4, 00:18:11.461 "base_bdevs_list": [ 00:18:11.461 { 00:18:11.461 "name": null, 00:18:11.461 "uuid": "b49f59a8-c87f-4337-95cf-045655adc569", 00:18:11.461 "is_configured": false, 00:18:11.461 "data_offset": 
2048, 00:18:11.461 "data_size": 63488 00:18:11.461 }, 00:18:11.461 { 00:18:11.461 "name": null, 00:18:11.461 "uuid": "a73d1240-a1fc-4df0-a881-74482809a3e4", 00:18:11.461 "is_configured": false, 00:18:11.461 "data_offset": 2048, 00:18:11.461 "data_size": 63488 00:18:11.461 }, 00:18:11.461 { 00:18:11.461 "name": "BaseBdev3", 00:18:11.461 "uuid": "9b1764cf-2466-404f-b97f-6c319fc51994", 00:18:11.461 "is_configured": true, 00:18:11.461 "data_offset": 2048, 00:18:11.461 "data_size": 63488 00:18:11.461 }, 00:18:11.461 { 00:18:11.461 "name": "BaseBdev4", 00:18:11.461 "uuid": "19096961-7587-4893-bfa1-af7f7e7738c1", 00:18:11.461 "is_configured": true, 00:18:11.461 "data_offset": 2048, 00:18:11.461 "data_size": 63488 00:18:11.461 } 00:18:11.461 ] 00:18:11.461 }' 00:18:11.461 17:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:11.461 17:30:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:12.032 17:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:12.032 17:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:12.032 17:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:12.032 17:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:12.292 [2024-07-15 17:30:23.475834] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:12.292 17:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:12.292 17:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:12.292 17:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:12.292 17:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:12.292 17:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:12.292 17:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:12.292 17:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:12.292 17:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:12.292 17:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:12.292 17:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:12.292 17:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:12.292 17:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:12.552 17:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:12.552 "name": "Existed_Raid", 00:18:12.552 "uuid": "10f6b2d1-3753-41b1-b0a0-e19636cd0294", 00:18:12.552 "strip_size_kb": 64, 
00:18:12.552 "state": "configuring", 00:18:12.552 "raid_level": "concat", 00:18:12.552 "superblock": true, 00:18:12.552 "num_base_bdevs": 4, 00:18:12.552 "num_base_bdevs_discovered": 3, 00:18:12.552 "num_base_bdevs_operational": 4, 00:18:12.552 "base_bdevs_list": [ 00:18:12.552 { 00:18:12.552 "name": null, 00:18:12.552 "uuid": "b49f59a8-c87f-4337-95cf-045655adc569", 00:18:12.552 "is_configured": false, 00:18:12.552 "data_offset": 2048, 00:18:12.552 "data_size": 63488 00:18:12.552 }, 00:18:12.552 { 00:18:12.552 "name": "BaseBdev2", 00:18:12.552 "uuid": "a73d1240-a1fc-4df0-a881-74482809a3e4", 00:18:12.552 "is_configured": true, 00:18:12.552 "data_offset": 2048, 00:18:12.552 "data_size": 63488 00:18:12.552 }, 00:18:12.552 { 00:18:12.552 "name": "BaseBdev3", 00:18:12.552 "uuid": "9b1764cf-2466-404f-b97f-6c319fc51994", 00:18:12.552 "is_configured": true, 00:18:12.552 "data_offset": 2048, 00:18:12.552 "data_size": 63488 00:18:12.552 }, 00:18:12.552 { 00:18:12.552 "name": "BaseBdev4", 00:18:12.552 "uuid": "19096961-7587-4893-bfa1-af7f7e7738c1", 00:18:12.552 "is_configured": true, 00:18:12.552 "data_offset": 2048, 00:18:12.552 "data_size": 63488 00:18:12.552 } 00:18:12.552 ] 00:18:12.552 }' 00:18:12.552 17:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:12.552 17:30:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:13.121 17:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:13.121 17:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:13.382 17:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:13.382 17:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:13.382 17:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:13.382 17:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u b49f59a8-c87f-4337-95cf-045655adc569 00:18:13.642 [2024-07-15 17:30:24.808237] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:13.642 [2024-07-15 17:30:24.808355] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26c7080 00:18:13.642 [2024-07-15 17:30:24.808363] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:13.642 [2024-07-15 17:30:24.808507] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26bcc30 00:18:13.642 [2024-07-15 17:30:24.808597] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26c7080 00:18:13.642 [2024-07-15 17:30:24.808602] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x26c7080 00:18:13.642 [2024-07-15 17:30:24.808667] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:13.642 NewBaseBdev 00:18:13.642 17:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:13.642 17:30:24 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:18:13.642 17:30:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:13.642 17:30:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:13.642 17:30:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:13.642 17:30:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:13.642 17:30:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:13.902 17:30:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:13.902 [ 00:18:13.902 { 00:18:13.902 "name": "NewBaseBdev", 00:18:13.902 "aliases": [ 00:18:13.902 "b49f59a8-c87f-4337-95cf-045655adc569" 00:18:13.902 ], 00:18:13.902 "product_name": "Malloc disk", 00:18:13.902 "block_size": 512, 00:18:13.902 "num_blocks": 65536, 00:18:13.902 "uuid": "b49f59a8-c87f-4337-95cf-045655adc569", 00:18:13.902 "assigned_rate_limits": { 00:18:13.902 "rw_ios_per_sec": 0, 00:18:13.902 "rw_mbytes_per_sec": 0, 00:18:13.902 "r_mbytes_per_sec": 0, 00:18:13.902 "w_mbytes_per_sec": 0 00:18:13.902 }, 00:18:13.902 "claimed": true, 00:18:13.902 "claim_type": "exclusive_write", 00:18:13.902 "zoned": false, 00:18:13.902 "supported_io_types": { 00:18:13.903 "read": true, 00:18:13.903 "write": true, 00:18:13.903 "unmap": true, 00:18:13.903 "flush": true, 00:18:13.903 "reset": true, 00:18:13.903 "nvme_admin": false, 00:18:13.903 "nvme_io": false, 00:18:13.903 "nvme_io_md": false, 00:18:13.903 "write_zeroes": true, 00:18:13.903 "zcopy": true, 00:18:13.903 "get_zone_info": false, 00:18:13.903 "zone_management": false, 00:18:13.903 "zone_append": false, 00:18:13.903 "compare": false, 00:18:13.903 "compare_and_write": false, 00:18:13.903 "abort": true, 00:18:13.903 "seek_hole": false, 00:18:13.903 "seek_data": false, 00:18:13.903 "copy": true, 00:18:13.903 "nvme_iov_md": false 00:18:13.903 }, 00:18:13.903 "memory_domains": [ 00:18:13.903 { 00:18:13.903 "dma_device_id": "system", 00:18:13.903 "dma_device_type": 1 00:18:13.903 }, 00:18:13.903 { 00:18:13.903 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:13.903 "dma_device_type": 2 00:18:13.903 } 00:18:13.903 ], 00:18:13.903 "driver_specific": {} 00:18:13.903 } 00:18:13.903 ] 00:18:13.903 17:30:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:14.162 17:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:18:14.162 17:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:14.162 17:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:14.162 17:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:14.162 17:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:14.162 17:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:14.162 17:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:18:14.163 17:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:14.163 17:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:14.163 17:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:14.163 17:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:14.163 17:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:14.163 17:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:14.163 "name": "Existed_Raid", 00:18:14.163 "uuid": "10f6b2d1-3753-41b1-b0a0-e19636cd0294", 00:18:14.163 "strip_size_kb": 64, 00:18:14.163 "state": "online", 00:18:14.163 "raid_level": "concat", 00:18:14.163 "superblock": true, 00:18:14.163 "num_base_bdevs": 4, 00:18:14.163 "num_base_bdevs_discovered": 4, 00:18:14.163 "num_base_bdevs_operational": 4, 00:18:14.163 "base_bdevs_list": [ 00:18:14.163 { 00:18:14.163 "name": "NewBaseBdev", 00:18:14.163 "uuid": "b49f59a8-c87f-4337-95cf-045655adc569", 00:18:14.163 "is_configured": true, 00:18:14.163 "data_offset": 2048, 00:18:14.163 "data_size": 63488 00:18:14.163 }, 00:18:14.163 { 00:18:14.163 "name": "BaseBdev2", 00:18:14.163 "uuid": "a73d1240-a1fc-4df0-a881-74482809a3e4", 00:18:14.163 "is_configured": true, 00:18:14.163 "data_offset": 2048, 00:18:14.163 "data_size": 63488 00:18:14.163 }, 00:18:14.163 { 00:18:14.163 "name": "BaseBdev3", 00:18:14.163 "uuid": "9b1764cf-2466-404f-b97f-6c319fc51994", 00:18:14.163 "is_configured": true, 00:18:14.163 "data_offset": 2048, 00:18:14.163 "data_size": 63488 00:18:14.163 }, 00:18:14.163 { 00:18:14.163 "name": "BaseBdev4", 00:18:14.163 "uuid": "19096961-7587-4893-bfa1-af7f7e7738c1", 00:18:14.163 "is_configured": true, 00:18:14.163 "data_offset": 2048, 00:18:14.163 "data_size": 63488 00:18:14.163 } 00:18:14.163 ] 00:18:14.163 }' 00:18:14.163 17:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:14.163 17:30:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:14.733 17:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:14.733 17:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:14.733 17:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:14.733 17:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:14.734 17:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:14.734 17:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:14.734 17:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:14.734 17:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:14.993 [2024-07-15 17:30:26.115814] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:14.993 17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # 
raid_bdev_info='{ 00:18:14.993 "name": "Existed_Raid", 00:18:14.993 "aliases": [ 00:18:14.993 "10f6b2d1-3753-41b1-b0a0-e19636cd0294" 00:18:14.993 ], 00:18:14.993 "product_name": "Raid Volume", 00:18:14.993 "block_size": 512, 00:18:14.993 "num_blocks": 253952, 00:18:14.993 "uuid": "10f6b2d1-3753-41b1-b0a0-e19636cd0294", 00:18:14.993 "assigned_rate_limits": { 00:18:14.993 "rw_ios_per_sec": 0, 00:18:14.993 "rw_mbytes_per_sec": 0, 00:18:14.994 "r_mbytes_per_sec": 0, 00:18:14.994 "w_mbytes_per_sec": 0 00:18:14.994 }, 00:18:14.994 "claimed": false, 00:18:14.994 "zoned": false, 00:18:14.994 "supported_io_types": { 00:18:14.994 "read": true, 00:18:14.994 "write": true, 00:18:14.994 "unmap": true, 00:18:14.994 "flush": true, 00:18:14.994 "reset": true, 00:18:14.994 "nvme_admin": false, 00:18:14.994 "nvme_io": false, 00:18:14.994 "nvme_io_md": false, 00:18:14.994 "write_zeroes": true, 00:18:14.994 "zcopy": false, 00:18:14.994 "get_zone_info": false, 00:18:14.994 "zone_management": false, 00:18:14.994 "zone_append": false, 00:18:14.994 "compare": false, 00:18:14.994 "compare_and_write": false, 00:18:14.994 "abort": false, 00:18:14.994 "seek_hole": false, 00:18:14.994 "seek_data": false, 00:18:14.994 "copy": false, 00:18:14.994 "nvme_iov_md": false 00:18:14.994 }, 00:18:14.994 "memory_domains": [ 00:18:14.994 { 00:18:14.994 "dma_device_id": "system", 00:18:14.994 "dma_device_type": 1 00:18:14.994 }, 00:18:14.994 { 00:18:14.994 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:14.994 "dma_device_type": 2 00:18:14.994 }, 00:18:14.994 { 00:18:14.994 "dma_device_id": "system", 00:18:14.994 "dma_device_type": 1 00:18:14.994 }, 00:18:14.994 { 00:18:14.994 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:14.994 "dma_device_type": 2 00:18:14.994 }, 00:18:14.994 { 00:18:14.994 "dma_device_id": "system", 00:18:14.994 "dma_device_type": 1 00:18:14.994 }, 00:18:14.994 { 00:18:14.994 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:14.994 "dma_device_type": 2 00:18:14.994 }, 00:18:14.994 { 00:18:14.994 "dma_device_id": "system", 00:18:14.994 "dma_device_type": 1 00:18:14.994 }, 00:18:14.994 { 00:18:14.994 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:14.994 "dma_device_type": 2 00:18:14.994 } 00:18:14.994 ], 00:18:14.994 "driver_specific": { 00:18:14.994 "raid": { 00:18:14.994 "uuid": "10f6b2d1-3753-41b1-b0a0-e19636cd0294", 00:18:14.994 "strip_size_kb": 64, 00:18:14.994 "state": "online", 00:18:14.994 "raid_level": "concat", 00:18:14.994 "superblock": true, 00:18:14.994 "num_base_bdevs": 4, 00:18:14.994 "num_base_bdevs_discovered": 4, 00:18:14.994 "num_base_bdevs_operational": 4, 00:18:14.994 "base_bdevs_list": [ 00:18:14.994 { 00:18:14.994 "name": "NewBaseBdev", 00:18:14.994 "uuid": "b49f59a8-c87f-4337-95cf-045655adc569", 00:18:14.994 "is_configured": true, 00:18:14.994 "data_offset": 2048, 00:18:14.994 "data_size": 63488 00:18:14.994 }, 00:18:14.994 { 00:18:14.994 "name": "BaseBdev2", 00:18:14.994 "uuid": "a73d1240-a1fc-4df0-a881-74482809a3e4", 00:18:14.994 "is_configured": true, 00:18:14.994 "data_offset": 2048, 00:18:14.994 "data_size": 63488 00:18:14.994 }, 00:18:14.994 { 00:18:14.994 "name": "BaseBdev3", 00:18:14.994 "uuid": "9b1764cf-2466-404f-b97f-6c319fc51994", 00:18:14.994 "is_configured": true, 00:18:14.994 "data_offset": 2048, 00:18:14.994 "data_size": 63488 00:18:14.994 }, 00:18:14.994 { 00:18:14.994 "name": "BaseBdev4", 00:18:14.994 "uuid": "19096961-7587-4893-bfa1-af7f7e7738c1", 00:18:14.994 "is_configured": true, 00:18:14.994 "data_offset": 2048, 00:18:14.994 "data_size": 63488 00:18:14.994 } 
00:18:14.994 ] 00:18:14.994 } 00:18:14.994 } 00:18:14.994 }' 00:18:14.994 17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:14.994 17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:14.994 BaseBdev2 00:18:14.994 BaseBdev3 00:18:14.994 BaseBdev4' 00:18:14.994 17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:14.994 17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:14.994 17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:15.255 17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:15.255 "name": "NewBaseBdev", 00:18:15.255 "aliases": [ 00:18:15.255 "b49f59a8-c87f-4337-95cf-045655adc569" 00:18:15.255 ], 00:18:15.255 "product_name": "Malloc disk", 00:18:15.255 "block_size": 512, 00:18:15.255 "num_blocks": 65536, 00:18:15.255 "uuid": "b49f59a8-c87f-4337-95cf-045655adc569", 00:18:15.255 "assigned_rate_limits": { 00:18:15.255 "rw_ios_per_sec": 0, 00:18:15.255 "rw_mbytes_per_sec": 0, 00:18:15.255 "r_mbytes_per_sec": 0, 00:18:15.255 "w_mbytes_per_sec": 0 00:18:15.255 }, 00:18:15.255 "claimed": true, 00:18:15.255 "claim_type": "exclusive_write", 00:18:15.255 "zoned": false, 00:18:15.255 "supported_io_types": { 00:18:15.255 "read": true, 00:18:15.255 "write": true, 00:18:15.255 "unmap": true, 00:18:15.255 "flush": true, 00:18:15.255 "reset": true, 00:18:15.255 "nvme_admin": false, 00:18:15.255 "nvme_io": false, 00:18:15.255 "nvme_io_md": false, 00:18:15.255 "write_zeroes": true, 00:18:15.255 "zcopy": true, 00:18:15.255 "get_zone_info": false, 00:18:15.255 "zone_management": false, 00:18:15.255 "zone_append": false, 00:18:15.255 "compare": false, 00:18:15.255 "compare_and_write": false, 00:18:15.255 "abort": true, 00:18:15.255 "seek_hole": false, 00:18:15.255 "seek_data": false, 00:18:15.255 "copy": true, 00:18:15.255 "nvme_iov_md": false 00:18:15.255 }, 00:18:15.255 "memory_domains": [ 00:18:15.255 { 00:18:15.255 "dma_device_id": "system", 00:18:15.255 "dma_device_type": 1 00:18:15.255 }, 00:18:15.255 { 00:18:15.255 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:15.255 "dma_device_type": 2 00:18:15.255 } 00:18:15.255 ], 00:18:15.255 "driver_specific": {} 00:18:15.255 }' 00:18:15.255 17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:15.255 17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:15.255 17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:15.255 17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:15.255 17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:15.515 17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:15.515 17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:15.515 17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:15.515 17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:15.515 
17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:15.515 17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:15.516 17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:15.516 17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:15.516 17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:15.516 17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:15.775 17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:15.775 "name": "BaseBdev2", 00:18:15.775 "aliases": [ 00:18:15.775 "a73d1240-a1fc-4df0-a881-74482809a3e4" 00:18:15.775 ], 00:18:15.775 "product_name": "Malloc disk", 00:18:15.775 "block_size": 512, 00:18:15.775 "num_blocks": 65536, 00:18:15.775 "uuid": "a73d1240-a1fc-4df0-a881-74482809a3e4", 00:18:15.775 "assigned_rate_limits": { 00:18:15.775 "rw_ios_per_sec": 0, 00:18:15.775 "rw_mbytes_per_sec": 0, 00:18:15.775 "r_mbytes_per_sec": 0, 00:18:15.775 "w_mbytes_per_sec": 0 00:18:15.775 }, 00:18:15.775 "claimed": true, 00:18:15.775 "claim_type": "exclusive_write", 00:18:15.775 "zoned": false, 00:18:15.775 "supported_io_types": { 00:18:15.775 "read": true, 00:18:15.775 "write": true, 00:18:15.775 "unmap": true, 00:18:15.776 "flush": true, 00:18:15.776 "reset": true, 00:18:15.776 "nvme_admin": false, 00:18:15.776 "nvme_io": false, 00:18:15.776 "nvme_io_md": false, 00:18:15.776 "write_zeroes": true, 00:18:15.776 "zcopy": true, 00:18:15.776 "get_zone_info": false, 00:18:15.776 "zone_management": false, 00:18:15.776 "zone_append": false, 00:18:15.776 "compare": false, 00:18:15.776 "compare_and_write": false, 00:18:15.776 "abort": true, 00:18:15.776 "seek_hole": false, 00:18:15.776 "seek_data": false, 00:18:15.776 "copy": true, 00:18:15.776 "nvme_iov_md": false 00:18:15.776 }, 00:18:15.776 "memory_domains": [ 00:18:15.776 { 00:18:15.776 "dma_device_id": "system", 00:18:15.776 "dma_device_type": 1 00:18:15.776 }, 00:18:15.776 { 00:18:15.776 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:15.776 "dma_device_type": 2 00:18:15.776 } 00:18:15.776 ], 00:18:15.776 "driver_specific": {} 00:18:15.776 }' 00:18:15.776 17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:15.776 17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:15.776 17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:15.776 17:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:15.776 17:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:16.035 17:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:16.035 17:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:16.035 17:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:16.035 17:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:16.035 17:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:16.035 17:30:27 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:16.035 17:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:16.035 17:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:16.035 17:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:16.035 17:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:16.296 17:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:16.296 "name": "BaseBdev3", 00:18:16.296 "aliases": [ 00:18:16.296 "9b1764cf-2466-404f-b97f-6c319fc51994" 00:18:16.296 ], 00:18:16.296 "product_name": "Malloc disk", 00:18:16.296 "block_size": 512, 00:18:16.296 "num_blocks": 65536, 00:18:16.296 "uuid": "9b1764cf-2466-404f-b97f-6c319fc51994", 00:18:16.296 "assigned_rate_limits": { 00:18:16.296 "rw_ios_per_sec": 0, 00:18:16.296 "rw_mbytes_per_sec": 0, 00:18:16.296 "r_mbytes_per_sec": 0, 00:18:16.296 "w_mbytes_per_sec": 0 00:18:16.296 }, 00:18:16.296 "claimed": true, 00:18:16.296 "claim_type": "exclusive_write", 00:18:16.296 "zoned": false, 00:18:16.296 "supported_io_types": { 00:18:16.296 "read": true, 00:18:16.296 "write": true, 00:18:16.296 "unmap": true, 00:18:16.296 "flush": true, 00:18:16.296 "reset": true, 00:18:16.296 "nvme_admin": false, 00:18:16.296 "nvme_io": false, 00:18:16.296 "nvme_io_md": false, 00:18:16.296 "write_zeroes": true, 00:18:16.296 "zcopy": true, 00:18:16.296 "get_zone_info": false, 00:18:16.296 "zone_management": false, 00:18:16.296 "zone_append": false, 00:18:16.296 "compare": false, 00:18:16.296 "compare_and_write": false, 00:18:16.296 "abort": true, 00:18:16.296 "seek_hole": false, 00:18:16.296 "seek_data": false, 00:18:16.296 "copy": true, 00:18:16.296 "nvme_iov_md": false 00:18:16.296 }, 00:18:16.296 "memory_domains": [ 00:18:16.296 { 00:18:16.296 "dma_device_id": "system", 00:18:16.296 "dma_device_type": 1 00:18:16.296 }, 00:18:16.296 { 00:18:16.296 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:16.296 "dma_device_type": 2 00:18:16.296 } 00:18:16.296 ], 00:18:16.296 "driver_specific": {} 00:18:16.296 }' 00:18:16.296 17:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:16.296 17:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:16.296 17:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:16.296 17:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:16.556 17:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:16.556 17:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:16.556 17:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:16.556 17:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:16.556 17:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:16.556 17:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:16.556 17:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:16.556 17:30:27 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:16.556 17:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:16.556 17:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:16.556 17:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:16.816 17:30:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:16.816 "name": "BaseBdev4", 00:18:16.816 "aliases": [ 00:18:16.816 "19096961-7587-4893-bfa1-af7f7e7738c1" 00:18:16.816 ], 00:18:16.816 "product_name": "Malloc disk", 00:18:16.816 "block_size": 512, 00:18:16.816 "num_blocks": 65536, 00:18:16.816 "uuid": "19096961-7587-4893-bfa1-af7f7e7738c1", 00:18:16.816 "assigned_rate_limits": { 00:18:16.816 "rw_ios_per_sec": 0, 00:18:16.816 "rw_mbytes_per_sec": 0, 00:18:16.816 "r_mbytes_per_sec": 0, 00:18:16.816 "w_mbytes_per_sec": 0 00:18:16.816 }, 00:18:16.816 "claimed": true, 00:18:16.816 "claim_type": "exclusive_write", 00:18:16.816 "zoned": false, 00:18:16.816 "supported_io_types": { 00:18:16.816 "read": true, 00:18:16.816 "write": true, 00:18:16.816 "unmap": true, 00:18:16.816 "flush": true, 00:18:16.816 "reset": true, 00:18:16.816 "nvme_admin": false, 00:18:16.816 "nvme_io": false, 00:18:16.816 "nvme_io_md": false, 00:18:16.816 "write_zeroes": true, 00:18:16.816 "zcopy": true, 00:18:16.816 "get_zone_info": false, 00:18:16.816 "zone_management": false, 00:18:16.816 "zone_append": false, 00:18:16.816 "compare": false, 00:18:16.816 "compare_and_write": false, 00:18:16.816 "abort": true, 00:18:16.816 "seek_hole": false, 00:18:16.816 "seek_data": false, 00:18:16.816 "copy": true, 00:18:16.816 "nvme_iov_md": false 00:18:16.816 }, 00:18:16.816 "memory_domains": [ 00:18:16.816 { 00:18:16.816 "dma_device_id": "system", 00:18:16.816 "dma_device_type": 1 00:18:16.816 }, 00:18:16.816 { 00:18:16.816 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:16.816 "dma_device_type": 2 00:18:16.816 } 00:18:16.816 ], 00:18:16.816 "driver_specific": {} 00:18:16.816 }' 00:18:16.816 17:30:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:16.816 17:30:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:16.816 17:30:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:16.816 17:30:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:17.076 17:30:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:17.076 17:30:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:17.076 17:30:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:17.076 17:30:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:17.076 17:30:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:17.076 17:30:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:17.076 17:30:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:17.076 17:30:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:17.076 17:30:28 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:17.335 [2024-07-15 17:30:28.537705] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:17.335 [2024-07-15 17:30:28.537735] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:17.335 [2024-07-15 17:30:28.537773] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:17.335 [2024-07-15 17:30:28.537817] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:17.335 [2024-07-15 17:30:28.537823] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26c7080 name Existed_Raid, state offline 00:18:17.335 17:30:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2827762 00:18:17.335 17:30:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2827762 ']' 00:18:17.335 17:30:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2827762 00:18:17.335 17:30:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:18:17.335 17:30:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:17.335 17:30:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2827762 00:18:17.335 17:30:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:17.336 17:30:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:17.336 17:30:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2827762' 00:18:17.336 killing process with pid 2827762 00:18:17.336 17:30:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2827762 00:18:17.336 [2024-07-15 17:30:28.604976] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:17.336 17:30:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2827762 00:18:17.336 [2024-07-15 17:30:28.625373] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:17.596 17:30:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:18:17.596 00:18:17.596 real 0m27.392s 00:18:17.596 user 0m51.372s 00:18:17.596 sys 0m4.010s 00:18:17.596 17:30:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:17.596 17:30:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:17.596 ************************************ 00:18:17.596 END TEST raid_state_function_test_sb 00:18:17.596 ************************************ 00:18:17.596 17:30:28 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:17.596 17:30:28 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 4 00:18:17.596 17:30:28 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:18:17.596 17:30:28 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:17.596 17:30:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:17.596 ************************************ 00:18:17.596 START TEST raid_superblock_test 00:18:17.596 
************************************ 00:18:17.596 17:30:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 4 00:18:17.596 17:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:18:17.596 17:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:18:17.596 17:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:18:17.596 17:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:18:17.596 17:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:18:17.596 17:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:18:17.596 17:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:18:17.596 17:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:18:17.596 17:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:18:17.596 17:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:18:17.596 17:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:18:17.596 17:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:18:17.596 17:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:18:17.596 17:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:18:17.596 17:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:18:17.597 17:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:18:17.597 17:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2833481 00:18:17.597 17:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2833481 /var/tmp/spdk-raid.sock 00:18:17.597 17:30:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2833481 ']' 00:18:17.597 17:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:18:17.597 17:30:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:17.597 17:30:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:17.597 17:30:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:17.597 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:17.597 17:30:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:17.597 17:30:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:17.597 [2024-07-15 17:30:28.881355] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:18:17.597 [2024-07-15 17:30:28.881407] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2833481 ] 00:18:17.858 [2024-07-15 17:30:28.971610] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:17.858 [2024-07-15 17:30:29.039680] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:17.858 [2024-07-15 17:30:29.088233] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:17.858 [2024-07-15 17:30:29.088259] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:18.466 17:30:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:18.466 17:30:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:18:18.466 17:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:18:18.466 17:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:18.467 17:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:18:18.467 17:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:18:18.467 17:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:18:18.467 17:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:18.467 17:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:18.467 17:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:18.467 17:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:18:18.728 malloc1 00:18:18.728 17:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:18.988 [2024-07-15 17:30:30.058655] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:18.988 [2024-07-15 17:30:30.058692] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:18.988 [2024-07-15 17:30:30.058706] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25a0a20 00:18:18.988 [2024-07-15 17:30:30.058717] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:18.988 [2024-07-15 17:30:30.060040] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:18.988 [2024-07-15 17:30:30.060061] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:18.988 pt1 00:18:18.988 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:18.988 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:18.988 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:18:18.988 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:18:18.988 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:18:18.988 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:18.988 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:18.988 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:18.988 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:18:18.988 malloc2 00:18:18.988 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:19.248 [2024-07-15 17:30:30.457497] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:19.248 [2024-07-15 17:30:30.457524] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:19.248 [2024-07-15 17:30:30.457535] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25a1040 00:18:19.248 [2024-07-15 17:30:30.457542] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:19.248 [2024-07-15 17:30:30.458731] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:19.248 [2024-07-15 17:30:30.458749] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:19.248 pt2 00:18:19.248 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:19.248 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:19.248 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:18:19.248 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:18:19.248 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:18:19.248 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:19.248 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:19.248 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:19.249 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:18:19.509 malloc3 00:18:19.509 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:19.769 [2024-07-15 17:30:30.840327] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:19.769 [2024-07-15 17:30:30.840352] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:19.769 [2024-07-15 17:30:30.840365] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25a1540 00:18:19.769 [2024-07-15 17:30:30.840371] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:19.769 [2024-07-15 17:30:30.841557] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:19.769 [2024-07-15 17:30:30.841575] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:19.769 pt3 00:18:19.769 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:19.769 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:19.769 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:18:19.769 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:18:19.769 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:18:19.769 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:19.769 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:19.769 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:19.769 17:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:18:19.769 malloc4 00:18:19.769 17:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:20.030 [2024-07-15 17:30:31.195050] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:20.030 [2024-07-15 17:30:31.195075] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:20.030 [2024-07-15 17:30:31.195084] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x274ed60 00:18:20.030 [2024-07-15 17:30:31.195090] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:20.030 [2024-07-15 17:30:31.196252] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:20.030 [2024-07-15 17:30:31.196271] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:20.030 pt4 00:18:20.030 17:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:20.030 17:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:20.030 17:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:18:20.291 [2024-07-15 17:30:31.383556] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:20.291 [2024-07-15 17:30:31.384560] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:20.291 [2024-07-15 17:30:31.384601] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:20.291 [2024-07-15 17:30:31.384634] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:20.291 [2024-07-15 17:30:31.384775] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x274be20 00:18:20.291 [2024-07-15 17:30:31.384782] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:20.291 [2024-07-15 17:30:31.384932] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25a2000 00:18:20.291 [2024-07-15 17:30:31.385043] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x274be20 00:18:20.291 [2024-07-15 17:30:31.385048] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x274be20 00:18:20.291 [2024-07-15 17:30:31.385116] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:20.291 17:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:18:20.291 17:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:20.291 17:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:20.291 17:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:20.291 17:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:20.291 17:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:20.291 17:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:20.291 17:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:20.291 17:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:20.291 17:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:20.291 17:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:20.291 17:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:20.551 17:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:20.551 "name": "raid_bdev1", 00:18:20.551 "uuid": "c387cbb9-5e4e-4f66-9816-bc894a2815da", 00:18:20.551 "strip_size_kb": 64, 00:18:20.551 "state": "online", 00:18:20.551 "raid_level": "concat", 00:18:20.551 "superblock": true, 00:18:20.551 "num_base_bdevs": 4, 00:18:20.551 "num_base_bdevs_discovered": 4, 00:18:20.551 "num_base_bdevs_operational": 4, 00:18:20.551 "base_bdevs_list": [ 00:18:20.551 { 00:18:20.551 "name": "pt1", 00:18:20.551 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:20.551 "is_configured": true, 00:18:20.551 "data_offset": 2048, 00:18:20.551 "data_size": 63488 00:18:20.551 }, 00:18:20.551 { 00:18:20.551 "name": "pt2", 00:18:20.551 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:20.551 "is_configured": true, 00:18:20.551 "data_offset": 2048, 00:18:20.551 "data_size": 63488 00:18:20.551 }, 00:18:20.551 { 00:18:20.551 "name": "pt3", 00:18:20.551 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:20.551 "is_configured": true, 00:18:20.551 "data_offset": 2048, 00:18:20.551 "data_size": 63488 00:18:20.551 }, 00:18:20.551 { 00:18:20.551 "name": "pt4", 00:18:20.551 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:20.551 "is_configured": true, 00:18:20.551 "data_offset": 2048, 00:18:20.551 "data_size": 63488 00:18:20.551 } 00:18:20.551 ] 00:18:20.551 }' 00:18:20.551 17:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:20.551 17:30:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:21.122 17:30:32 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:18:21.122 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:21.122 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:21.122 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:21.122 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:21.122 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:21.122 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:21.122 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:21.122 [2024-07-15 17:30:32.322151] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:21.122 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:21.122 "name": "raid_bdev1", 00:18:21.122 "aliases": [ 00:18:21.122 "c387cbb9-5e4e-4f66-9816-bc894a2815da" 00:18:21.122 ], 00:18:21.122 "product_name": "Raid Volume", 00:18:21.122 "block_size": 512, 00:18:21.122 "num_blocks": 253952, 00:18:21.122 "uuid": "c387cbb9-5e4e-4f66-9816-bc894a2815da", 00:18:21.122 "assigned_rate_limits": { 00:18:21.122 "rw_ios_per_sec": 0, 00:18:21.122 "rw_mbytes_per_sec": 0, 00:18:21.122 "r_mbytes_per_sec": 0, 00:18:21.122 "w_mbytes_per_sec": 0 00:18:21.122 }, 00:18:21.122 "claimed": false, 00:18:21.122 "zoned": false, 00:18:21.122 "supported_io_types": { 00:18:21.122 "read": true, 00:18:21.122 "write": true, 00:18:21.122 "unmap": true, 00:18:21.122 "flush": true, 00:18:21.122 "reset": true, 00:18:21.122 "nvme_admin": false, 00:18:21.122 "nvme_io": false, 00:18:21.122 "nvme_io_md": false, 00:18:21.122 "write_zeroes": true, 00:18:21.122 "zcopy": false, 00:18:21.122 "get_zone_info": false, 00:18:21.122 "zone_management": false, 00:18:21.122 "zone_append": false, 00:18:21.122 "compare": false, 00:18:21.122 "compare_and_write": false, 00:18:21.122 "abort": false, 00:18:21.122 "seek_hole": false, 00:18:21.122 "seek_data": false, 00:18:21.122 "copy": false, 00:18:21.122 "nvme_iov_md": false 00:18:21.122 }, 00:18:21.122 "memory_domains": [ 00:18:21.122 { 00:18:21.122 "dma_device_id": "system", 00:18:21.122 "dma_device_type": 1 00:18:21.122 }, 00:18:21.122 { 00:18:21.122 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:21.122 "dma_device_type": 2 00:18:21.122 }, 00:18:21.122 { 00:18:21.122 "dma_device_id": "system", 00:18:21.122 "dma_device_type": 1 00:18:21.122 }, 00:18:21.122 { 00:18:21.122 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:21.122 "dma_device_type": 2 00:18:21.122 }, 00:18:21.122 { 00:18:21.122 "dma_device_id": "system", 00:18:21.122 "dma_device_type": 1 00:18:21.122 }, 00:18:21.122 { 00:18:21.122 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:21.122 "dma_device_type": 2 00:18:21.122 }, 00:18:21.122 { 00:18:21.122 "dma_device_id": "system", 00:18:21.122 "dma_device_type": 1 00:18:21.122 }, 00:18:21.122 { 00:18:21.122 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:21.122 "dma_device_type": 2 00:18:21.123 } 00:18:21.123 ], 00:18:21.123 "driver_specific": { 00:18:21.123 "raid": { 00:18:21.123 "uuid": "c387cbb9-5e4e-4f66-9816-bc894a2815da", 00:18:21.123 "strip_size_kb": 64, 00:18:21.123 "state": "online", 00:18:21.123 "raid_level": "concat", 00:18:21.123 "superblock": 
true, 00:18:21.123 "num_base_bdevs": 4, 00:18:21.123 "num_base_bdevs_discovered": 4, 00:18:21.123 "num_base_bdevs_operational": 4, 00:18:21.123 "base_bdevs_list": [ 00:18:21.123 { 00:18:21.123 "name": "pt1", 00:18:21.123 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:21.123 "is_configured": true, 00:18:21.123 "data_offset": 2048, 00:18:21.123 "data_size": 63488 00:18:21.123 }, 00:18:21.123 { 00:18:21.123 "name": "pt2", 00:18:21.123 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:21.123 "is_configured": true, 00:18:21.123 "data_offset": 2048, 00:18:21.123 "data_size": 63488 00:18:21.123 }, 00:18:21.123 { 00:18:21.123 "name": "pt3", 00:18:21.123 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:21.123 "is_configured": true, 00:18:21.123 "data_offset": 2048, 00:18:21.123 "data_size": 63488 00:18:21.123 }, 00:18:21.123 { 00:18:21.123 "name": "pt4", 00:18:21.123 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:21.123 "is_configured": true, 00:18:21.123 "data_offset": 2048, 00:18:21.123 "data_size": 63488 00:18:21.123 } 00:18:21.123 ] 00:18:21.123 } 00:18:21.123 } 00:18:21.123 }' 00:18:21.123 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:21.123 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:21.123 pt2 00:18:21.123 pt3 00:18:21.123 pt4' 00:18:21.123 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:21.123 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:21.123 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:21.383 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:21.383 "name": "pt1", 00:18:21.383 "aliases": [ 00:18:21.383 "00000000-0000-0000-0000-000000000001" 00:18:21.383 ], 00:18:21.383 "product_name": "passthru", 00:18:21.383 "block_size": 512, 00:18:21.383 "num_blocks": 65536, 00:18:21.383 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:21.383 "assigned_rate_limits": { 00:18:21.383 "rw_ios_per_sec": 0, 00:18:21.383 "rw_mbytes_per_sec": 0, 00:18:21.383 "r_mbytes_per_sec": 0, 00:18:21.383 "w_mbytes_per_sec": 0 00:18:21.383 }, 00:18:21.383 "claimed": true, 00:18:21.383 "claim_type": "exclusive_write", 00:18:21.383 "zoned": false, 00:18:21.383 "supported_io_types": { 00:18:21.383 "read": true, 00:18:21.384 "write": true, 00:18:21.384 "unmap": true, 00:18:21.384 "flush": true, 00:18:21.384 "reset": true, 00:18:21.384 "nvme_admin": false, 00:18:21.384 "nvme_io": false, 00:18:21.384 "nvme_io_md": false, 00:18:21.384 "write_zeroes": true, 00:18:21.384 "zcopy": true, 00:18:21.384 "get_zone_info": false, 00:18:21.384 "zone_management": false, 00:18:21.384 "zone_append": false, 00:18:21.384 "compare": false, 00:18:21.384 "compare_and_write": false, 00:18:21.384 "abort": true, 00:18:21.384 "seek_hole": false, 00:18:21.384 "seek_data": false, 00:18:21.384 "copy": true, 00:18:21.384 "nvme_iov_md": false 00:18:21.384 }, 00:18:21.384 "memory_domains": [ 00:18:21.384 { 00:18:21.384 "dma_device_id": "system", 00:18:21.384 "dma_device_type": 1 00:18:21.384 }, 00:18:21.384 { 00:18:21.384 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:21.384 "dma_device_type": 2 00:18:21.384 } 00:18:21.384 ], 00:18:21.384 "driver_specific": { 00:18:21.384 "passthru": 
{ 00:18:21.384 "name": "pt1", 00:18:21.384 "base_bdev_name": "malloc1" 00:18:21.384 } 00:18:21.384 } 00:18:21.384 }' 00:18:21.384 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:21.384 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:21.384 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:21.384 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:21.645 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:21.645 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:21.645 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:21.645 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:21.645 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:21.645 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:21.645 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:21.904 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:21.904 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:21.904 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:21.904 17:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:21.904 17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:21.904 "name": "pt2", 00:18:21.904 "aliases": [ 00:18:21.904 "00000000-0000-0000-0000-000000000002" 00:18:21.904 ], 00:18:21.904 "product_name": "passthru", 00:18:21.904 "block_size": 512, 00:18:21.904 "num_blocks": 65536, 00:18:21.904 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:21.904 "assigned_rate_limits": { 00:18:21.904 "rw_ios_per_sec": 0, 00:18:21.904 "rw_mbytes_per_sec": 0, 00:18:21.904 "r_mbytes_per_sec": 0, 00:18:21.904 "w_mbytes_per_sec": 0 00:18:21.904 }, 00:18:21.904 "claimed": true, 00:18:21.904 "claim_type": "exclusive_write", 00:18:21.904 "zoned": false, 00:18:21.904 "supported_io_types": { 00:18:21.904 "read": true, 00:18:21.904 "write": true, 00:18:21.904 "unmap": true, 00:18:21.904 "flush": true, 00:18:21.904 "reset": true, 00:18:21.904 "nvme_admin": false, 00:18:21.904 "nvme_io": false, 00:18:21.904 "nvme_io_md": false, 00:18:21.904 "write_zeroes": true, 00:18:21.904 "zcopy": true, 00:18:21.904 "get_zone_info": false, 00:18:21.904 "zone_management": false, 00:18:21.904 "zone_append": false, 00:18:21.904 "compare": false, 00:18:21.904 "compare_and_write": false, 00:18:21.904 "abort": true, 00:18:21.904 "seek_hole": false, 00:18:21.904 "seek_data": false, 00:18:21.904 "copy": true, 00:18:21.904 "nvme_iov_md": false 00:18:21.904 }, 00:18:21.904 "memory_domains": [ 00:18:21.904 { 00:18:21.904 "dma_device_id": "system", 00:18:21.904 "dma_device_type": 1 00:18:21.904 }, 00:18:21.904 { 00:18:21.904 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:21.904 "dma_device_type": 2 00:18:21.904 } 00:18:21.904 ], 00:18:21.904 "driver_specific": { 00:18:21.904 "passthru": { 00:18:21.904 "name": "pt2", 00:18:21.904 "base_bdev_name": "malloc2" 00:18:21.904 } 00:18:21.904 } 00:18:21.904 }' 00:18:21.904 
17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:21.905 17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:22.163 17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:22.163 17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:22.163 17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:22.163 17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:22.163 17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:22.163 17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:22.163 17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:22.163 17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:22.163 17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:22.422 17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:22.422 17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:22.422 17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:22.422 17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:22.422 17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:22.422 "name": "pt3", 00:18:22.422 "aliases": [ 00:18:22.422 "00000000-0000-0000-0000-000000000003" 00:18:22.422 ], 00:18:22.422 "product_name": "passthru", 00:18:22.422 "block_size": 512, 00:18:22.422 "num_blocks": 65536, 00:18:22.422 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:22.422 "assigned_rate_limits": { 00:18:22.422 "rw_ios_per_sec": 0, 00:18:22.422 "rw_mbytes_per_sec": 0, 00:18:22.422 "r_mbytes_per_sec": 0, 00:18:22.422 "w_mbytes_per_sec": 0 00:18:22.422 }, 00:18:22.422 "claimed": true, 00:18:22.422 "claim_type": "exclusive_write", 00:18:22.422 "zoned": false, 00:18:22.422 "supported_io_types": { 00:18:22.422 "read": true, 00:18:22.422 "write": true, 00:18:22.422 "unmap": true, 00:18:22.422 "flush": true, 00:18:22.422 "reset": true, 00:18:22.422 "nvme_admin": false, 00:18:22.422 "nvme_io": false, 00:18:22.422 "nvme_io_md": false, 00:18:22.422 "write_zeroes": true, 00:18:22.422 "zcopy": true, 00:18:22.422 "get_zone_info": false, 00:18:22.422 "zone_management": false, 00:18:22.422 "zone_append": false, 00:18:22.422 "compare": false, 00:18:22.422 "compare_and_write": false, 00:18:22.422 "abort": true, 00:18:22.422 "seek_hole": false, 00:18:22.422 "seek_data": false, 00:18:22.422 "copy": true, 00:18:22.422 "nvme_iov_md": false 00:18:22.422 }, 00:18:22.422 "memory_domains": [ 00:18:22.422 { 00:18:22.422 "dma_device_id": "system", 00:18:22.422 "dma_device_type": 1 00:18:22.422 }, 00:18:22.422 { 00:18:22.422 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:22.422 "dma_device_type": 2 00:18:22.422 } 00:18:22.422 ], 00:18:22.422 "driver_specific": { 00:18:22.422 "passthru": { 00:18:22.422 "name": "pt3", 00:18:22.422 "base_bdev_name": "malloc3" 00:18:22.422 } 00:18:22.422 } 00:18:22.422 }' 00:18:22.422 17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:22.422 17:30:33 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:22.682 17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:22.682 17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:22.682 17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:22.682 17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:22.682 17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:22.682 17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:22.682 17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:22.682 17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:22.941 17:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:22.941 17:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:22.941 17:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:22.941 17:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:18:22.941 17:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:22.941 17:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:22.941 "name": "pt4", 00:18:22.941 "aliases": [ 00:18:22.941 "00000000-0000-0000-0000-000000000004" 00:18:22.941 ], 00:18:22.941 "product_name": "passthru", 00:18:22.941 "block_size": 512, 00:18:22.941 "num_blocks": 65536, 00:18:22.941 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:22.941 "assigned_rate_limits": { 00:18:22.941 "rw_ios_per_sec": 0, 00:18:22.941 "rw_mbytes_per_sec": 0, 00:18:22.941 "r_mbytes_per_sec": 0, 00:18:22.941 "w_mbytes_per_sec": 0 00:18:22.941 }, 00:18:22.941 "claimed": true, 00:18:22.941 "claim_type": "exclusive_write", 00:18:22.941 "zoned": false, 00:18:22.941 "supported_io_types": { 00:18:22.941 "read": true, 00:18:22.941 "write": true, 00:18:22.941 "unmap": true, 00:18:22.941 "flush": true, 00:18:22.941 "reset": true, 00:18:22.941 "nvme_admin": false, 00:18:22.941 "nvme_io": false, 00:18:22.941 "nvme_io_md": false, 00:18:22.941 "write_zeroes": true, 00:18:22.941 "zcopy": true, 00:18:22.941 "get_zone_info": false, 00:18:22.941 "zone_management": false, 00:18:22.941 "zone_append": false, 00:18:22.941 "compare": false, 00:18:22.941 "compare_and_write": false, 00:18:22.941 "abort": true, 00:18:22.941 "seek_hole": false, 00:18:22.941 "seek_data": false, 00:18:22.941 "copy": true, 00:18:22.941 "nvme_iov_md": false 00:18:22.941 }, 00:18:22.941 "memory_domains": [ 00:18:22.941 { 00:18:22.941 "dma_device_id": "system", 00:18:22.941 "dma_device_type": 1 00:18:22.941 }, 00:18:22.941 { 00:18:22.941 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:22.941 "dma_device_type": 2 00:18:22.941 } 00:18:22.941 ], 00:18:22.941 "driver_specific": { 00:18:22.942 "passthru": { 00:18:22.942 "name": "pt4", 00:18:22.942 "base_bdev_name": "malloc4" 00:18:22.942 } 00:18:22.942 } 00:18:22.942 }' 00:18:22.942 17:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:23.201 17:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:23.201 17:30:34 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:23.201 17:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:23.201 17:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:23.201 17:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:23.201 17:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:23.201 17:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:23.201 17:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:23.201 17:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:23.461 17:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:23.461 17:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:23.461 17:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:23.461 17:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:18:23.461 [2024-07-15 17:30:34.720239] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:23.461 17:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=c387cbb9-5e4e-4f66-9816-bc894a2815da 00:18:23.461 17:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z c387cbb9-5e4e-4f66-9816-bc894a2815da ']' 00:18:23.461 17:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:23.720 [2024-07-15 17:30:34.912482] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:23.720 [2024-07-15 17:30:34.912499] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:23.720 [2024-07-15 17:30:34.912535] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:23.720 [2024-07-15 17:30:34.912580] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:23.720 [2024-07-15 17:30:34.912587] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x274be20 name raid_bdev1, state offline 00:18:23.720 17:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:23.720 17:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:18:23.980 17:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:18:23.980 17:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:18:23.980 17:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:23.980 17:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:18:24.241 17:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:24.241 17:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:24.241 17:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:24.241 17:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:24.500 17:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:24.500 17:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:18:24.760 17:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:18:24.760 17:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:18:25.020 17:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:18:25.020 17:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:25.020 17:30:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:18:25.020 17:30:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:25.020 17:30:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:25.020 17:30:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:25.020 17:30:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:25.020 17:30:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:25.020 17:30:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:25.020 17:30:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:25.020 17:30:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:25.020 17:30:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:18:25.020 17:30:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:25.020 [2024-07-15 17:30:36.251813] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:18:25.020 [2024-07-15 17:30:36.252874] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:18:25.020 [2024-07-15 17:30:36.252907] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 
00:18:25.020 [2024-07-15 17:30:36.252932] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:18:25.020 [2024-07-15 17:30:36.252966] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:18:25.020 [2024-07-15 17:30:36.252994] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:18:25.020 [2024-07-15 17:30:36.253008] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:18:25.020 [2024-07-15 17:30:36.253021] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:18:25.020 [2024-07-15 17:30:36.253031] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:25.020 [2024-07-15 17:30:36.253037] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2752350 name raid_bdev1, state configuring 00:18:25.020 request: 00:18:25.020 { 00:18:25.020 "name": "raid_bdev1", 00:18:25.020 "raid_level": "concat", 00:18:25.020 "base_bdevs": [ 00:18:25.020 "malloc1", 00:18:25.020 "malloc2", 00:18:25.020 "malloc3", 00:18:25.020 "malloc4" 00:18:25.020 ], 00:18:25.020 "strip_size_kb": 64, 00:18:25.020 "superblock": false, 00:18:25.020 "method": "bdev_raid_create", 00:18:25.020 "req_id": 1 00:18:25.020 } 00:18:25.020 Got JSON-RPC error response 00:18:25.020 response: 00:18:25.020 { 00:18:25.020 "code": -17, 00:18:25.020 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:18:25.020 } 00:18:25.020 17:30:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:18:25.020 17:30:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:18:25.020 17:30:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:18:25.020 17:30:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:18:25.020 17:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:25.020 17:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:18:25.279 17:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:18:25.279 17:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:18:25.279 17:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:25.538 [2024-07-15 17:30:36.636740] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:25.538 [2024-07-15 17:30:36.636762] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:25.538 [2024-07-15 17:30:36.636772] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25a1e00 00:18:25.538 [2024-07-15 17:30:36.636778] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:25.538 [2024-07-15 17:30:36.638028] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:25.538 [2024-07-15 17:30:36.638049] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:25.538 [2024-07-15 
17:30:36.638093] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:18:25.538 [2024-07-15 17:30:36.638112] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:25.538 pt1 00:18:25.538 17:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:18:25.538 17:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:25.538 17:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:25.538 17:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:25.538 17:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:25.538 17:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:25.538 17:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:25.538 17:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:25.538 17:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:25.538 17:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:25.539 17:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:25.539 17:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:25.798 17:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:25.798 "name": "raid_bdev1", 00:18:25.798 "uuid": "c387cbb9-5e4e-4f66-9816-bc894a2815da", 00:18:25.798 "strip_size_kb": 64, 00:18:25.798 "state": "configuring", 00:18:25.798 "raid_level": "concat", 00:18:25.798 "superblock": true, 00:18:25.798 "num_base_bdevs": 4, 00:18:25.798 "num_base_bdevs_discovered": 1, 00:18:25.798 "num_base_bdevs_operational": 4, 00:18:25.798 "base_bdevs_list": [ 00:18:25.798 { 00:18:25.798 "name": "pt1", 00:18:25.798 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:25.798 "is_configured": true, 00:18:25.798 "data_offset": 2048, 00:18:25.798 "data_size": 63488 00:18:25.798 }, 00:18:25.798 { 00:18:25.798 "name": null, 00:18:25.798 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:25.798 "is_configured": false, 00:18:25.798 "data_offset": 2048, 00:18:25.798 "data_size": 63488 00:18:25.798 }, 00:18:25.798 { 00:18:25.798 "name": null, 00:18:25.798 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:25.798 "is_configured": false, 00:18:25.798 "data_offset": 2048, 00:18:25.798 "data_size": 63488 00:18:25.798 }, 00:18:25.798 { 00:18:25.798 "name": null, 00:18:25.798 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:25.798 "is_configured": false, 00:18:25.798 "data_offset": 2048, 00:18:25.798 "data_size": 63488 00:18:25.798 } 00:18:25.798 ] 00:18:25.798 }' 00:18:25.798 17:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:25.798 17:30:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:26.366 17:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:18:26.366 17:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:26.366 [2024-07-15 17:30:37.567105] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:26.366 [2024-07-15 17:30:37.567131] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:26.366 [2024-07-15 17:30:37.567140] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2749c80 00:18:26.366 [2024-07-15 17:30:37.567146] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:26.366 [2024-07-15 17:30:37.567396] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:26.366 [2024-07-15 17:30:37.567406] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:26.366 [2024-07-15 17:30:37.567444] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:26.366 [2024-07-15 17:30:37.567456] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:26.366 pt2 00:18:26.366 17:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:26.626 [2024-07-15 17:30:37.759595] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:18:26.626 17:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:18:26.626 17:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:26.626 17:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:26.626 17:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:26.626 17:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:26.626 17:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:26.626 17:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:26.626 17:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:26.626 17:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:26.626 17:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:26.626 17:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:26.626 17:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:26.885 17:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:26.885 "name": "raid_bdev1", 00:18:26.885 "uuid": "c387cbb9-5e4e-4f66-9816-bc894a2815da", 00:18:26.885 "strip_size_kb": 64, 00:18:26.885 "state": "configuring", 00:18:26.885 "raid_level": "concat", 00:18:26.885 "superblock": true, 00:18:26.885 "num_base_bdevs": 4, 00:18:26.885 "num_base_bdevs_discovered": 1, 00:18:26.885 "num_base_bdevs_operational": 4, 00:18:26.885 "base_bdevs_list": [ 00:18:26.885 { 00:18:26.885 "name": "pt1", 00:18:26.885 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:26.885 "is_configured": true, 00:18:26.885 "data_offset": 2048, 00:18:26.885 "data_size": 63488 00:18:26.885 }, 00:18:26.885 
{ 00:18:26.885 "name": null, 00:18:26.885 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:26.885 "is_configured": false, 00:18:26.885 "data_offset": 2048, 00:18:26.885 "data_size": 63488 00:18:26.885 }, 00:18:26.885 { 00:18:26.885 "name": null, 00:18:26.885 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:26.885 "is_configured": false, 00:18:26.885 "data_offset": 2048, 00:18:26.885 "data_size": 63488 00:18:26.885 }, 00:18:26.885 { 00:18:26.885 "name": null, 00:18:26.885 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:26.885 "is_configured": false, 00:18:26.885 "data_offset": 2048, 00:18:26.885 "data_size": 63488 00:18:26.885 } 00:18:26.885 ] 00:18:26.885 }' 00:18:26.885 17:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:26.885 17:30:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:27.454 17:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:18:27.454 17:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:27.454 17:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:27.454 [2024-07-15 17:30:38.669896] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:27.454 [2024-07-15 17:30:38.669926] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:27.454 [2024-07-15 17:30:38.669937] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2750570 00:18:27.454 [2024-07-15 17:30:38.669944] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:27.454 [2024-07-15 17:30:38.670198] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:27.454 [2024-07-15 17:30:38.670208] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:27.454 [2024-07-15 17:30:38.670248] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:27.454 [2024-07-15 17:30:38.670260] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:27.454 pt2 00:18:27.454 17:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:27.455 17:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:27.455 17:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:27.715 [2024-07-15 17:30:38.846341] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:27.715 [2024-07-15 17:30:38.846359] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:27.715 [2024-07-15 17:30:38.846366] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2750b00 00:18:27.715 [2024-07-15 17:30:38.846372] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:27.715 [2024-07-15 17:30:38.846585] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:27.715 [2024-07-15 17:30:38.846595] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:27.715 [2024-07-15 17:30:38.846626] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:27.715 [2024-07-15 17:30:38.846636] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:27.715 pt3 00:18:27.715 17:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:27.715 17:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:27.715 17:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:27.974 [2024-07-15 17:30:39.018784] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:27.974 [2024-07-15 17:30:39.018806] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:27.974 [2024-07-15 17:30:39.018815] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x259f450 00:18:27.974 [2024-07-15 17:30:39.018821] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:27.974 [2024-07-15 17:30:39.019038] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:27.974 [2024-07-15 17:30:39.019047] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:27.974 [2024-07-15 17:30:39.019080] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:18:27.974 [2024-07-15 17:30:39.019091] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:27.974 [2024-07-15 17:30:39.019180] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x274b840 00:18:27.974 [2024-07-15 17:30:39.019186] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:27.974 [2024-07-15 17:30:39.019325] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x274fe20 00:18:27.974 [2024-07-15 17:30:39.019425] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x274b840 00:18:27.974 [2024-07-15 17:30:39.019430] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x274b840 00:18:27.974 [2024-07-15 17:30:39.019505] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:27.974 pt4 00:18:27.974 17:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:27.974 17:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:27.974 17:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:18:27.974 17:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:27.974 17:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:27.974 17:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:27.974 17:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:27.974 17:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:27.974 17:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:27.974 17:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:27.974 17:30:39 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:27.974 17:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:27.974 17:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:27.974 17:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:27.974 17:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:27.974 "name": "raid_bdev1", 00:18:27.974 "uuid": "c387cbb9-5e4e-4f66-9816-bc894a2815da", 00:18:27.974 "strip_size_kb": 64, 00:18:27.974 "state": "online", 00:18:27.974 "raid_level": "concat", 00:18:27.974 "superblock": true, 00:18:27.974 "num_base_bdevs": 4, 00:18:27.974 "num_base_bdevs_discovered": 4, 00:18:27.974 "num_base_bdevs_operational": 4, 00:18:27.974 "base_bdevs_list": [ 00:18:27.974 { 00:18:27.974 "name": "pt1", 00:18:27.974 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:27.974 "is_configured": true, 00:18:27.974 "data_offset": 2048, 00:18:27.974 "data_size": 63488 00:18:27.974 }, 00:18:27.974 { 00:18:27.974 "name": "pt2", 00:18:27.974 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:27.974 "is_configured": true, 00:18:27.974 "data_offset": 2048, 00:18:27.974 "data_size": 63488 00:18:27.974 }, 00:18:27.974 { 00:18:27.974 "name": "pt3", 00:18:27.974 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:27.974 "is_configured": true, 00:18:27.974 "data_offset": 2048, 00:18:27.974 "data_size": 63488 00:18:27.974 }, 00:18:27.974 { 00:18:27.974 "name": "pt4", 00:18:27.974 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:27.974 "is_configured": true, 00:18:27.974 "data_offset": 2048, 00:18:27.974 "data_size": 63488 00:18:27.974 } 00:18:27.974 ] 00:18:27.974 }' 00:18:27.974 17:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:27.974 17:30:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:28.541 17:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:18:28.541 17:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:28.541 17:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:28.541 17:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:28.541 17:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:28.541 17:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:28.541 17:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:28.541 17:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:28.802 [2024-07-15 17:30:39.957394] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:28.802 17:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:28.802 "name": "raid_bdev1", 00:18:28.802 "aliases": [ 00:18:28.802 "c387cbb9-5e4e-4f66-9816-bc894a2815da" 00:18:28.802 ], 00:18:28.802 "product_name": "Raid Volume", 00:18:28.802 "block_size": 512, 00:18:28.802 "num_blocks": 253952, 00:18:28.802 "uuid": 
"c387cbb9-5e4e-4f66-9816-bc894a2815da", 00:18:28.802 "assigned_rate_limits": { 00:18:28.802 "rw_ios_per_sec": 0, 00:18:28.802 "rw_mbytes_per_sec": 0, 00:18:28.802 "r_mbytes_per_sec": 0, 00:18:28.802 "w_mbytes_per_sec": 0 00:18:28.802 }, 00:18:28.802 "claimed": false, 00:18:28.802 "zoned": false, 00:18:28.802 "supported_io_types": { 00:18:28.802 "read": true, 00:18:28.802 "write": true, 00:18:28.802 "unmap": true, 00:18:28.802 "flush": true, 00:18:28.802 "reset": true, 00:18:28.802 "nvme_admin": false, 00:18:28.802 "nvme_io": false, 00:18:28.802 "nvme_io_md": false, 00:18:28.802 "write_zeroes": true, 00:18:28.802 "zcopy": false, 00:18:28.802 "get_zone_info": false, 00:18:28.802 "zone_management": false, 00:18:28.802 "zone_append": false, 00:18:28.802 "compare": false, 00:18:28.802 "compare_and_write": false, 00:18:28.802 "abort": false, 00:18:28.802 "seek_hole": false, 00:18:28.802 "seek_data": false, 00:18:28.802 "copy": false, 00:18:28.802 "nvme_iov_md": false 00:18:28.802 }, 00:18:28.802 "memory_domains": [ 00:18:28.802 { 00:18:28.802 "dma_device_id": "system", 00:18:28.802 "dma_device_type": 1 00:18:28.802 }, 00:18:28.802 { 00:18:28.802 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:28.802 "dma_device_type": 2 00:18:28.802 }, 00:18:28.802 { 00:18:28.802 "dma_device_id": "system", 00:18:28.802 "dma_device_type": 1 00:18:28.802 }, 00:18:28.802 { 00:18:28.802 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:28.802 "dma_device_type": 2 00:18:28.802 }, 00:18:28.802 { 00:18:28.802 "dma_device_id": "system", 00:18:28.802 "dma_device_type": 1 00:18:28.802 }, 00:18:28.802 { 00:18:28.802 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:28.802 "dma_device_type": 2 00:18:28.802 }, 00:18:28.802 { 00:18:28.802 "dma_device_id": "system", 00:18:28.802 "dma_device_type": 1 00:18:28.802 }, 00:18:28.802 { 00:18:28.802 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:28.802 "dma_device_type": 2 00:18:28.802 } 00:18:28.802 ], 00:18:28.802 "driver_specific": { 00:18:28.802 "raid": { 00:18:28.802 "uuid": "c387cbb9-5e4e-4f66-9816-bc894a2815da", 00:18:28.802 "strip_size_kb": 64, 00:18:28.802 "state": "online", 00:18:28.802 "raid_level": "concat", 00:18:28.802 "superblock": true, 00:18:28.802 "num_base_bdevs": 4, 00:18:28.802 "num_base_bdevs_discovered": 4, 00:18:28.802 "num_base_bdevs_operational": 4, 00:18:28.802 "base_bdevs_list": [ 00:18:28.803 { 00:18:28.803 "name": "pt1", 00:18:28.803 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:28.803 "is_configured": true, 00:18:28.803 "data_offset": 2048, 00:18:28.803 "data_size": 63488 00:18:28.803 }, 00:18:28.803 { 00:18:28.803 "name": "pt2", 00:18:28.803 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:28.803 "is_configured": true, 00:18:28.803 "data_offset": 2048, 00:18:28.803 "data_size": 63488 00:18:28.803 }, 00:18:28.803 { 00:18:28.803 "name": "pt3", 00:18:28.803 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:28.803 "is_configured": true, 00:18:28.803 "data_offset": 2048, 00:18:28.803 "data_size": 63488 00:18:28.803 }, 00:18:28.803 { 00:18:28.803 "name": "pt4", 00:18:28.803 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:28.803 "is_configured": true, 00:18:28.803 "data_offset": 2048, 00:18:28.803 "data_size": 63488 00:18:28.803 } 00:18:28.803 ] 00:18:28.803 } 00:18:28.803 } 00:18:28.803 }' 00:18:28.803 17:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:28.803 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='pt1 00:18:28.803 pt2 00:18:28.803 pt3 00:18:28.803 pt4' 00:18:28.803 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:28.803 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:28.803 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:29.063 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:29.063 "name": "pt1", 00:18:29.063 "aliases": [ 00:18:29.063 "00000000-0000-0000-0000-000000000001" 00:18:29.063 ], 00:18:29.063 "product_name": "passthru", 00:18:29.063 "block_size": 512, 00:18:29.063 "num_blocks": 65536, 00:18:29.063 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:29.063 "assigned_rate_limits": { 00:18:29.063 "rw_ios_per_sec": 0, 00:18:29.063 "rw_mbytes_per_sec": 0, 00:18:29.063 "r_mbytes_per_sec": 0, 00:18:29.063 "w_mbytes_per_sec": 0 00:18:29.063 }, 00:18:29.063 "claimed": true, 00:18:29.063 "claim_type": "exclusive_write", 00:18:29.063 "zoned": false, 00:18:29.063 "supported_io_types": { 00:18:29.063 "read": true, 00:18:29.063 "write": true, 00:18:29.063 "unmap": true, 00:18:29.063 "flush": true, 00:18:29.063 "reset": true, 00:18:29.063 "nvme_admin": false, 00:18:29.063 "nvme_io": false, 00:18:29.063 "nvme_io_md": false, 00:18:29.063 "write_zeroes": true, 00:18:29.063 "zcopy": true, 00:18:29.063 "get_zone_info": false, 00:18:29.063 "zone_management": false, 00:18:29.063 "zone_append": false, 00:18:29.063 "compare": false, 00:18:29.063 "compare_and_write": false, 00:18:29.063 "abort": true, 00:18:29.063 "seek_hole": false, 00:18:29.063 "seek_data": false, 00:18:29.063 "copy": true, 00:18:29.063 "nvme_iov_md": false 00:18:29.063 }, 00:18:29.063 "memory_domains": [ 00:18:29.063 { 00:18:29.063 "dma_device_id": "system", 00:18:29.063 "dma_device_type": 1 00:18:29.063 }, 00:18:29.063 { 00:18:29.063 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:29.063 "dma_device_type": 2 00:18:29.063 } 00:18:29.063 ], 00:18:29.063 "driver_specific": { 00:18:29.063 "passthru": { 00:18:29.063 "name": "pt1", 00:18:29.063 "base_bdev_name": "malloc1" 00:18:29.063 } 00:18:29.063 } 00:18:29.063 }' 00:18:29.063 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:29.063 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:29.063 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:29.063 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:29.063 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:29.322 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:29.322 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:29.322 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:29.322 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:29.322 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:29.322 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:29.322 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:29.322 17:30:40 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:29.322 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:29.322 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:29.582 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:29.582 "name": "pt2", 00:18:29.582 "aliases": [ 00:18:29.582 "00000000-0000-0000-0000-000000000002" 00:18:29.582 ], 00:18:29.582 "product_name": "passthru", 00:18:29.582 "block_size": 512, 00:18:29.582 "num_blocks": 65536, 00:18:29.582 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:29.582 "assigned_rate_limits": { 00:18:29.582 "rw_ios_per_sec": 0, 00:18:29.582 "rw_mbytes_per_sec": 0, 00:18:29.582 "r_mbytes_per_sec": 0, 00:18:29.582 "w_mbytes_per_sec": 0 00:18:29.582 }, 00:18:29.582 "claimed": true, 00:18:29.582 "claim_type": "exclusive_write", 00:18:29.582 "zoned": false, 00:18:29.582 "supported_io_types": { 00:18:29.582 "read": true, 00:18:29.582 "write": true, 00:18:29.582 "unmap": true, 00:18:29.582 "flush": true, 00:18:29.582 "reset": true, 00:18:29.582 "nvme_admin": false, 00:18:29.582 "nvme_io": false, 00:18:29.582 "nvme_io_md": false, 00:18:29.582 "write_zeroes": true, 00:18:29.582 "zcopy": true, 00:18:29.582 "get_zone_info": false, 00:18:29.582 "zone_management": false, 00:18:29.582 "zone_append": false, 00:18:29.582 "compare": false, 00:18:29.582 "compare_and_write": false, 00:18:29.582 "abort": true, 00:18:29.582 "seek_hole": false, 00:18:29.582 "seek_data": false, 00:18:29.582 "copy": true, 00:18:29.582 "nvme_iov_md": false 00:18:29.582 }, 00:18:29.582 "memory_domains": [ 00:18:29.582 { 00:18:29.583 "dma_device_id": "system", 00:18:29.583 "dma_device_type": 1 00:18:29.583 }, 00:18:29.583 { 00:18:29.583 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:29.583 "dma_device_type": 2 00:18:29.583 } 00:18:29.583 ], 00:18:29.583 "driver_specific": { 00:18:29.583 "passthru": { 00:18:29.583 "name": "pt2", 00:18:29.583 "base_bdev_name": "malloc2" 00:18:29.583 } 00:18:29.583 } 00:18:29.583 }' 00:18:29.583 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:29.583 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:29.583 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:29.583 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:29.583 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:29.842 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:29.842 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:29.842 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:29.842 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:29.842 17:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:29.842 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:29.842 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:29.842 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:29.842 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:29.842 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:30.103 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:30.103 "name": "pt3", 00:18:30.103 "aliases": [ 00:18:30.103 "00000000-0000-0000-0000-000000000003" 00:18:30.103 ], 00:18:30.103 "product_name": "passthru", 00:18:30.103 "block_size": 512, 00:18:30.103 "num_blocks": 65536, 00:18:30.103 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:30.103 "assigned_rate_limits": { 00:18:30.103 "rw_ios_per_sec": 0, 00:18:30.103 "rw_mbytes_per_sec": 0, 00:18:30.103 "r_mbytes_per_sec": 0, 00:18:30.103 "w_mbytes_per_sec": 0 00:18:30.103 }, 00:18:30.103 "claimed": true, 00:18:30.103 "claim_type": "exclusive_write", 00:18:30.103 "zoned": false, 00:18:30.103 "supported_io_types": { 00:18:30.103 "read": true, 00:18:30.103 "write": true, 00:18:30.103 "unmap": true, 00:18:30.103 "flush": true, 00:18:30.103 "reset": true, 00:18:30.103 "nvme_admin": false, 00:18:30.103 "nvme_io": false, 00:18:30.103 "nvme_io_md": false, 00:18:30.103 "write_zeroes": true, 00:18:30.103 "zcopy": true, 00:18:30.103 "get_zone_info": false, 00:18:30.103 "zone_management": false, 00:18:30.103 "zone_append": false, 00:18:30.103 "compare": false, 00:18:30.103 "compare_and_write": false, 00:18:30.103 "abort": true, 00:18:30.103 "seek_hole": false, 00:18:30.103 "seek_data": false, 00:18:30.103 "copy": true, 00:18:30.103 "nvme_iov_md": false 00:18:30.104 }, 00:18:30.104 "memory_domains": [ 00:18:30.104 { 00:18:30.104 "dma_device_id": "system", 00:18:30.104 "dma_device_type": 1 00:18:30.104 }, 00:18:30.104 { 00:18:30.104 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.104 "dma_device_type": 2 00:18:30.104 } 00:18:30.104 ], 00:18:30.104 "driver_specific": { 00:18:30.104 "passthru": { 00:18:30.104 "name": "pt3", 00:18:30.104 "base_bdev_name": "malloc3" 00:18:30.104 } 00:18:30.104 } 00:18:30.104 }' 00:18:30.104 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:30.104 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:30.104 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:30.104 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:30.104 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:30.363 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:30.363 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:30.363 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:30.363 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:30.363 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:30.363 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:30.363 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:30.363 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:30.363 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:18:30.363 
17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:30.624 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:30.624 "name": "pt4", 00:18:30.624 "aliases": [ 00:18:30.624 "00000000-0000-0000-0000-000000000004" 00:18:30.624 ], 00:18:30.624 "product_name": "passthru", 00:18:30.624 "block_size": 512, 00:18:30.624 "num_blocks": 65536, 00:18:30.624 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:30.624 "assigned_rate_limits": { 00:18:30.624 "rw_ios_per_sec": 0, 00:18:30.624 "rw_mbytes_per_sec": 0, 00:18:30.624 "r_mbytes_per_sec": 0, 00:18:30.624 "w_mbytes_per_sec": 0 00:18:30.624 }, 00:18:30.624 "claimed": true, 00:18:30.624 "claim_type": "exclusive_write", 00:18:30.624 "zoned": false, 00:18:30.624 "supported_io_types": { 00:18:30.624 "read": true, 00:18:30.624 "write": true, 00:18:30.624 "unmap": true, 00:18:30.624 "flush": true, 00:18:30.624 "reset": true, 00:18:30.624 "nvme_admin": false, 00:18:30.624 "nvme_io": false, 00:18:30.624 "nvme_io_md": false, 00:18:30.624 "write_zeroes": true, 00:18:30.624 "zcopy": true, 00:18:30.624 "get_zone_info": false, 00:18:30.624 "zone_management": false, 00:18:30.624 "zone_append": false, 00:18:30.624 "compare": false, 00:18:30.624 "compare_and_write": false, 00:18:30.624 "abort": true, 00:18:30.624 "seek_hole": false, 00:18:30.624 "seek_data": false, 00:18:30.624 "copy": true, 00:18:30.624 "nvme_iov_md": false 00:18:30.624 }, 00:18:30.624 "memory_domains": [ 00:18:30.624 { 00:18:30.624 "dma_device_id": "system", 00:18:30.624 "dma_device_type": 1 00:18:30.624 }, 00:18:30.624 { 00:18:30.624 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.624 "dma_device_type": 2 00:18:30.624 } 00:18:30.624 ], 00:18:30.624 "driver_specific": { 00:18:30.624 "passthru": { 00:18:30.624 "name": "pt4", 00:18:30.624 "base_bdev_name": "malloc4" 00:18:30.624 } 00:18:30.624 } 00:18:30.624 }' 00:18:30.624 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:30.624 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:30.624 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:30.624 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:30.624 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:30.885 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:30.885 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:30.885 17:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:30.885 17:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:30.885 17:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:30.885 17:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:30.885 17:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:30.885 17:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:30.885 17:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:18:31.159 [2024-07-15 17:30:42.283425] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:31.159 17:30:42 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' c387cbb9-5e4e-4f66-9816-bc894a2815da '!=' c387cbb9-5e4e-4f66-9816-bc894a2815da ']' 00:18:31.159 17:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:18:31.159 17:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:31.159 17:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:31.159 17:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2833481 00:18:31.159 17:30:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2833481 ']' 00:18:31.159 17:30:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2833481 00:18:31.159 17:30:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:18:31.159 17:30:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:31.159 17:30:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2833481 00:18:31.159 17:30:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:31.159 17:30:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:31.159 17:30:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2833481' 00:18:31.159 killing process with pid 2833481 00:18:31.159 17:30:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2833481 00:18:31.159 [2024-07-15 17:30:42.354744] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:31.159 [2024-07-15 17:30:42.354786] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:31.159 [2024-07-15 17:30:42.354833] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:31.159 [2024-07-15 17:30:42.354839] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x274b840 name raid_bdev1, state offline 00:18:31.159 17:30:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2833481 00:18:31.159 [2024-07-15 17:30:42.375193] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:31.447 17:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:18:31.447 00:18:31.447 real 0m13.667s 00:18:31.447 user 0m25.209s 00:18:31.447 sys 0m2.036s 00:18:31.447 17:30:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:31.447 17:30:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:31.447 ************************************ 00:18:31.447 END TEST raid_superblock_test 00:18:31.447 ************************************ 00:18:31.447 17:30:42 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:31.447 17:30:42 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:18:31.447 17:30:42 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:31.447 17:30:42 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:31.447 17:30:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:31.447 ************************************ 00:18:31.447 START TEST raid_read_error_test 00:18:31.447 ************************************ 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 
-- # raid_io_error_test concat 4 read 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.TPCAdjLeRU 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2836226 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2836226 /var/tmp/spdk-raid.sock 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:31.447 
17:30:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2836226 ']' 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:31.447 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:31.447 17:30:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:31.447 [2024-07-15 17:30:42.684133] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:18:31.447 [2024-07-15 17:30:42.684259] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2836226 ] 00:18:31.707 [2024-07-15 17:30:42.824722] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:31.707 [2024-07-15 17:30:42.901342] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:31.707 [2024-07-15 17:30:42.942806] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:31.707 [2024-07-15 17:30:42.942831] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:31.969 17:30:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:31.969 17:30:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:18:31.969 17:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:31.969 17:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:32.229 BaseBdev1_malloc 00:18:32.229 17:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:32.229 true 00:18:32.229 17:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:32.488 [2024-07-15 17:30:43.689318] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:32.488 [2024-07-15 17:30:43.689348] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:32.488 [2024-07-15 17:30:43.689360] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x247bb50 00:18:32.488 [2024-07-15 17:30:43.689367] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:32.488 [2024-07-15 17:30:43.690688] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:32.488 [2024-07-15 17:30:43.690707] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:32.488 BaseBdev1 00:18:32.488 17:30:43 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:32.488 17:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:32.748 BaseBdev2_malloc 00:18:32.748 17:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:33.008 true 00:18:33.008 17:30:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:33.008 [2024-07-15 17:30:44.248640] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:33.008 [2024-07-15 17:30:44.248667] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:33.008 [2024-07-15 17:30:44.248679] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x245fea0 00:18:33.008 [2024-07-15 17:30:44.248685] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:33.008 [2024-07-15 17:30:44.249876] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:33.008 [2024-07-15 17:30:44.249900] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:33.008 BaseBdev2 00:18:33.008 17:30:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:33.008 17:30:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:33.268 BaseBdev3_malloc 00:18:33.268 17:30:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:33.529 true 00:18:33.529 17:30:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:33.529 [2024-07-15 17:30:44.808009] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:33.529 [2024-07-15 17:30:44.808035] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:33.529 [2024-07-15 17:30:44.808047] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2463fb0 00:18:33.529 [2024-07-15 17:30:44.808054] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:33.529 [2024-07-15 17:30:44.809230] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:33.529 [2024-07-15 17:30:44.809249] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:33.529 BaseBdev3 00:18:33.529 17:30:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:33.529 17:30:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:18:33.789 BaseBdev4_malloc 00:18:33.789 17:30:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:18:34.049 true 00:18:34.049 17:30:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:18:34.309 [2024-07-15 17:30:45.363300] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:18:34.309 [2024-07-15 17:30:45.363325] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:34.309 [2024-07-15 17:30:45.363336] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2465980 00:18:34.309 [2024-07-15 17:30:45.363342] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:34.309 [2024-07-15 17:30:45.364515] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:34.309 [2024-07-15 17:30:45.364534] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:18:34.309 BaseBdev4 00:18:34.309 17:30:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:18:34.309 [2024-07-15 17:30:45.551811] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:34.309 [2024-07-15 17:30:45.552820] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:34.309 [2024-07-15 17:30:45.552872] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:34.309 [2024-07-15 17:30:45.552917] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:34.309 [2024-07-15 17:30:45.553093] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x24654e0 00:18:34.309 [2024-07-15 17:30:45.553100] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:34.309 [2024-07-15 17:30:45.553244] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22c7210 00:18:34.309 [2024-07-15 17:30:45.553360] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24654e0 00:18:34.309 [2024-07-15 17:30:45.553370] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x24654e0 00:18:34.309 [2024-07-15 17:30:45.553444] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:34.309 17:30:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:18:34.309 17:30:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:34.309 17:30:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:34.309 17:30:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:34.309 17:30:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:34.309 17:30:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:34.309 17:30:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:34.309 17:30:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:18:34.309 17:30:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:34.309 17:30:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:34.309 17:30:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:34.309 17:30:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:34.569 17:30:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:34.569 "name": "raid_bdev1", 00:18:34.569 "uuid": "cb350e12-2134-4cb3-ac72-1f5ca532d441", 00:18:34.569 "strip_size_kb": 64, 00:18:34.569 "state": "online", 00:18:34.569 "raid_level": "concat", 00:18:34.569 "superblock": true, 00:18:34.569 "num_base_bdevs": 4, 00:18:34.569 "num_base_bdevs_discovered": 4, 00:18:34.569 "num_base_bdevs_operational": 4, 00:18:34.569 "base_bdevs_list": [ 00:18:34.569 { 00:18:34.569 "name": "BaseBdev1", 00:18:34.569 "uuid": "49ab55d2-f9d7-568b-8bbb-55b381492146", 00:18:34.569 "is_configured": true, 00:18:34.569 "data_offset": 2048, 00:18:34.569 "data_size": 63488 00:18:34.569 }, 00:18:34.569 { 00:18:34.569 "name": "BaseBdev2", 00:18:34.569 "uuid": "ef6f5759-f27f-5f32-a823-9a5468d42ccd", 00:18:34.569 "is_configured": true, 00:18:34.569 "data_offset": 2048, 00:18:34.569 "data_size": 63488 00:18:34.569 }, 00:18:34.569 { 00:18:34.569 "name": "BaseBdev3", 00:18:34.569 "uuid": "d61e386a-115f-52ad-8cc3-c32823eac796", 00:18:34.569 "is_configured": true, 00:18:34.569 "data_offset": 2048, 00:18:34.569 "data_size": 63488 00:18:34.569 }, 00:18:34.569 { 00:18:34.569 "name": "BaseBdev4", 00:18:34.569 "uuid": "1f0a67ef-d0aa-5f34-ac88-4ae900ae7d01", 00:18:34.569 "is_configured": true, 00:18:34.569 "data_offset": 2048, 00:18:34.569 "data_size": 63488 00:18:34.569 } 00:18:34.569 ] 00:18:34.569 }' 00:18:34.569 17:30:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:34.569 17:30:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:35.141 17:30:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:18:35.141 17:30:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:35.141 [2024-07-15 17:30:46.402162] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x246b170 00:18:36.080 17:30:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:18:36.340 17:30:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:18:36.340 17:30:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:18:36.340 17:30:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:18:36.340 17:30:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:18:36.340 17:30:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:36.340 17:30:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:36.340 17:30:47 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:36.340 17:30:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:36.340 17:30:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:36.340 17:30:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:36.340 17:30:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:36.340 17:30:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:36.340 17:30:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:36.340 17:30:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:36.340 17:30:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:36.600 17:30:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:36.600 "name": "raid_bdev1", 00:18:36.600 "uuid": "cb350e12-2134-4cb3-ac72-1f5ca532d441", 00:18:36.600 "strip_size_kb": 64, 00:18:36.600 "state": "online", 00:18:36.600 "raid_level": "concat", 00:18:36.600 "superblock": true, 00:18:36.600 "num_base_bdevs": 4, 00:18:36.600 "num_base_bdevs_discovered": 4, 00:18:36.600 "num_base_bdevs_operational": 4, 00:18:36.600 "base_bdevs_list": [ 00:18:36.600 { 00:18:36.600 "name": "BaseBdev1", 00:18:36.600 "uuid": "49ab55d2-f9d7-568b-8bbb-55b381492146", 00:18:36.600 "is_configured": true, 00:18:36.600 "data_offset": 2048, 00:18:36.600 "data_size": 63488 00:18:36.600 }, 00:18:36.600 { 00:18:36.600 "name": "BaseBdev2", 00:18:36.600 "uuid": "ef6f5759-f27f-5f32-a823-9a5468d42ccd", 00:18:36.600 "is_configured": true, 00:18:36.600 "data_offset": 2048, 00:18:36.600 "data_size": 63488 00:18:36.600 }, 00:18:36.600 { 00:18:36.600 "name": "BaseBdev3", 00:18:36.600 "uuid": "d61e386a-115f-52ad-8cc3-c32823eac796", 00:18:36.600 "is_configured": true, 00:18:36.600 "data_offset": 2048, 00:18:36.600 "data_size": 63488 00:18:36.600 }, 00:18:36.600 { 00:18:36.600 "name": "BaseBdev4", 00:18:36.600 "uuid": "1f0a67ef-d0aa-5f34-ac88-4ae900ae7d01", 00:18:36.600 "is_configured": true, 00:18:36.600 "data_offset": 2048, 00:18:36.600 "data_size": 63488 00:18:36.600 } 00:18:36.600 ] 00:18:36.600 }' 00:18:36.600 17:30:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:36.600 17:30:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:37.169 17:30:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:37.169 [2024-07-15 17:30:48.436151] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:37.169 [2024-07-15 17:30:48.436177] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:37.169 [2024-07-15 17:30:48.438769] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:37.169 [2024-07-15 17:30:48.438797] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:37.169 [2024-07-15 17:30:48.438826] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:37.169 [2024-07-15 17:30:48.438831] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24654e0 name raid_bdev1, state offline 00:18:37.169 0 00:18:37.169 17:30:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2836226 00:18:37.169 17:30:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2836226 ']' 00:18:37.169 17:30:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2836226 00:18:37.169 17:30:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:18:37.169 17:30:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:37.169 17:30:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2836226 00:18:37.430 17:30:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:37.430 17:30:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:37.430 17:30:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2836226' 00:18:37.430 killing process with pid 2836226 00:18:37.430 17:30:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2836226 00:18:37.430 [2024-07-15 17:30:48.504704] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:37.430 17:30:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2836226 00:18:37.430 [2024-07-15 17:30:48.521756] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:37.430 17:30:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.TPCAdjLeRU 00:18:37.430 17:30:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:18:37.430 17:30:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:18:37.430 17:30:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:18:37.430 17:30:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:18:37.430 17:30:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:37.430 17:30:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:37.430 17:30:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:18:37.430 00:18:37.430 real 0m6.086s 00:18:37.430 user 0m10.093s 00:18:37.430 sys 0m0.991s 00:18:37.430 17:30:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:37.430 17:30:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:37.430 ************************************ 00:18:37.430 END TEST raid_read_error_test 00:18:37.430 ************************************ 00:18:37.430 17:30:48 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:37.430 17:30:48 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:18:37.430 17:30:48 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:37.430 17:30:48 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:37.430 17:30:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:37.691 ************************************ 00:18:37.691 START TEST raid_write_error_test 00:18:37.691 ************************************ 00:18:37.691 17:30:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # 
raid_io_error_test concat 4 write 00:18:37.691 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:18:37.691 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:18:37.691 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:18:37.691 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:37.691 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:37.691 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:37.691 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:37.691 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:37.691 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:37.691 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:37.691 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:37.691 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:37.691 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:37.691 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:37.691 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:18:37.691 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:37.691 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:37.691 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:37.691 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:37.691 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:37.691 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:37.691 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:37.691 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:37.691 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:37.692 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:18:37.692 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:18:37.692 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:18:37.692 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:37.692 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.zfyO82ud5L 00:18:37.692 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2837273 00:18:37.692 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2837273 /var/tmp/spdk-raid.sock 00:18:37.692 17:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f 
-L bdev_raid 00:18:37.692 17:30:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2837273 ']' 00:18:37.692 17:30:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:37.692 17:30:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:37.692 17:30:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:37.692 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:37.692 17:30:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:37.692 17:30:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:37.692 [2024-07-15 17:30:48.843021] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:18:37.692 [2024-07-15 17:30:48.843147] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2837273 ] 00:18:37.692 [2024-07-15 17:30:48.987180] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:37.952 [2024-07-15 17:30:49.062621] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:37.952 [2024-07-15 17:30:49.101960] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:37.952 [2024-07-15 17:30:49.101985] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:38.521 17:30:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:38.521 17:30:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:18:38.521 17:30:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:38.521 17:30:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:38.781 BaseBdev1_malloc 00:18:38.781 17:30:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:38.781 true 00:18:38.781 17:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:39.041 [2024-07-15 17:30:50.204486] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:39.041 [2024-07-15 17:30:50.204520] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:39.041 [2024-07-15 17:30:50.204532] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ddab50 00:18:39.041 [2024-07-15 17:30:50.204538] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:39.041 [2024-07-15 17:30:50.205909] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:39.041 [2024-07-15 17:30:50.205928] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:39.041 BaseBdev1 00:18:39.041 17:30:50 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:39.041 17:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:39.302 BaseBdev2_malloc 00:18:39.302 17:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:39.302 true 00:18:39.561 17:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:39.561 [2024-07-15 17:30:50.779843] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:39.561 [2024-07-15 17:30:50.779872] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:39.561 [2024-07-15 17:30:50.779883] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1dbeea0 00:18:39.561 [2024-07-15 17:30:50.779890] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:39.561 [2024-07-15 17:30:50.781086] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:39.561 [2024-07-15 17:30:50.781105] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:39.561 BaseBdev2 00:18:39.561 17:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:39.561 17:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:39.822 BaseBdev3_malloc 00:18:39.822 17:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:40.082 true 00:18:40.082 17:30:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:40.082 [2024-07-15 17:30:51.355193] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:40.082 [2024-07-15 17:30:51.355217] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:40.082 [2024-07-15 17:30:51.355228] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1dc2fb0 00:18:40.082 [2024-07-15 17:30:51.355234] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:40.082 [2024-07-15 17:30:51.356399] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:40.082 [2024-07-15 17:30:51.356417] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:40.082 BaseBdev3 00:18:40.343 17:30:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:40.343 17:30:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:18:40.343 BaseBdev4_malloc 00:18:40.343 17:30:51 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:18:40.603 true 00:18:40.603 17:30:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:18:40.863 [2024-07-15 17:30:51.946475] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:18:40.863 [2024-07-15 17:30:51.946503] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:40.863 [2024-07-15 17:30:51.946514] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1dc4980 00:18:40.863 [2024-07-15 17:30:51.946520] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:40.863 [2024-07-15 17:30:51.947723] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:40.863 [2024-07-15 17:30:51.947742] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:18:40.863 BaseBdev4 00:18:40.863 17:30:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:18:40.863 [2024-07-15 17:30:52.138984] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:40.863 [2024-07-15 17:30:52.139999] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:40.863 [2024-07-15 17:30:52.140051] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:40.863 [2024-07-15 17:30:52.140096] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:40.863 [2024-07-15 17:30:52.140271] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1dc44e0 00:18:40.863 [2024-07-15 17:30:52.140278] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:40.863 [2024-07-15 17:30:52.140423] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c26210 00:18:40.863 [2024-07-15 17:30:52.140540] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1dc44e0 00:18:40.863 [2024-07-15 17:30:52.140545] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1dc44e0 00:18:40.863 [2024-07-15 17:30:52.140620] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:40.863 17:30:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:18:40.863 17:30:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:40.863 17:30:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:40.863 17:30:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:40.863 17:30:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:40.863 17:30:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:40.863 17:30:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:40.863 17:30:52 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:40.863 17:30:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:40.863 17:30:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:40.863 17:30:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:40.863 17:30:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:41.122 17:30:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:41.122 "name": "raid_bdev1", 00:18:41.122 "uuid": "00196964-ca13-4466-affd-ec661a303233", 00:18:41.122 "strip_size_kb": 64, 00:18:41.122 "state": "online", 00:18:41.122 "raid_level": "concat", 00:18:41.122 "superblock": true, 00:18:41.122 "num_base_bdevs": 4, 00:18:41.122 "num_base_bdevs_discovered": 4, 00:18:41.122 "num_base_bdevs_operational": 4, 00:18:41.122 "base_bdevs_list": [ 00:18:41.122 { 00:18:41.122 "name": "BaseBdev1", 00:18:41.122 "uuid": "e2ffe5df-7a01-5215-ab81-5440b4768d1f", 00:18:41.122 "is_configured": true, 00:18:41.122 "data_offset": 2048, 00:18:41.122 "data_size": 63488 00:18:41.122 }, 00:18:41.122 { 00:18:41.122 "name": "BaseBdev2", 00:18:41.122 "uuid": "7de9687d-2aa6-5dfa-b8c1-59a74bd90513", 00:18:41.122 "is_configured": true, 00:18:41.122 "data_offset": 2048, 00:18:41.122 "data_size": 63488 00:18:41.122 }, 00:18:41.122 { 00:18:41.122 "name": "BaseBdev3", 00:18:41.122 "uuid": "0cd5f466-74f1-5b27-81d3-bc58594864ca", 00:18:41.122 "is_configured": true, 00:18:41.122 "data_offset": 2048, 00:18:41.122 "data_size": 63488 00:18:41.122 }, 00:18:41.122 { 00:18:41.122 "name": "BaseBdev4", 00:18:41.123 "uuid": "e0844c23-acf4-5791-b42a-abb597a21f2e", 00:18:41.123 "is_configured": true, 00:18:41.123 "data_offset": 2048, 00:18:41.123 "data_size": 63488 00:18:41.123 } 00:18:41.123 ] 00:18:41.123 }' 00:18:41.123 17:30:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:41.123 17:30:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:41.692 17:30:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:18:41.692 17:30:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:41.692 [2024-07-15 17:30:52.933191] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1dca170 00:18:42.631 17:30:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:18:42.891 17:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:18:42.891 17:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:18:42.892 17:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:18:42.892 17:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:18:42.892 17:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:42.892 17:30:54 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:42.892 17:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:42.892 17:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:42.892 17:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:42.892 17:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:42.892 17:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:42.892 17:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:42.892 17:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:42.892 17:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:42.892 17:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:43.152 17:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:43.152 "name": "raid_bdev1", 00:18:43.152 "uuid": "00196964-ca13-4466-affd-ec661a303233", 00:18:43.152 "strip_size_kb": 64, 00:18:43.152 "state": "online", 00:18:43.152 "raid_level": "concat", 00:18:43.152 "superblock": true, 00:18:43.152 "num_base_bdevs": 4, 00:18:43.152 "num_base_bdevs_discovered": 4, 00:18:43.152 "num_base_bdevs_operational": 4, 00:18:43.152 "base_bdevs_list": [ 00:18:43.152 { 00:18:43.152 "name": "BaseBdev1", 00:18:43.152 "uuid": "e2ffe5df-7a01-5215-ab81-5440b4768d1f", 00:18:43.152 "is_configured": true, 00:18:43.152 "data_offset": 2048, 00:18:43.152 "data_size": 63488 00:18:43.152 }, 00:18:43.152 { 00:18:43.152 "name": "BaseBdev2", 00:18:43.152 "uuid": "7de9687d-2aa6-5dfa-b8c1-59a74bd90513", 00:18:43.152 "is_configured": true, 00:18:43.152 "data_offset": 2048, 00:18:43.152 "data_size": 63488 00:18:43.152 }, 00:18:43.152 { 00:18:43.152 "name": "BaseBdev3", 00:18:43.152 "uuid": "0cd5f466-74f1-5b27-81d3-bc58594864ca", 00:18:43.152 "is_configured": true, 00:18:43.152 "data_offset": 2048, 00:18:43.152 "data_size": 63488 00:18:43.152 }, 00:18:43.152 { 00:18:43.152 "name": "BaseBdev4", 00:18:43.152 "uuid": "e0844c23-acf4-5791-b42a-abb597a21f2e", 00:18:43.152 "is_configured": true, 00:18:43.152 "data_offset": 2048, 00:18:43.152 "data_size": 63488 00:18:43.152 } 00:18:43.152 ] 00:18:43.152 }' 00:18:43.152 17:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:43.152 17:30:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:43.721 17:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:43.721 [2024-07-15 17:30:54.940644] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:43.721 [2024-07-15 17:30:54.940671] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:43.721 [2024-07-15 17:30:54.943254] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:43.721 [2024-07-15 17:30:54.943284] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:43.721 [2024-07-15 17:30:54.943316] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev 
base bdevs is 0, going to free all in destruct 00:18:43.721 [2024-07-15 17:30:54.943322] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1dc44e0 name raid_bdev1, state offline 00:18:43.721 0 00:18:43.721 17:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2837273 00:18:43.721 17:30:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2837273 ']' 00:18:43.721 17:30:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2837273 00:18:43.721 17:30:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:18:43.721 17:30:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:43.721 17:30:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2837273 00:18:43.721 17:30:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:43.721 17:30:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:43.721 17:30:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2837273' 00:18:43.721 killing process with pid 2837273 00:18:43.721 17:30:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2837273 00:18:43.721 [2024-07-15 17:30:55.008270] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:43.721 17:30:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2837273 00:18:43.982 [2024-07-15 17:30:55.025316] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:43.982 17:30:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.zfyO82ud5L 00:18:43.982 17:30:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:18:43.982 17:30:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:18:43.982 17:30:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.50 00:18:43.982 17:30:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:18:43.982 17:30:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:43.982 17:30:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:43.982 17:30:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.50 != \0\.\0\0 ]] 00:18:43.982 00:18:43.982 real 0m6.435s 00:18:43.982 user 0m10.294s 00:18:43.982 sys 0m0.959s 00:18:43.982 17:30:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:43.982 17:30:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:43.982 ************************************ 00:18:43.982 END TEST raid_write_error_test 00:18:43.982 ************************************ 00:18:43.982 17:30:55 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:43.982 17:30:55 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:18:43.982 17:30:55 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:18:43.982 17:30:55 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:43.982 17:30:55 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:43.982 17:30:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:43.982 
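The read- and write-error tests above follow the same RPC flow against the bdevperf app that the trace launches with '-r /var/tmp/spdk-raid.sock -T raid_bdev1 ... -z -f -L bdev_raid': build four malloc base bdevs, wrap each in an error bdev plus a passthru bdev, assemble a concat raid bdev with a 64k strip size and a superblock, inject a failure into EE_BaseBdev1_malloc ('read failure' in the first test, 'write failure' in the second), drive I/O through bdevperf's perform_tests RPC, and finally pull the fails-per-second column out of the bdevperf log. Below is a condensed sketch of that sequence using only the rpc.py calls visible in the trace; the shell loop, the $rpc/$sock variables and the $bdevperf_log placeholder are illustrative and are not the test script's own code.

  # Assumes the bdevperf app shown in the trace is already running and serving
  # RPCs on /var/tmp/spdk-raid.sock (it is started with -z, so it waits for the
  # perform_tests RPC before generating I/O).
  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock

  for i in 1 2 3 4; do
      # 32 MiB malloc bdev with 512-byte blocks, as in the trace
      $rpc -s $sock bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
      # error bdev wraps the malloc bdev so failures can be injected later
      $rpc -s $sock bdev_error_create BaseBdev${i}_malloc
      # passthru bdev exposes the error bdev under the name the raid bdev uses
      $rpc -s $sock bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}
  done

  # concat raid bdev, 64k strip size, with an on-disk superblock (-s)
  $rpc -s $sock bdev_raid_create -z 64 -r concat \
      -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s

  # inject the failure into the first base bdev, then kick off the I/O run
  $rpc -s $sock bdev_error_inject_error EE_BaseBdev1_malloc write failure
  /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py \
      -s $sock perform_tests

  # the pass check greps the fails-per-second column from the bdevperf log
  # ($bdevperf_log stands in for the mktemp file, e.g. /raidtest/tmp.zfyO82ud5L)
  fail_per_s=$(grep -v Job "$bdevperf_log" | grep raid_bdev1 | awk '{print $6}')
  [[ $fail_per_s != 0.00 ]] && echo "injected errors were observed by raid_bdev1"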
************************************ 00:18:43.982 START TEST raid_state_function_test 00:18:43.982 ************************************ 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 false 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2838372 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2838372' 00:18:43.982 Process raid 
pid: 2838372 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2838372 /var/tmp/spdk-raid.sock 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2838372 ']' 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:43.982 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:43.982 17:30:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:44.242 [2024-07-15 17:30:55.304538] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:18:44.242 [2024-07-15 17:30:55.304593] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:44.242 [2024-07-15 17:30:55.395842] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:44.242 [2024-07-15 17:30:55.471081] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:44.242 [2024-07-15 17:30:55.517141] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:44.242 [2024-07-15 17:30:55.517167] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:44.813 17:30:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:44.813 17:30:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:18:44.813 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:44.813 [2024-07-15 17:30:55.976927] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:44.813 [2024-07-15 17:30:55.976957] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:44.813 [2024-07-15 17:30:55.976963] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:44.813 [2024-07-15 17:30:55.976969] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:44.813 [2024-07-15 17:30:55.976974] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:44.813 [2024-07-15 17:30:55.976979] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:44.813 [2024-07-15 17:30:55.976984] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:44.813 [2024-07-15 17:30:55.976989] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: 
base bdev BaseBdev4 doesn't exist now 00:18:44.813 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:44.813 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:44.813 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:44.813 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:44.813 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:44.813 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:44.813 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:44.813 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:44.813 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:44.813 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:44.813 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.813 17:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:45.073 17:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:45.073 "name": "Existed_Raid", 00:18:45.073 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:45.073 "strip_size_kb": 0, 00:18:45.073 "state": "configuring", 00:18:45.073 "raid_level": "raid1", 00:18:45.073 "superblock": false, 00:18:45.073 "num_base_bdevs": 4, 00:18:45.073 "num_base_bdevs_discovered": 0, 00:18:45.073 "num_base_bdevs_operational": 4, 00:18:45.073 "base_bdevs_list": [ 00:18:45.073 { 00:18:45.073 "name": "BaseBdev1", 00:18:45.073 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:45.073 "is_configured": false, 00:18:45.073 "data_offset": 0, 00:18:45.073 "data_size": 0 00:18:45.074 }, 00:18:45.074 { 00:18:45.074 "name": "BaseBdev2", 00:18:45.074 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:45.074 "is_configured": false, 00:18:45.074 "data_offset": 0, 00:18:45.074 "data_size": 0 00:18:45.074 }, 00:18:45.074 { 00:18:45.074 "name": "BaseBdev3", 00:18:45.074 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:45.074 "is_configured": false, 00:18:45.074 "data_offset": 0, 00:18:45.074 "data_size": 0 00:18:45.074 }, 00:18:45.074 { 00:18:45.074 "name": "BaseBdev4", 00:18:45.074 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:45.074 "is_configured": false, 00:18:45.074 "data_offset": 0, 00:18:45.074 "data_size": 0 00:18:45.074 } 00:18:45.074 ] 00:18:45.074 }' 00:18:45.074 17:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:45.074 17:30:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:45.644 17:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:45.644 [2024-07-15 17:30:56.887113] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:45.644 [2024-07-15 17:30:56.887128] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x1f126f0 name Existed_Raid, state configuring 00:18:45.644 17:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:45.905 [2024-07-15 17:30:57.079620] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:45.905 [2024-07-15 17:30:57.079637] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:45.905 [2024-07-15 17:30:57.079642] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:45.905 [2024-07-15 17:30:57.079648] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:45.905 [2024-07-15 17:30:57.079652] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:45.905 [2024-07-15 17:30:57.079658] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:45.905 [2024-07-15 17:30:57.079662] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:45.905 [2024-07-15 17:30:57.079667] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:45.905 17:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:46.207 [2024-07-15 17:30:57.262688] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:46.207 BaseBdev1 00:18:46.207 17:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:18:46.207 17:30:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:46.207 17:30:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:46.207 17:30:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:46.207 17:30:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:46.207 17:30:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:46.208 17:30:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:46.208 17:30:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:46.487 [ 00:18:46.487 { 00:18:46.487 "name": "BaseBdev1", 00:18:46.487 "aliases": [ 00:18:46.487 "083e4811-f8be-4718-9c3d-15bbc06473cb" 00:18:46.487 ], 00:18:46.487 "product_name": "Malloc disk", 00:18:46.487 "block_size": 512, 00:18:46.487 "num_blocks": 65536, 00:18:46.487 "uuid": "083e4811-f8be-4718-9c3d-15bbc06473cb", 00:18:46.487 "assigned_rate_limits": { 00:18:46.487 "rw_ios_per_sec": 0, 00:18:46.487 "rw_mbytes_per_sec": 0, 00:18:46.487 "r_mbytes_per_sec": 0, 00:18:46.487 "w_mbytes_per_sec": 0 00:18:46.487 }, 00:18:46.487 "claimed": true, 00:18:46.487 "claim_type": "exclusive_write", 00:18:46.487 "zoned": false, 00:18:46.487 "supported_io_types": { 00:18:46.487 "read": true, 00:18:46.487 
"write": true, 00:18:46.487 "unmap": true, 00:18:46.487 "flush": true, 00:18:46.487 "reset": true, 00:18:46.487 "nvme_admin": false, 00:18:46.487 "nvme_io": false, 00:18:46.487 "nvme_io_md": false, 00:18:46.487 "write_zeroes": true, 00:18:46.487 "zcopy": true, 00:18:46.487 "get_zone_info": false, 00:18:46.487 "zone_management": false, 00:18:46.487 "zone_append": false, 00:18:46.487 "compare": false, 00:18:46.487 "compare_and_write": false, 00:18:46.487 "abort": true, 00:18:46.487 "seek_hole": false, 00:18:46.488 "seek_data": false, 00:18:46.488 "copy": true, 00:18:46.488 "nvme_iov_md": false 00:18:46.488 }, 00:18:46.488 "memory_domains": [ 00:18:46.488 { 00:18:46.488 "dma_device_id": "system", 00:18:46.488 "dma_device_type": 1 00:18:46.488 }, 00:18:46.488 { 00:18:46.488 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:46.488 "dma_device_type": 2 00:18:46.488 } 00:18:46.488 ], 00:18:46.488 "driver_specific": {} 00:18:46.488 } 00:18:46.488 ] 00:18:46.488 17:30:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:46.488 17:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:46.488 17:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:46.488 17:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:46.488 17:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:46.488 17:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:46.488 17:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:46.488 17:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:46.488 17:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:46.488 17:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:46.488 17:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:46.488 17:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:46.488 17:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:46.748 17:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:46.748 "name": "Existed_Raid", 00:18:46.748 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:46.748 "strip_size_kb": 0, 00:18:46.748 "state": "configuring", 00:18:46.748 "raid_level": "raid1", 00:18:46.748 "superblock": false, 00:18:46.748 "num_base_bdevs": 4, 00:18:46.748 "num_base_bdevs_discovered": 1, 00:18:46.748 "num_base_bdevs_operational": 4, 00:18:46.748 "base_bdevs_list": [ 00:18:46.748 { 00:18:46.748 "name": "BaseBdev1", 00:18:46.748 "uuid": "083e4811-f8be-4718-9c3d-15bbc06473cb", 00:18:46.748 "is_configured": true, 00:18:46.748 "data_offset": 0, 00:18:46.748 "data_size": 65536 00:18:46.748 }, 00:18:46.748 { 00:18:46.748 "name": "BaseBdev2", 00:18:46.748 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:46.748 "is_configured": false, 00:18:46.748 "data_offset": 0, 00:18:46.748 "data_size": 0 00:18:46.748 }, 00:18:46.748 { 00:18:46.748 "name": "BaseBdev3", 
00:18:46.748 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:46.748 "is_configured": false, 00:18:46.748 "data_offset": 0, 00:18:46.748 "data_size": 0 00:18:46.748 }, 00:18:46.748 { 00:18:46.748 "name": "BaseBdev4", 00:18:46.748 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:46.748 "is_configured": false, 00:18:46.748 "data_offset": 0, 00:18:46.748 "data_size": 0 00:18:46.748 } 00:18:46.748 ] 00:18:46.748 }' 00:18:46.748 17:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:46.748 17:30:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:47.318 17:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:47.318 [2024-07-15 17:30:58.582011] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:47.318 [2024-07-15 17:30:58.582037] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f11f60 name Existed_Raid, state configuring 00:18:47.318 17:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:47.578 [2024-07-15 17:30:58.770511] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:47.578 [2024-07-15 17:30:58.771606] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:47.578 [2024-07-15 17:30:58.771628] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:47.578 [2024-07-15 17:30:58.771634] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:47.578 [2024-07-15 17:30:58.771640] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:47.578 [2024-07-15 17:30:58.771644] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:47.578 [2024-07-15 17:30:58.771654] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:47.578 17:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:18:47.578 17:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:47.578 17:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:47.578 17:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:47.578 17:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:47.578 17:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:47.578 17:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:47.578 17:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:47.578 17:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:47.578 17:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:47.578 17:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:18:47.578 17:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:47.578 17:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:47.578 17:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:47.839 17:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:47.839 "name": "Existed_Raid", 00:18:47.839 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:47.839 "strip_size_kb": 0, 00:18:47.839 "state": "configuring", 00:18:47.839 "raid_level": "raid1", 00:18:47.839 "superblock": false, 00:18:47.839 "num_base_bdevs": 4, 00:18:47.839 "num_base_bdevs_discovered": 1, 00:18:47.839 "num_base_bdevs_operational": 4, 00:18:47.839 "base_bdevs_list": [ 00:18:47.839 { 00:18:47.839 "name": "BaseBdev1", 00:18:47.839 "uuid": "083e4811-f8be-4718-9c3d-15bbc06473cb", 00:18:47.839 "is_configured": true, 00:18:47.839 "data_offset": 0, 00:18:47.839 "data_size": 65536 00:18:47.839 }, 00:18:47.839 { 00:18:47.839 "name": "BaseBdev2", 00:18:47.839 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:47.839 "is_configured": false, 00:18:47.839 "data_offset": 0, 00:18:47.839 "data_size": 0 00:18:47.839 }, 00:18:47.839 { 00:18:47.839 "name": "BaseBdev3", 00:18:47.839 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:47.839 "is_configured": false, 00:18:47.839 "data_offset": 0, 00:18:47.839 "data_size": 0 00:18:47.839 }, 00:18:47.839 { 00:18:47.839 "name": "BaseBdev4", 00:18:47.839 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:47.839 "is_configured": false, 00:18:47.839 "data_offset": 0, 00:18:47.839 "data_size": 0 00:18:47.839 } 00:18:47.839 ] 00:18:47.839 }' 00:18:47.839 17:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:47.839 17:30:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:48.408 17:30:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:48.667 [2024-07-15 17:30:59.718012] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:48.667 BaseBdev2 00:18:48.667 17:30:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:18:48.667 17:30:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:48.667 17:30:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:48.667 17:30:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:48.667 17:30:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:48.667 17:30:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:48.667 17:30:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:48.667 17:30:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:48.926 [ 00:18:48.926 { 
00:18:48.926 "name": "BaseBdev2", 00:18:48.926 "aliases": [ 00:18:48.926 "6b51255e-7efc-4917-9962-e8a1675e75bb" 00:18:48.926 ], 00:18:48.926 "product_name": "Malloc disk", 00:18:48.926 "block_size": 512, 00:18:48.926 "num_blocks": 65536, 00:18:48.926 "uuid": "6b51255e-7efc-4917-9962-e8a1675e75bb", 00:18:48.926 "assigned_rate_limits": { 00:18:48.926 "rw_ios_per_sec": 0, 00:18:48.926 "rw_mbytes_per_sec": 0, 00:18:48.926 "r_mbytes_per_sec": 0, 00:18:48.926 "w_mbytes_per_sec": 0 00:18:48.926 }, 00:18:48.926 "claimed": true, 00:18:48.926 "claim_type": "exclusive_write", 00:18:48.926 "zoned": false, 00:18:48.926 "supported_io_types": { 00:18:48.926 "read": true, 00:18:48.926 "write": true, 00:18:48.926 "unmap": true, 00:18:48.926 "flush": true, 00:18:48.926 "reset": true, 00:18:48.926 "nvme_admin": false, 00:18:48.926 "nvme_io": false, 00:18:48.926 "nvme_io_md": false, 00:18:48.926 "write_zeroes": true, 00:18:48.926 "zcopy": true, 00:18:48.926 "get_zone_info": false, 00:18:48.926 "zone_management": false, 00:18:48.926 "zone_append": false, 00:18:48.926 "compare": false, 00:18:48.926 "compare_and_write": false, 00:18:48.926 "abort": true, 00:18:48.926 "seek_hole": false, 00:18:48.926 "seek_data": false, 00:18:48.926 "copy": true, 00:18:48.926 "nvme_iov_md": false 00:18:48.926 }, 00:18:48.926 "memory_domains": [ 00:18:48.926 { 00:18:48.926 "dma_device_id": "system", 00:18:48.926 "dma_device_type": 1 00:18:48.926 }, 00:18:48.926 { 00:18:48.926 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:48.926 "dma_device_type": 2 00:18:48.926 } 00:18:48.926 ], 00:18:48.926 "driver_specific": {} 00:18:48.926 } 00:18:48.926 ] 00:18:48.926 17:31:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:48.926 17:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:48.926 17:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:48.926 17:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:48.926 17:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:48.926 17:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:48.926 17:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:48.926 17:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:48.926 17:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:48.926 17:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:48.926 17:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:48.926 17:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:48.926 17:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:48.926 17:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:48.926 17:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:49.186 17:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:49.186 
"name": "Existed_Raid", 00:18:49.186 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:49.186 "strip_size_kb": 0, 00:18:49.186 "state": "configuring", 00:18:49.186 "raid_level": "raid1", 00:18:49.186 "superblock": false, 00:18:49.186 "num_base_bdevs": 4, 00:18:49.186 "num_base_bdevs_discovered": 2, 00:18:49.186 "num_base_bdevs_operational": 4, 00:18:49.186 "base_bdevs_list": [ 00:18:49.186 { 00:18:49.186 "name": "BaseBdev1", 00:18:49.186 "uuid": "083e4811-f8be-4718-9c3d-15bbc06473cb", 00:18:49.186 "is_configured": true, 00:18:49.186 "data_offset": 0, 00:18:49.186 "data_size": 65536 00:18:49.186 }, 00:18:49.186 { 00:18:49.186 "name": "BaseBdev2", 00:18:49.186 "uuid": "6b51255e-7efc-4917-9962-e8a1675e75bb", 00:18:49.186 "is_configured": true, 00:18:49.186 "data_offset": 0, 00:18:49.186 "data_size": 65536 00:18:49.186 }, 00:18:49.186 { 00:18:49.186 "name": "BaseBdev3", 00:18:49.186 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:49.186 "is_configured": false, 00:18:49.186 "data_offset": 0, 00:18:49.186 "data_size": 0 00:18:49.186 }, 00:18:49.186 { 00:18:49.186 "name": "BaseBdev4", 00:18:49.186 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:49.186 "is_configured": false, 00:18:49.186 "data_offset": 0, 00:18:49.186 "data_size": 0 00:18:49.186 } 00:18:49.186 ] 00:18:49.186 }' 00:18:49.186 17:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:49.186 17:31:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:49.754 17:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:49.754 [2024-07-15 17:31:01.014164] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:49.754 BaseBdev3 00:18:49.754 17:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:18:49.754 17:31:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:49.754 17:31:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:49.754 17:31:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:49.754 17:31:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:49.754 17:31:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:49.754 17:31:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:50.013 17:31:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:50.272 [ 00:18:50.272 { 00:18:50.272 "name": "BaseBdev3", 00:18:50.272 "aliases": [ 00:18:50.272 "10c5e601-ceab-4925-b845-4dcc3645869e" 00:18:50.272 ], 00:18:50.272 "product_name": "Malloc disk", 00:18:50.272 "block_size": 512, 00:18:50.272 "num_blocks": 65536, 00:18:50.272 "uuid": "10c5e601-ceab-4925-b845-4dcc3645869e", 00:18:50.272 "assigned_rate_limits": { 00:18:50.272 "rw_ios_per_sec": 0, 00:18:50.272 "rw_mbytes_per_sec": 0, 00:18:50.272 "r_mbytes_per_sec": 0, 00:18:50.272 "w_mbytes_per_sec": 0 00:18:50.272 }, 00:18:50.272 "claimed": true, 00:18:50.272 "claim_type": 
"exclusive_write", 00:18:50.272 "zoned": false, 00:18:50.272 "supported_io_types": { 00:18:50.272 "read": true, 00:18:50.272 "write": true, 00:18:50.272 "unmap": true, 00:18:50.272 "flush": true, 00:18:50.272 "reset": true, 00:18:50.272 "nvme_admin": false, 00:18:50.272 "nvme_io": false, 00:18:50.272 "nvme_io_md": false, 00:18:50.272 "write_zeroes": true, 00:18:50.272 "zcopy": true, 00:18:50.272 "get_zone_info": false, 00:18:50.272 "zone_management": false, 00:18:50.272 "zone_append": false, 00:18:50.272 "compare": false, 00:18:50.272 "compare_and_write": false, 00:18:50.272 "abort": true, 00:18:50.272 "seek_hole": false, 00:18:50.272 "seek_data": false, 00:18:50.272 "copy": true, 00:18:50.272 "nvme_iov_md": false 00:18:50.272 }, 00:18:50.272 "memory_domains": [ 00:18:50.272 { 00:18:50.272 "dma_device_id": "system", 00:18:50.272 "dma_device_type": 1 00:18:50.272 }, 00:18:50.272 { 00:18:50.272 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:50.272 "dma_device_type": 2 00:18:50.272 } 00:18:50.272 ], 00:18:50.272 "driver_specific": {} 00:18:50.272 } 00:18:50.272 ] 00:18:50.272 17:31:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:50.272 17:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:50.272 17:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:50.272 17:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:50.272 17:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:50.272 17:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:50.272 17:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:50.272 17:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:50.272 17:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:50.272 17:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:50.272 17:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:50.272 17:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:50.272 17:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:50.272 17:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.272 17:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:50.531 17:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:50.531 "name": "Existed_Raid", 00:18:50.531 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:50.531 "strip_size_kb": 0, 00:18:50.531 "state": "configuring", 00:18:50.531 "raid_level": "raid1", 00:18:50.531 "superblock": false, 00:18:50.531 "num_base_bdevs": 4, 00:18:50.531 "num_base_bdevs_discovered": 3, 00:18:50.531 "num_base_bdevs_operational": 4, 00:18:50.531 "base_bdevs_list": [ 00:18:50.531 { 00:18:50.531 "name": "BaseBdev1", 00:18:50.531 "uuid": "083e4811-f8be-4718-9c3d-15bbc06473cb", 00:18:50.531 "is_configured": true, 00:18:50.531 
"data_offset": 0, 00:18:50.531 "data_size": 65536 00:18:50.531 }, 00:18:50.531 { 00:18:50.531 "name": "BaseBdev2", 00:18:50.531 "uuid": "6b51255e-7efc-4917-9962-e8a1675e75bb", 00:18:50.531 "is_configured": true, 00:18:50.531 "data_offset": 0, 00:18:50.532 "data_size": 65536 00:18:50.532 }, 00:18:50.532 { 00:18:50.532 "name": "BaseBdev3", 00:18:50.532 "uuid": "10c5e601-ceab-4925-b845-4dcc3645869e", 00:18:50.532 "is_configured": true, 00:18:50.532 "data_offset": 0, 00:18:50.532 "data_size": 65536 00:18:50.532 }, 00:18:50.532 { 00:18:50.532 "name": "BaseBdev4", 00:18:50.532 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:50.532 "is_configured": false, 00:18:50.532 "data_offset": 0, 00:18:50.532 "data_size": 0 00:18:50.532 } 00:18:50.532 ] 00:18:50.532 }' 00:18:50.532 17:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:50.532 17:31:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:51.100 17:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:51.100 [2024-07-15 17:31:02.306472] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:51.100 [2024-07-15 17:31:02.306497] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f12fc0 00:18:51.100 [2024-07-15 17:31:02.306502] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:18:51.100 [2024-07-15 17:31:02.306663] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f12c00 00:18:51.100 [2024-07-15 17:31:02.306771] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f12fc0 00:18:51.100 [2024-07-15 17:31:02.306777] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f12fc0 00:18:51.100 [2024-07-15 17:31:02.306894] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:51.100 BaseBdev4 00:18:51.100 17:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:18:51.100 17:31:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:51.100 17:31:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:51.100 17:31:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:51.100 17:31:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:51.100 17:31:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:51.100 17:31:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:51.360 17:31:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:51.621 [ 00:18:51.621 { 00:18:51.621 "name": "BaseBdev4", 00:18:51.621 "aliases": [ 00:18:51.621 "7ddafd61-eb66-4a3e-98fb-a350100ef1e0" 00:18:51.621 ], 00:18:51.621 "product_name": "Malloc disk", 00:18:51.621 "block_size": 512, 00:18:51.621 "num_blocks": 65536, 00:18:51.621 "uuid": "7ddafd61-eb66-4a3e-98fb-a350100ef1e0", 00:18:51.621 "assigned_rate_limits": 
{ 00:18:51.621 "rw_ios_per_sec": 0, 00:18:51.621 "rw_mbytes_per_sec": 0, 00:18:51.621 "r_mbytes_per_sec": 0, 00:18:51.621 "w_mbytes_per_sec": 0 00:18:51.621 }, 00:18:51.621 "claimed": true, 00:18:51.621 "claim_type": "exclusive_write", 00:18:51.621 "zoned": false, 00:18:51.621 "supported_io_types": { 00:18:51.621 "read": true, 00:18:51.621 "write": true, 00:18:51.621 "unmap": true, 00:18:51.621 "flush": true, 00:18:51.621 "reset": true, 00:18:51.621 "nvme_admin": false, 00:18:51.621 "nvme_io": false, 00:18:51.621 "nvme_io_md": false, 00:18:51.621 "write_zeroes": true, 00:18:51.621 "zcopy": true, 00:18:51.621 "get_zone_info": false, 00:18:51.621 "zone_management": false, 00:18:51.621 "zone_append": false, 00:18:51.621 "compare": false, 00:18:51.621 "compare_and_write": false, 00:18:51.621 "abort": true, 00:18:51.621 "seek_hole": false, 00:18:51.621 "seek_data": false, 00:18:51.621 "copy": true, 00:18:51.621 "nvme_iov_md": false 00:18:51.621 }, 00:18:51.621 "memory_domains": [ 00:18:51.621 { 00:18:51.621 "dma_device_id": "system", 00:18:51.621 "dma_device_type": 1 00:18:51.621 }, 00:18:51.621 { 00:18:51.621 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:51.621 "dma_device_type": 2 00:18:51.621 } 00:18:51.621 ], 00:18:51.621 "driver_specific": {} 00:18:51.621 } 00:18:51.621 ] 00:18:51.621 17:31:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:51.621 17:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:51.621 17:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:51.621 17:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:18:51.621 17:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:51.621 17:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:51.621 17:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:51.621 17:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:51.621 17:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:51.621 17:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:51.621 17:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:51.621 17:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:51.621 17:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:51.621 17:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:51.621 17:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:51.621 17:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:51.621 "name": "Existed_Raid", 00:18:51.621 "uuid": "987a5b8b-6d10-461f-ba57-a9f90c70d095", 00:18:51.621 "strip_size_kb": 0, 00:18:51.621 "state": "online", 00:18:51.621 "raid_level": "raid1", 00:18:51.621 "superblock": false, 00:18:51.621 "num_base_bdevs": 4, 00:18:51.621 "num_base_bdevs_discovered": 4, 00:18:51.621 "num_base_bdevs_operational": 4, 
00:18:51.621 "base_bdevs_list": [ 00:18:51.621 { 00:18:51.621 "name": "BaseBdev1", 00:18:51.621 "uuid": "083e4811-f8be-4718-9c3d-15bbc06473cb", 00:18:51.621 "is_configured": true, 00:18:51.622 "data_offset": 0, 00:18:51.622 "data_size": 65536 00:18:51.622 }, 00:18:51.622 { 00:18:51.622 "name": "BaseBdev2", 00:18:51.622 "uuid": "6b51255e-7efc-4917-9962-e8a1675e75bb", 00:18:51.622 "is_configured": true, 00:18:51.622 "data_offset": 0, 00:18:51.622 "data_size": 65536 00:18:51.622 }, 00:18:51.622 { 00:18:51.622 "name": "BaseBdev3", 00:18:51.622 "uuid": "10c5e601-ceab-4925-b845-4dcc3645869e", 00:18:51.622 "is_configured": true, 00:18:51.622 "data_offset": 0, 00:18:51.622 "data_size": 65536 00:18:51.622 }, 00:18:51.622 { 00:18:51.622 "name": "BaseBdev4", 00:18:51.622 "uuid": "7ddafd61-eb66-4a3e-98fb-a350100ef1e0", 00:18:51.622 "is_configured": true, 00:18:51.622 "data_offset": 0, 00:18:51.622 "data_size": 65536 00:18:51.622 } 00:18:51.622 ] 00:18:51.622 }' 00:18:51.622 17:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:51.622 17:31:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:52.194 17:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:18:52.194 17:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:52.194 17:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:52.194 17:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:52.194 17:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:52.194 17:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:52.194 17:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:52.194 17:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:52.455 [2024-07-15 17:31:03.626059] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:52.455 17:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:52.455 "name": "Existed_Raid", 00:18:52.455 "aliases": [ 00:18:52.455 "987a5b8b-6d10-461f-ba57-a9f90c70d095" 00:18:52.455 ], 00:18:52.455 "product_name": "Raid Volume", 00:18:52.455 "block_size": 512, 00:18:52.455 "num_blocks": 65536, 00:18:52.455 "uuid": "987a5b8b-6d10-461f-ba57-a9f90c70d095", 00:18:52.455 "assigned_rate_limits": { 00:18:52.455 "rw_ios_per_sec": 0, 00:18:52.455 "rw_mbytes_per_sec": 0, 00:18:52.455 "r_mbytes_per_sec": 0, 00:18:52.455 "w_mbytes_per_sec": 0 00:18:52.455 }, 00:18:52.455 "claimed": false, 00:18:52.455 "zoned": false, 00:18:52.455 "supported_io_types": { 00:18:52.455 "read": true, 00:18:52.455 "write": true, 00:18:52.455 "unmap": false, 00:18:52.455 "flush": false, 00:18:52.455 "reset": true, 00:18:52.455 "nvme_admin": false, 00:18:52.455 "nvme_io": false, 00:18:52.455 "nvme_io_md": false, 00:18:52.455 "write_zeroes": true, 00:18:52.455 "zcopy": false, 00:18:52.455 "get_zone_info": false, 00:18:52.455 "zone_management": false, 00:18:52.455 "zone_append": false, 00:18:52.455 "compare": false, 00:18:52.455 "compare_and_write": false, 00:18:52.455 "abort": false, 00:18:52.455 "seek_hole": false, 00:18:52.455 "seek_data": false, 
00:18:52.455 "copy": false, 00:18:52.455 "nvme_iov_md": false 00:18:52.455 }, 00:18:52.455 "memory_domains": [ 00:18:52.455 { 00:18:52.455 "dma_device_id": "system", 00:18:52.455 "dma_device_type": 1 00:18:52.455 }, 00:18:52.455 { 00:18:52.455 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.455 "dma_device_type": 2 00:18:52.455 }, 00:18:52.455 { 00:18:52.455 "dma_device_id": "system", 00:18:52.455 "dma_device_type": 1 00:18:52.455 }, 00:18:52.455 { 00:18:52.455 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.455 "dma_device_type": 2 00:18:52.455 }, 00:18:52.455 { 00:18:52.455 "dma_device_id": "system", 00:18:52.455 "dma_device_type": 1 00:18:52.455 }, 00:18:52.455 { 00:18:52.455 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.455 "dma_device_type": 2 00:18:52.455 }, 00:18:52.455 { 00:18:52.455 "dma_device_id": "system", 00:18:52.455 "dma_device_type": 1 00:18:52.455 }, 00:18:52.455 { 00:18:52.455 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.455 "dma_device_type": 2 00:18:52.455 } 00:18:52.455 ], 00:18:52.455 "driver_specific": { 00:18:52.455 "raid": { 00:18:52.455 "uuid": "987a5b8b-6d10-461f-ba57-a9f90c70d095", 00:18:52.455 "strip_size_kb": 0, 00:18:52.455 "state": "online", 00:18:52.455 "raid_level": "raid1", 00:18:52.455 "superblock": false, 00:18:52.455 "num_base_bdevs": 4, 00:18:52.455 "num_base_bdevs_discovered": 4, 00:18:52.455 "num_base_bdevs_operational": 4, 00:18:52.455 "base_bdevs_list": [ 00:18:52.455 { 00:18:52.455 "name": "BaseBdev1", 00:18:52.455 "uuid": "083e4811-f8be-4718-9c3d-15bbc06473cb", 00:18:52.455 "is_configured": true, 00:18:52.455 "data_offset": 0, 00:18:52.455 "data_size": 65536 00:18:52.455 }, 00:18:52.455 { 00:18:52.455 "name": "BaseBdev2", 00:18:52.455 "uuid": "6b51255e-7efc-4917-9962-e8a1675e75bb", 00:18:52.455 "is_configured": true, 00:18:52.455 "data_offset": 0, 00:18:52.455 "data_size": 65536 00:18:52.455 }, 00:18:52.455 { 00:18:52.455 "name": "BaseBdev3", 00:18:52.455 "uuid": "10c5e601-ceab-4925-b845-4dcc3645869e", 00:18:52.455 "is_configured": true, 00:18:52.455 "data_offset": 0, 00:18:52.456 "data_size": 65536 00:18:52.456 }, 00:18:52.456 { 00:18:52.456 "name": "BaseBdev4", 00:18:52.456 "uuid": "7ddafd61-eb66-4a3e-98fb-a350100ef1e0", 00:18:52.456 "is_configured": true, 00:18:52.456 "data_offset": 0, 00:18:52.456 "data_size": 65536 00:18:52.456 } 00:18:52.456 ] 00:18:52.456 } 00:18:52.456 } 00:18:52.456 }' 00:18:52.456 17:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:52.456 17:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:52.456 BaseBdev2 00:18:52.456 BaseBdev3 00:18:52.456 BaseBdev4' 00:18:52.456 17:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:52.456 17:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:52.456 17:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:52.716 17:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:52.716 "name": "BaseBdev1", 00:18:52.716 "aliases": [ 00:18:52.716 "083e4811-f8be-4718-9c3d-15bbc06473cb" 00:18:52.716 ], 00:18:52.716 "product_name": "Malloc disk", 00:18:52.716 "block_size": 512, 00:18:52.716 "num_blocks": 65536, 00:18:52.716 "uuid": 
"083e4811-f8be-4718-9c3d-15bbc06473cb", 00:18:52.716 "assigned_rate_limits": { 00:18:52.716 "rw_ios_per_sec": 0, 00:18:52.716 "rw_mbytes_per_sec": 0, 00:18:52.716 "r_mbytes_per_sec": 0, 00:18:52.716 "w_mbytes_per_sec": 0 00:18:52.716 }, 00:18:52.716 "claimed": true, 00:18:52.716 "claim_type": "exclusive_write", 00:18:52.716 "zoned": false, 00:18:52.716 "supported_io_types": { 00:18:52.716 "read": true, 00:18:52.716 "write": true, 00:18:52.716 "unmap": true, 00:18:52.716 "flush": true, 00:18:52.716 "reset": true, 00:18:52.716 "nvme_admin": false, 00:18:52.716 "nvme_io": false, 00:18:52.716 "nvme_io_md": false, 00:18:52.716 "write_zeroes": true, 00:18:52.716 "zcopy": true, 00:18:52.716 "get_zone_info": false, 00:18:52.716 "zone_management": false, 00:18:52.716 "zone_append": false, 00:18:52.716 "compare": false, 00:18:52.716 "compare_and_write": false, 00:18:52.716 "abort": true, 00:18:52.716 "seek_hole": false, 00:18:52.716 "seek_data": false, 00:18:52.716 "copy": true, 00:18:52.716 "nvme_iov_md": false 00:18:52.716 }, 00:18:52.716 "memory_domains": [ 00:18:52.716 { 00:18:52.716 "dma_device_id": "system", 00:18:52.716 "dma_device_type": 1 00:18:52.716 }, 00:18:52.716 { 00:18:52.716 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.716 "dma_device_type": 2 00:18:52.716 } 00:18:52.716 ], 00:18:52.716 "driver_specific": {} 00:18:52.716 }' 00:18:52.716 17:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:52.716 17:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:52.716 17:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:52.716 17:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:52.976 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:52.976 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:52.976 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:52.976 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:52.976 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:52.976 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:52.976 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:52.976 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:52.976 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:52.976 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:52.976 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:53.244 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:53.244 "name": "BaseBdev2", 00:18:53.244 "aliases": [ 00:18:53.244 "6b51255e-7efc-4917-9962-e8a1675e75bb" 00:18:53.244 ], 00:18:53.244 "product_name": "Malloc disk", 00:18:53.244 "block_size": 512, 00:18:53.244 "num_blocks": 65536, 00:18:53.244 "uuid": "6b51255e-7efc-4917-9962-e8a1675e75bb", 00:18:53.244 "assigned_rate_limits": { 00:18:53.244 "rw_ios_per_sec": 0, 00:18:53.244 "rw_mbytes_per_sec": 0, 00:18:53.244 
"r_mbytes_per_sec": 0, 00:18:53.244 "w_mbytes_per_sec": 0 00:18:53.244 }, 00:18:53.244 "claimed": true, 00:18:53.244 "claim_type": "exclusive_write", 00:18:53.244 "zoned": false, 00:18:53.244 "supported_io_types": { 00:18:53.244 "read": true, 00:18:53.244 "write": true, 00:18:53.244 "unmap": true, 00:18:53.244 "flush": true, 00:18:53.244 "reset": true, 00:18:53.244 "nvme_admin": false, 00:18:53.244 "nvme_io": false, 00:18:53.244 "nvme_io_md": false, 00:18:53.244 "write_zeroes": true, 00:18:53.244 "zcopy": true, 00:18:53.244 "get_zone_info": false, 00:18:53.244 "zone_management": false, 00:18:53.244 "zone_append": false, 00:18:53.244 "compare": false, 00:18:53.244 "compare_and_write": false, 00:18:53.244 "abort": true, 00:18:53.244 "seek_hole": false, 00:18:53.244 "seek_data": false, 00:18:53.244 "copy": true, 00:18:53.244 "nvme_iov_md": false 00:18:53.244 }, 00:18:53.244 "memory_domains": [ 00:18:53.244 { 00:18:53.244 "dma_device_id": "system", 00:18:53.244 "dma_device_type": 1 00:18:53.244 }, 00:18:53.244 { 00:18:53.244 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.244 "dma_device_type": 2 00:18:53.244 } 00:18:53.244 ], 00:18:53.244 "driver_specific": {} 00:18:53.244 }' 00:18:53.244 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:53.244 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:53.244 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:53.244 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:53.505 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:53.505 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:53.505 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:53.505 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:53.505 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:53.505 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:53.505 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:53.505 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:53.505 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:53.505 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:53.505 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:53.765 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:53.765 "name": "BaseBdev3", 00:18:53.765 "aliases": [ 00:18:53.765 "10c5e601-ceab-4925-b845-4dcc3645869e" 00:18:53.765 ], 00:18:53.765 "product_name": "Malloc disk", 00:18:53.765 "block_size": 512, 00:18:53.765 "num_blocks": 65536, 00:18:53.765 "uuid": "10c5e601-ceab-4925-b845-4dcc3645869e", 00:18:53.765 "assigned_rate_limits": { 00:18:53.765 "rw_ios_per_sec": 0, 00:18:53.765 "rw_mbytes_per_sec": 0, 00:18:53.765 "r_mbytes_per_sec": 0, 00:18:53.765 "w_mbytes_per_sec": 0 00:18:53.765 }, 00:18:53.765 "claimed": true, 00:18:53.765 "claim_type": "exclusive_write", 00:18:53.765 "zoned": false, 
00:18:53.765 "supported_io_types": { 00:18:53.765 "read": true, 00:18:53.765 "write": true, 00:18:53.765 "unmap": true, 00:18:53.765 "flush": true, 00:18:53.765 "reset": true, 00:18:53.765 "nvme_admin": false, 00:18:53.765 "nvme_io": false, 00:18:53.765 "nvme_io_md": false, 00:18:53.765 "write_zeroes": true, 00:18:53.765 "zcopy": true, 00:18:53.765 "get_zone_info": false, 00:18:53.765 "zone_management": false, 00:18:53.765 "zone_append": false, 00:18:53.765 "compare": false, 00:18:53.765 "compare_and_write": false, 00:18:53.765 "abort": true, 00:18:53.765 "seek_hole": false, 00:18:53.765 "seek_data": false, 00:18:53.765 "copy": true, 00:18:53.765 "nvme_iov_md": false 00:18:53.765 }, 00:18:53.765 "memory_domains": [ 00:18:53.765 { 00:18:53.765 "dma_device_id": "system", 00:18:53.765 "dma_device_type": 1 00:18:53.765 }, 00:18:53.765 { 00:18:53.765 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.765 "dma_device_type": 2 00:18:53.765 } 00:18:53.765 ], 00:18:53.765 "driver_specific": {} 00:18:53.765 }' 00:18:53.765 17:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:53.765 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:53.765 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:53.765 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:54.025 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:54.025 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:54.025 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:54.025 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:54.025 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:54.025 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:54.025 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:54.286 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:54.286 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:54.286 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:54.286 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:54.286 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:54.286 "name": "BaseBdev4", 00:18:54.286 "aliases": [ 00:18:54.286 "7ddafd61-eb66-4a3e-98fb-a350100ef1e0" 00:18:54.286 ], 00:18:54.286 "product_name": "Malloc disk", 00:18:54.286 "block_size": 512, 00:18:54.286 "num_blocks": 65536, 00:18:54.286 "uuid": "7ddafd61-eb66-4a3e-98fb-a350100ef1e0", 00:18:54.286 "assigned_rate_limits": { 00:18:54.286 "rw_ios_per_sec": 0, 00:18:54.286 "rw_mbytes_per_sec": 0, 00:18:54.286 "r_mbytes_per_sec": 0, 00:18:54.286 "w_mbytes_per_sec": 0 00:18:54.286 }, 00:18:54.286 "claimed": true, 00:18:54.286 "claim_type": "exclusive_write", 00:18:54.286 "zoned": false, 00:18:54.286 "supported_io_types": { 00:18:54.286 "read": true, 00:18:54.286 "write": true, 00:18:54.286 "unmap": true, 00:18:54.286 "flush": true, 00:18:54.286 "reset": true, 
00:18:54.286 "nvme_admin": false, 00:18:54.286 "nvme_io": false, 00:18:54.286 "nvme_io_md": false, 00:18:54.286 "write_zeroes": true, 00:18:54.286 "zcopy": true, 00:18:54.286 "get_zone_info": false, 00:18:54.286 "zone_management": false, 00:18:54.286 "zone_append": false, 00:18:54.286 "compare": false, 00:18:54.286 "compare_and_write": false, 00:18:54.286 "abort": true, 00:18:54.286 "seek_hole": false, 00:18:54.286 "seek_data": false, 00:18:54.286 "copy": true, 00:18:54.286 "nvme_iov_md": false 00:18:54.286 }, 00:18:54.286 "memory_domains": [ 00:18:54.286 { 00:18:54.286 "dma_device_id": "system", 00:18:54.286 "dma_device_type": 1 00:18:54.286 }, 00:18:54.286 { 00:18:54.286 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.286 "dma_device_type": 2 00:18:54.286 } 00:18:54.286 ], 00:18:54.286 "driver_specific": {} 00:18:54.286 }' 00:18:54.286 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:54.286 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:54.546 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:54.546 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:54.546 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:54.546 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:54.546 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:54.546 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:54.546 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:54.546 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:54.806 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:54.806 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:54.806 17:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:54.806 [2024-07-15 17:31:06.064028] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:54.806 17:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:54.806 17:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:18:54.806 17:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:54.806 17:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:54.806 17:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:18:54.806 17:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:18:54.806 17:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:54.806 17:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:54.806 17:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:54.806 17:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:54.807 17:31:06 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:54.807 17:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:54.807 17:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:54.807 17:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:54.807 17:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:54.807 17:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:54.807 17:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:55.067 17:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:55.067 "name": "Existed_Raid", 00:18:55.067 "uuid": "987a5b8b-6d10-461f-ba57-a9f90c70d095", 00:18:55.067 "strip_size_kb": 0, 00:18:55.067 "state": "online", 00:18:55.067 "raid_level": "raid1", 00:18:55.067 "superblock": false, 00:18:55.067 "num_base_bdevs": 4, 00:18:55.067 "num_base_bdevs_discovered": 3, 00:18:55.067 "num_base_bdevs_operational": 3, 00:18:55.067 "base_bdevs_list": [ 00:18:55.067 { 00:18:55.067 "name": null, 00:18:55.067 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:55.067 "is_configured": false, 00:18:55.067 "data_offset": 0, 00:18:55.067 "data_size": 65536 00:18:55.067 }, 00:18:55.067 { 00:18:55.067 "name": "BaseBdev2", 00:18:55.067 "uuid": "6b51255e-7efc-4917-9962-e8a1675e75bb", 00:18:55.067 "is_configured": true, 00:18:55.067 "data_offset": 0, 00:18:55.067 "data_size": 65536 00:18:55.067 }, 00:18:55.067 { 00:18:55.067 "name": "BaseBdev3", 00:18:55.067 "uuid": "10c5e601-ceab-4925-b845-4dcc3645869e", 00:18:55.067 "is_configured": true, 00:18:55.067 "data_offset": 0, 00:18:55.067 "data_size": 65536 00:18:55.067 }, 00:18:55.067 { 00:18:55.067 "name": "BaseBdev4", 00:18:55.067 "uuid": "7ddafd61-eb66-4a3e-98fb-a350100ef1e0", 00:18:55.067 "is_configured": true, 00:18:55.067 "data_offset": 0, 00:18:55.067 "data_size": 65536 00:18:55.067 } 00:18:55.067 ] 00:18:55.067 }' 00:18:55.067 17:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:55.067 17:31:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:55.637 17:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:55.637 17:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:55.637 17:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:55.637 17:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:55.898 17:31:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:55.898 17:31:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:55.898 17:31:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:56.157 [2024-07-15 17:31:07.222967] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:56.158 17:31:07 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:56.158 17:31:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:56.158 17:31:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:56.158 17:31:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:56.158 17:31:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:56.158 17:31:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:56.158 17:31:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:56.417 [2024-07-15 17:31:07.617591] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:56.417 17:31:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:56.417 17:31:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:56.417 17:31:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:56.417 17:31:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:56.677 17:31:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:56.677 17:31:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:56.677 17:31:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:56.937 [2024-07-15 17:31:08.000326] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:56.937 [2024-07-15 17:31:08.000386] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:56.937 [2024-07-15 17:31:08.006384] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:56.937 [2024-07-15 17:31:08.006408] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:56.937 [2024-07-15 17:31:08.006414] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f12fc0 name Existed_Raid, state offline 00:18:56.937 17:31:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:56.937 17:31:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:56.937 17:31:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:56.937 17:31:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:56.937 17:31:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:56.937 17:31:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:56.937 17:31:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:18:56.937 17:31:08 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:56.937 17:31:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:56.937 17:31:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:57.198 BaseBdev2 00:18:57.198 17:31:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:57.198 17:31:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:57.198 17:31:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:57.198 17:31:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:57.198 17:31:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:57.198 17:31:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:57.198 17:31:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:57.458 17:31:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:57.718 [ 00:18:57.718 { 00:18:57.718 "name": "BaseBdev2", 00:18:57.718 "aliases": [ 00:18:57.718 "45448188-31d5-4600-bf3a-87679855a487" 00:18:57.718 ], 00:18:57.718 "product_name": "Malloc disk", 00:18:57.718 "block_size": 512, 00:18:57.718 "num_blocks": 65536, 00:18:57.718 "uuid": "45448188-31d5-4600-bf3a-87679855a487", 00:18:57.718 "assigned_rate_limits": { 00:18:57.718 "rw_ios_per_sec": 0, 00:18:57.718 "rw_mbytes_per_sec": 0, 00:18:57.718 "r_mbytes_per_sec": 0, 00:18:57.718 "w_mbytes_per_sec": 0 00:18:57.718 }, 00:18:57.718 "claimed": false, 00:18:57.718 "zoned": false, 00:18:57.718 "supported_io_types": { 00:18:57.718 "read": true, 00:18:57.718 "write": true, 00:18:57.718 "unmap": true, 00:18:57.719 "flush": true, 00:18:57.719 "reset": true, 00:18:57.719 "nvme_admin": false, 00:18:57.719 "nvme_io": false, 00:18:57.719 "nvme_io_md": false, 00:18:57.719 "write_zeroes": true, 00:18:57.719 "zcopy": true, 00:18:57.719 "get_zone_info": false, 00:18:57.719 "zone_management": false, 00:18:57.719 "zone_append": false, 00:18:57.719 "compare": false, 00:18:57.719 "compare_and_write": false, 00:18:57.719 "abort": true, 00:18:57.719 "seek_hole": false, 00:18:57.719 "seek_data": false, 00:18:57.719 "copy": true, 00:18:57.719 "nvme_iov_md": false 00:18:57.719 }, 00:18:57.719 "memory_domains": [ 00:18:57.719 { 00:18:57.719 "dma_device_id": "system", 00:18:57.719 "dma_device_type": 1 00:18:57.719 }, 00:18:57.719 { 00:18:57.719 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:57.719 "dma_device_type": 2 00:18:57.719 } 00:18:57.719 ], 00:18:57.719 "driver_specific": {} 00:18:57.719 } 00:18:57.719 ] 00:18:57.719 17:31:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:57.719 17:31:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:57.719 17:31:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:57.719 17:31:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:57.719 BaseBdev3 00:18:57.719 17:31:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:57.719 17:31:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:57.719 17:31:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:57.719 17:31:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:57.719 17:31:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:57.719 17:31:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:57.719 17:31:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:57.978 17:31:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:58.238 [ 00:18:58.238 { 00:18:58.238 "name": "BaseBdev3", 00:18:58.238 "aliases": [ 00:18:58.238 "25bb37a0-b26d-43d2-be70-acb4a133bf0f" 00:18:58.238 ], 00:18:58.238 "product_name": "Malloc disk", 00:18:58.238 "block_size": 512, 00:18:58.238 "num_blocks": 65536, 00:18:58.238 "uuid": "25bb37a0-b26d-43d2-be70-acb4a133bf0f", 00:18:58.238 "assigned_rate_limits": { 00:18:58.238 "rw_ios_per_sec": 0, 00:18:58.238 "rw_mbytes_per_sec": 0, 00:18:58.238 "r_mbytes_per_sec": 0, 00:18:58.238 "w_mbytes_per_sec": 0 00:18:58.238 }, 00:18:58.238 "claimed": false, 00:18:58.238 "zoned": false, 00:18:58.238 "supported_io_types": { 00:18:58.238 "read": true, 00:18:58.238 "write": true, 00:18:58.238 "unmap": true, 00:18:58.238 "flush": true, 00:18:58.238 "reset": true, 00:18:58.238 "nvme_admin": false, 00:18:58.238 "nvme_io": false, 00:18:58.238 "nvme_io_md": false, 00:18:58.238 "write_zeroes": true, 00:18:58.238 "zcopy": true, 00:18:58.238 "get_zone_info": false, 00:18:58.238 "zone_management": false, 00:18:58.238 "zone_append": false, 00:18:58.238 "compare": false, 00:18:58.238 "compare_and_write": false, 00:18:58.238 "abort": true, 00:18:58.238 "seek_hole": false, 00:18:58.238 "seek_data": false, 00:18:58.238 "copy": true, 00:18:58.238 "nvme_iov_md": false 00:18:58.238 }, 00:18:58.238 "memory_domains": [ 00:18:58.238 { 00:18:58.238 "dma_device_id": "system", 00:18:58.238 "dma_device_type": 1 00:18:58.238 }, 00:18:58.238 { 00:18:58.238 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:58.238 "dma_device_type": 2 00:18:58.238 } 00:18:58.238 ], 00:18:58.238 "driver_specific": {} 00:18:58.238 } 00:18:58.238 ] 00:18:58.238 17:31:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:58.238 17:31:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:58.238 17:31:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:58.238 17:31:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:58.496 BaseBdev4 00:18:58.496 17:31:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:18:58.496 17:31:09 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:58.496 17:31:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:58.496 17:31:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:58.496 17:31:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:58.496 17:31:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:58.496 17:31:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:58.496 17:31:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:58.755 [ 00:18:58.755 { 00:18:58.755 "name": "BaseBdev4", 00:18:58.755 "aliases": [ 00:18:58.755 "d1972e16-2180-41d3-b7db-e17436299817" 00:18:58.755 ], 00:18:58.755 "product_name": "Malloc disk", 00:18:58.755 "block_size": 512, 00:18:58.755 "num_blocks": 65536, 00:18:58.755 "uuid": "d1972e16-2180-41d3-b7db-e17436299817", 00:18:58.755 "assigned_rate_limits": { 00:18:58.755 "rw_ios_per_sec": 0, 00:18:58.755 "rw_mbytes_per_sec": 0, 00:18:58.755 "r_mbytes_per_sec": 0, 00:18:58.755 "w_mbytes_per_sec": 0 00:18:58.755 }, 00:18:58.755 "claimed": false, 00:18:58.755 "zoned": false, 00:18:58.755 "supported_io_types": { 00:18:58.755 "read": true, 00:18:58.755 "write": true, 00:18:58.755 "unmap": true, 00:18:58.755 "flush": true, 00:18:58.755 "reset": true, 00:18:58.755 "nvme_admin": false, 00:18:58.755 "nvme_io": false, 00:18:58.755 "nvme_io_md": false, 00:18:58.755 "write_zeroes": true, 00:18:58.755 "zcopy": true, 00:18:58.755 "get_zone_info": false, 00:18:58.755 "zone_management": false, 00:18:58.755 "zone_append": false, 00:18:58.755 "compare": false, 00:18:58.755 "compare_and_write": false, 00:18:58.755 "abort": true, 00:18:58.755 "seek_hole": false, 00:18:58.756 "seek_data": false, 00:18:58.756 "copy": true, 00:18:58.756 "nvme_iov_md": false 00:18:58.756 }, 00:18:58.756 "memory_domains": [ 00:18:58.756 { 00:18:58.756 "dma_device_id": "system", 00:18:58.756 "dma_device_type": 1 00:18:58.756 }, 00:18:58.756 { 00:18:58.756 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:58.756 "dma_device_type": 2 00:18:58.756 } 00:18:58.756 ], 00:18:58.756 "driver_specific": {} 00:18:58.756 } 00:18:58.756 ] 00:18:58.756 17:31:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:58.756 17:31:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:58.756 17:31:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:58.756 17:31:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:59.015 [2024-07-15 17:31:10.127512] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:59.015 [2024-07-15 17:31:10.127544] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:59.015 [2024-07-15 17:31:10.127557] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 
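A condensed, hand-written summary of the RPC sequence exercised in the trace above, for readers who want to reproduce this phase outside the CI run. It assumes an SPDK application is already running with its RPC socket at /var/tmp/spdk-raid.sock; ./scripts/rpc.py abbreviates the full rpc.py path used by the test, and every command is taken verbatim from the trace rather than asserted from memory.

    # create 32 MB malloc base bdevs with 512-byte blocks (65536 blocks each); BaseBdev1 is deliberately left out at this point
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4
    # wait for bdev examination to finish, then confirm each base bdev is registered
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000
    # declare the 4-way RAID1 volume; with BaseBdev1 still missing it stays in the "configuring" state
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
    # inspect the raid bdev's state and base-bdev list, as verify_raid_bdev_state does
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'

The remainder of the trace repeats this pattern: base bdevs are removed (bdev_raid_remove_base_bdev / bdev_malloc_delete) and re-added (bdev_raid_add_base_bdev), and after each step the same bdev_raid_get_bdevs query is used to check num_base_bdevs_discovered and the raid state until all four members are present and the volume transitions to "online".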
00:18:59.015 [2024-07-15 17:31:10.128596] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:59.015 [2024-07-15 17:31:10.128628] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:59.015 17:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:59.015 17:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:59.015 17:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:59.015 17:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:59.015 17:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:59.015 17:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:59.015 17:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:59.015 17:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:59.015 17:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:59.015 17:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:59.015 17:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:59.015 17:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:59.585 17:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:59.585 "name": "Existed_Raid", 00:18:59.585 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:59.585 "strip_size_kb": 0, 00:18:59.585 "state": "configuring", 00:18:59.585 "raid_level": "raid1", 00:18:59.585 "superblock": false, 00:18:59.585 "num_base_bdevs": 4, 00:18:59.585 "num_base_bdevs_discovered": 3, 00:18:59.585 "num_base_bdevs_operational": 4, 00:18:59.585 "base_bdevs_list": [ 00:18:59.585 { 00:18:59.585 "name": "BaseBdev1", 00:18:59.585 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:59.585 "is_configured": false, 00:18:59.585 "data_offset": 0, 00:18:59.585 "data_size": 0 00:18:59.585 }, 00:18:59.585 { 00:18:59.585 "name": "BaseBdev2", 00:18:59.585 "uuid": "45448188-31d5-4600-bf3a-87679855a487", 00:18:59.585 "is_configured": true, 00:18:59.585 "data_offset": 0, 00:18:59.585 "data_size": 65536 00:18:59.585 }, 00:18:59.585 { 00:18:59.585 "name": "BaseBdev3", 00:18:59.585 "uuid": "25bb37a0-b26d-43d2-be70-acb4a133bf0f", 00:18:59.585 "is_configured": true, 00:18:59.585 "data_offset": 0, 00:18:59.585 "data_size": 65536 00:18:59.585 }, 00:18:59.585 { 00:18:59.585 "name": "BaseBdev4", 00:18:59.585 "uuid": "d1972e16-2180-41d3-b7db-e17436299817", 00:18:59.585 "is_configured": true, 00:18:59.585 "data_offset": 0, 00:18:59.585 "data_size": 65536 00:18:59.585 } 00:18:59.585 ] 00:18:59.585 }' 00:18:59.585 17:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:59.585 17:31:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:00.210 17:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev BaseBdev2 00:19:00.210 [2024-07-15 17:31:11.394691] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:00.210 17:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:00.210 17:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:00.210 17:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:00.210 17:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:00.210 17:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:00.210 17:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:00.210 17:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:00.210 17:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:00.210 17:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:00.210 17:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:00.210 17:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:00.210 17:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:00.483 17:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:00.483 "name": "Existed_Raid", 00:19:00.483 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:00.483 "strip_size_kb": 0, 00:19:00.483 "state": "configuring", 00:19:00.483 "raid_level": "raid1", 00:19:00.483 "superblock": false, 00:19:00.483 "num_base_bdevs": 4, 00:19:00.483 "num_base_bdevs_discovered": 2, 00:19:00.483 "num_base_bdevs_operational": 4, 00:19:00.483 "base_bdevs_list": [ 00:19:00.483 { 00:19:00.483 "name": "BaseBdev1", 00:19:00.483 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:00.483 "is_configured": false, 00:19:00.483 "data_offset": 0, 00:19:00.483 "data_size": 0 00:19:00.483 }, 00:19:00.483 { 00:19:00.483 "name": null, 00:19:00.483 "uuid": "45448188-31d5-4600-bf3a-87679855a487", 00:19:00.483 "is_configured": false, 00:19:00.483 "data_offset": 0, 00:19:00.483 "data_size": 65536 00:19:00.483 }, 00:19:00.483 { 00:19:00.483 "name": "BaseBdev3", 00:19:00.483 "uuid": "25bb37a0-b26d-43d2-be70-acb4a133bf0f", 00:19:00.483 "is_configured": true, 00:19:00.483 "data_offset": 0, 00:19:00.483 "data_size": 65536 00:19:00.483 }, 00:19:00.483 { 00:19:00.483 "name": "BaseBdev4", 00:19:00.483 "uuid": "d1972e16-2180-41d3-b7db-e17436299817", 00:19:00.483 "is_configured": true, 00:19:00.483 "data_offset": 0, 00:19:00.483 "data_size": 65536 00:19:00.483 } 00:19:00.483 ] 00:19:00.483 }' 00:19:00.483 17:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:00.483 17:31:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:01.054 17:31:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:01.054 17:31:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:19:01.314 17:31:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:01.314 17:31:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:01.314 [2024-07-15 17:31:12.558659] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:01.314 BaseBdev1 00:19:01.314 17:31:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:01.314 17:31:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:01.314 17:31:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:01.314 17:31:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:01.314 17:31:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:01.314 17:31:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:01.314 17:31:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:01.574 17:31:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:01.834 [ 00:19:01.834 { 00:19:01.834 "name": "BaseBdev1", 00:19:01.834 "aliases": [ 00:19:01.834 "d4099ba4-3a70-4f19-b71f-78165805f2e1" 00:19:01.834 ], 00:19:01.834 "product_name": "Malloc disk", 00:19:01.834 "block_size": 512, 00:19:01.834 "num_blocks": 65536, 00:19:01.835 "uuid": "d4099ba4-3a70-4f19-b71f-78165805f2e1", 00:19:01.835 "assigned_rate_limits": { 00:19:01.835 "rw_ios_per_sec": 0, 00:19:01.835 "rw_mbytes_per_sec": 0, 00:19:01.835 "r_mbytes_per_sec": 0, 00:19:01.835 "w_mbytes_per_sec": 0 00:19:01.835 }, 00:19:01.835 "claimed": true, 00:19:01.835 "claim_type": "exclusive_write", 00:19:01.835 "zoned": false, 00:19:01.835 "supported_io_types": { 00:19:01.835 "read": true, 00:19:01.835 "write": true, 00:19:01.835 "unmap": true, 00:19:01.835 "flush": true, 00:19:01.835 "reset": true, 00:19:01.835 "nvme_admin": false, 00:19:01.835 "nvme_io": false, 00:19:01.835 "nvme_io_md": false, 00:19:01.835 "write_zeroes": true, 00:19:01.835 "zcopy": true, 00:19:01.835 "get_zone_info": false, 00:19:01.835 "zone_management": false, 00:19:01.835 "zone_append": false, 00:19:01.835 "compare": false, 00:19:01.835 "compare_and_write": false, 00:19:01.835 "abort": true, 00:19:01.835 "seek_hole": false, 00:19:01.835 "seek_data": false, 00:19:01.835 "copy": true, 00:19:01.835 "nvme_iov_md": false 00:19:01.835 }, 00:19:01.835 "memory_domains": [ 00:19:01.835 { 00:19:01.835 "dma_device_id": "system", 00:19:01.835 "dma_device_type": 1 00:19:01.835 }, 00:19:01.835 { 00:19:01.835 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:01.835 "dma_device_type": 2 00:19:01.835 } 00:19:01.835 ], 00:19:01.835 "driver_specific": {} 00:19:01.835 } 00:19:01.835 ] 00:19:01.835 17:31:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:01.835 17:31:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:01.835 17:31:12 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:01.835 17:31:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:01.835 17:31:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:01.835 17:31:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:01.835 17:31:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:01.835 17:31:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:01.835 17:31:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:01.835 17:31:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:01.835 17:31:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:01.835 17:31:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:01.835 17:31:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:01.835 17:31:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:01.835 "name": "Existed_Raid", 00:19:01.835 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:01.835 "strip_size_kb": 0, 00:19:01.835 "state": "configuring", 00:19:01.835 "raid_level": "raid1", 00:19:01.835 "superblock": false, 00:19:01.835 "num_base_bdevs": 4, 00:19:01.835 "num_base_bdevs_discovered": 3, 00:19:01.835 "num_base_bdevs_operational": 4, 00:19:01.835 "base_bdevs_list": [ 00:19:01.835 { 00:19:01.835 "name": "BaseBdev1", 00:19:01.835 "uuid": "d4099ba4-3a70-4f19-b71f-78165805f2e1", 00:19:01.835 "is_configured": true, 00:19:01.835 "data_offset": 0, 00:19:01.835 "data_size": 65536 00:19:01.835 }, 00:19:01.835 { 00:19:01.835 "name": null, 00:19:01.835 "uuid": "45448188-31d5-4600-bf3a-87679855a487", 00:19:01.835 "is_configured": false, 00:19:01.835 "data_offset": 0, 00:19:01.835 "data_size": 65536 00:19:01.835 }, 00:19:01.835 { 00:19:01.835 "name": "BaseBdev3", 00:19:01.835 "uuid": "25bb37a0-b26d-43d2-be70-acb4a133bf0f", 00:19:01.835 "is_configured": true, 00:19:01.835 "data_offset": 0, 00:19:01.835 "data_size": 65536 00:19:01.835 }, 00:19:01.835 { 00:19:01.835 "name": "BaseBdev4", 00:19:01.835 "uuid": "d1972e16-2180-41d3-b7db-e17436299817", 00:19:01.835 "is_configured": true, 00:19:01.835 "data_offset": 0, 00:19:01.835 "data_size": 65536 00:19:01.835 } 00:19:01.835 ] 00:19:01.835 }' 00:19:01.835 17:31:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:02.096 17:31:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:02.665 17:31:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:02.665 17:31:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:02.665 17:31:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:19:02.665 17:31:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:02.926 [2024-07-15 17:31:14.086534] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:02.926 17:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:02.926 17:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:02.926 17:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:02.926 17:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:02.926 17:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:02.926 17:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:02.926 17:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:02.926 17:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:02.926 17:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:02.926 17:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:02.926 17:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:02.926 17:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:03.186 17:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:03.186 "name": "Existed_Raid", 00:19:03.186 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:03.186 "strip_size_kb": 0, 00:19:03.186 "state": "configuring", 00:19:03.186 "raid_level": "raid1", 00:19:03.186 "superblock": false, 00:19:03.186 "num_base_bdevs": 4, 00:19:03.186 "num_base_bdevs_discovered": 2, 00:19:03.186 "num_base_bdevs_operational": 4, 00:19:03.186 "base_bdevs_list": [ 00:19:03.186 { 00:19:03.186 "name": "BaseBdev1", 00:19:03.186 "uuid": "d4099ba4-3a70-4f19-b71f-78165805f2e1", 00:19:03.186 "is_configured": true, 00:19:03.186 "data_offset": 0, 00:19:03.186 "data_size": 65536 00:19:03.186 }, 00:19:03.186 { 00:19:03.186 "name": null, 00:19:03.186 "uuid": "45448188-31d5-4600-bf3a-87679855a487", 00:19:03.186 "is_configured": false, 00:19:03.186 "data_offset": 0, 00:19:03.186 "data_size": 65536 00:19:03.186 }, 00:19:03.186 { 00:19:03.186 "name": null, 00:19:03.186 "uuid": "25bb37a0-b26d-43d2-be70-acb4a133bf0f", 00:19:03.186 "is_configured": false, 00:19:03.186 "data_offset": 0, 00:19:03.186 "data_size": 65536 00:19:03.186 }, 00:19:03.186 { 00:19:03.186 "name": "BaseBdev4", 00:19:03.186 "uuid": "d1972e16-2180-41d3-b7db-e17436299817", 00:19:03.186 "is_configured": true, 00:19:03.186 "data_offset": 0, 00:19:03.186 "data_size": 65536 00:19:03.186 } 00:19:03.186 ] 00:19:03.186 }' 00:19:03.186 17:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:03.186 17:31:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:03.757 17:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:03.757 17:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:04.017 17:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:04.017 17:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:04.017 [2024-07-15 17:31:15.233456] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:04.017 17:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:04.017 17:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:04.017 17:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:04.017 17:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:04.017 17:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:04.017 17:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:04.017 17:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:04.017 17:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:04.017 17:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:04.017 17:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:04.017 17:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:04.017 17:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:04.278 17:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:04.278 "name": "Existed_Raid", 00:19:04.278 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:04.278 "strip_size_kb": 0, 00:19:04.278 "state": "configuring", 00:19:04.278 "raid_level": "raid1", 00:19:04.278 "superblock": false, 00:19:04.278 "num_base_bdevs": 4, 00:19:04.278 "num_base_bdevs_discovered": 3, 00:19:04.278 "num_base_bdevs_operational": 4, 00:19:04.278 "base_bdevs_list": [ 00:19:04.278 { 00:19:04.278 "name": "BaseBdev1", 00:19:04.278 "uuid": "d4099ba4-3a70-4f19-b71f-78165805f2e1", 00:19:04.278 "is_configured": true, 00:19:04.278 "data_offset": 0, 00:19:04.278 "data_size": 65536 00:19:04.278 }, 00:19:04.278 { 00:19:04.278 "name": null, 00:19:04.278 "uuid": "45448188-31d5-4600-bf3a-87679855a487", 00:19:04.278 "is_configured": false, 00:19:04.278 "data_offset": 0, 00:19:04.278 "data_size": 65536 00:19:04.278 }, 00:19:04.278 { 00:19:04.278 "name": "BaseBdev3", 00:19:04.278 "uuid": "25bb37a0-b26d-43d2-be70-acb4a133bf0f", 00:19:04.278 "is_configured": true, 00:19:04.278 "data_offset": 0, 00:19:04.278 "data_size": 65536 00:19:04.278 }, 00:19:04.278 { 00:19:04.278 "name": "BaseBdev4", 00:19:04.278 "uuid": "d1972e16-2180-41d3-b7db-e17436299817", 00:19:04.278 "is_configured": true, 00:19:04.278 "data_offset": 0, 00:19:04.278 "data_size": 65536 00:19:04.278 } 00:19:04.278 ] 00:19:04.278 }' 00:19:04.278 17:31:15 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:04.278 17:31:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:04.849 17:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:04.849 17:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:05.109 17:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:05.109 17:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:05.109 [2024-07-15 17:31:16.368338] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:05.110 17:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:05.110 17:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:05.110 17:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:05.110 17:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:05.110 17:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:05.110 17:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:05.110 17:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:05.110 17:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:05.110 17:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:05.110 17:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:05.110 17:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.110 17:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:05.370 17:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:05.370 "name": "Existed_Raid", 00:19:05.370 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:05.370 "strip_size_kb": 0, 00:19:05.370 "state": "configuring", 00:19:05.370 "raid_level": "raid1", 00:19:05.370 "superblock": false, 00:19:05.370 "num_base_bdevs": 4, 00:19:05.370 "num_base_bdevs_discovered": 2, 00:19:05.370 "num_base_bdevs_operational": 4, 00:19:05.370 "base_bdevs_list": [ 00:19:05.370 { 00:19:05.370 "name": null, 00:19:05.370 "uuid": "d4099ba4-3a70-4f19-b71f-78165805f2e1", 00:19:05.370 "is_configured": false, 00:19:05.370 "data_offset": 0, 00:19:05.370 "data_size": 65536 00:19:05.370 }, 00:19:05.370 { 00:19:05.370 "name": null, 00:19:05.370 "uuid": "45448188-31d5-4600-bf3a-87679855a487", 00:19:05.370 "is_configured": false, 00:19:05.370 "data_offset": 0, 00:19:05.370 "data_size": 65536 00:19:05.370 }, 00:19:05.370 { 00:19:05.370 "name": "BaseBdev3", 00:19:05.370 "uuid": "25bb37a0-b26d-43d2-be70-acb4a133bf0f", 00:19:05.370 "is_configured": true, 00:19:05.370 "data_offset": 0, 00:19:05.370 "data_size": 65536 00:19:05.370 }, 
00:19:05.370 { 00:19:05.370 "name": "BaseBdev4", 00:19:05.370 "uuid": "d1972e16-2180-41d3-b7db-e17436299817", 00:19:05.370 "is_configured": true, 00:19:05.370 "data_offset": 0, 00:19:05.370 "data_size": 65536 00:19:05.370 } 00:19:05.370 ] 00:19:05.370 }' 00:19:05.370 17:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:05.370 17:31:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:05.940 17:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.940 17:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:06.201 17:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:06.201 17:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:06.201 [2024-07-15 17:31:17.473007] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:06.201 17:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:06.201 17:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:06.201 17:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:06.201 17:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:06.201 17:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:06.201 17:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:06.201 17:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:06.201 17:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:06.201 17:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:06.201 17:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:06.201 17:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:06.201 17:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:06.461 17:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:06.461 "name": "Existed_Raid", 00:19:06.461 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:06.461 "strip_size_kb": 0, 00:19:06.461 "state": "configuring", 00:19:06.461 "raid_level": "raid1", 00:19:06.461 "superblock": false, 00:19:06.461 "num_base_bdevs": 4, 00:19:06.461 "num_base_bdevs_discovered": 3, 00:19:06.461 "num_base_bdevs_operational": 4, 00:19:06.461 "base_bdevs_list": [ 00:19:06.461 { 00:19:06.461 "name": null, 00:19:06.461 "uuid": "d4099ba4-3a70-4f19-b71f-78165805f2e1", 00:19:06.461 "is_configured": false, 00:19:06.461 "data_offset": 0, 00:19:06.461 "data_size": 65536 00:19:06.461 }, 00:19:06.461 { 00:19:06.461 "name": "BaseBdev2", 00:19:06.461 "uuid": 
"45448188-31d5-4600-bf3a-87679855a487", 00:19:06.461 "is_configured": true, 00:19:06.461 "data_offset": 0, 00:19:06.461 "data_size": 65536 00:19:06.461 }, 00:19:06.461 { 00:19:06.461 "name": "BaseBdev3", 00:19:06.461 "uuid": "25bb37a0-b26d-43d2-be70-acb4a133bf0f", 00:19:06.461 "is_configured": true, 00:19:06.461 "data_offset": 0, 00:19:06.461 "data_size": 65536 00:19:06.461 }, 00:19:06.461 { 00:19:06.461 "name": "BaseBdev4", 00:19:06.461 "uuid": "d1972e16-2180-41d3-b7db-e17436299817", 00:19:06.461 "is_configured": true, 00:19:06.461 "data_offset": 0, 00:19:06.461 "data_size": 65536 00:19:06.461 } 00:19:06.461 ] 00:19:06.461 }' 00:19:06.461 17:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:06.461 17:31:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:07.030 17:31:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:07.030 17:31:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:07.289 17:31:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:07.289 17:31:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:07.289 17:31:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:07.549 17:31:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u d4099ba4-3a70-4f19-b71f-78165805f2e1 00:19:07.549 [2024-07-15 17:31:18.781141] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:07.549 [2024-07-15 17:31:18.781166] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f1a1a0 00:19:07.549 [2024-07-15 17:31:18.781171] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:19:07.549 [2024-07-15 17:31:18.781318] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f195e0 00:19:07.549 [2024-07-15 17:31:18.781416] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f1a1a0 00:19:07.549 [2024-07-15 17:31:18.781422] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f1a1a0 00:19:07.549 [2024-07-15 17:31:18.781535] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:07.549 NewBaseBdev 00:19:07.549 17:31:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:07.549 17:31:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:19:07.549 17:31:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:07.549 17:31:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:07.549 17:31:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:07.549 17:31:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:07.549 17:31:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:07.811 17:31:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:08.071 [ 00:19:08.071 { 00:19:08.071 "name": "NewBaseBdev", 00:19:08.071 "aliases": [ 00:19:08.071 "d4099ba4-3a70-4f19-b71f-78165805f2e1" 00:19:08.071 ], 00:19:08.071 "product_name": "Malloc disk", 00:19:08.071 "block_size": 512, 00:19:08.071 "num_blocks": 65536, 00:19:08.071 "uuid": "d4099ba4-3a70-4f19-b71f-78165805f2e1", 00:19:08.071 "assigned_rate_limits": { 00:19:08.071 "rw_ios_per_sec": 0, 00:19:08.071 "rw_mbytes_per_sec": 0, 00:19:08.071 "r_mbytes_per_sec": 0, 00:19:08.071 "w_mbytes_per_sec": 0 00:19:08.071 }, 00:19:08.071 "claimed": true, 00:19:08.071 "claim_type": "exclusive_write", 00:19:08.071 "zoned": false, 00:19:08.071 "supported_io_types": { 00:19:08.071 "read": true, 00:19:08.071 "write": true, 00:19:08.071 "unmap": true, 00:19:08.071 "flush": true, 00:19:08.071 "reset": true, 00:19:08.071 "nvme_admin": false, 00:19:08.071 "nvme_io": false, 00:19:08.071 "nvme_io_md": false, 00:19:08.071 "write_zeroes": true, 00:19:08.071 "zcopy": true, 00:19:08.071 "get_zone_info": false, 00:19:08.071 "zone_management": false, 00:19:08.071 "zone_append": false, 00:19:08.071 "compare": false, 00:19:08.071 "compare_and_write": false, 00:19:08.071 "abort": true, 00:19:08.071 "seek_hole": false, 00:19:08.071 "seek_data": false, 00:19:08.071 "copy": true, 00:19:08.071 "nvme_iov_md": false 00:19:08.071 }, 00:19:08.071 "memory_domains": [ 00:19:08.071 { 00:19:08.071 "dma_device_id": "system", 00:19:08.071 "dma_device_type": 1 00:19:08.071 }, 00:19:08.071 { 00:19:08.071 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:08.071 "dma_device_type": 2 00:19:08.071 } 00:19:08.071 ], 00:19:08.071 "driver_specific": {} 00:19:08.071 } 00:19:08.071 ] 00:19:08.071 17:31:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:08.071 17:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:19:08.071 17:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:08.071 17:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:08.071 17:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:08.071 17:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:08.071 17:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:08.071 17:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:08.071 17:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:08.071 17:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:08.071 17:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:08.071 17:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:08.071 17:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:19:08.331 17:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:08.331 "name": "Existed_Raid", 00:19:08.331 "uuid": "7d926878-6bd1-476a-856e-5c784e4e3c60", 00:19:08.332 "strip_size_kb": 0, 00:19:08.332 "state": "online", 00:19:08.332 "raid_level": "raid1", 00:19:08.332 "superblock": false, 00:19:08.332 "num_base_bdevs": 4, 00:19:08.332 "num_base_bdevs_discovered": 4, 00:19:08.332 "num_base_bdevs_operational": 4, 00:19:08.332 "base_bdevs_list": [ 00:19:08.332 { 00:19:08.332 "name": "NewBaseBdev", 00:19:08.332 "uuid": "d4099ba4-3a70-4f19-b71f-78165805f2e1", 00:19:08.332 "is_configured": true, 00:19:08.332 "data_offset": 0, 00:19:08.332 "data_size": 65536 00:19:08.332 }, 00:19:08.332 { 00:19:08.332 "name": "BaseBdev2", 00:19:08.332 "uuid": "45448188-31d5-4600-bf3a-87679855a487", 00:19:08.332 "is_configured": true, 00:19:08.332 "data_offset": 0, 00:19:08.332 "data_size": 65536 00:19:08.332 }, 00:19:08.332 { 00:19:08.332 "name": "BaseBdev3", 00:19:08.332 "uuid": "25bb37a0-b26d-43d2-be70-acb4a133bf0f", 00:19:08.332 "is_configured": true, 00:19:08.332 "data_offset": 0, 00:19:08.332 "data_size": 65536 00:19:08.332 }, 00:19:08.332 { 00:19:08.332 "name": "BaseBdev4", 00:19:08.332 "uuid": "d1972e16-2180-41d3-b7db-e17436299817", 00:19:08.332 "is_configured": true, 00:19:08.332 "data_offset": 0, 00:19:08.332 "data_size": 65536 00:19:08.332 } 00:19:08.332 ] 00:19:08.332 }' 00:19:08.332 17:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:08.332 17:31:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:08.901 17:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:19:08.901 17:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:08.901 17:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:08.901 17:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:08.901 17:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:08.901 17:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:08.901 17:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:08.901 17:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:08.901 [2024-07-15 17:31:20.080704] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:08.901 17:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:08.901 "name": "Existed_Raid", 00:19:08.901 "aliases": [ 00:19:08.901 "7d926878-6bd1-476a-856e-5c784e4e3c60" 00:19:08.901 ], 00:19:08.901 "product_name": "Raid Volume", 00:19:08.901 "block_size": 512, 00:19:08.901 "num_blocks": 65536, 00:19:08.901 "uuid": "7d926878-6bd1-476a-856e-5c784e4e3c60", 00:19:08.901 "assigned_rate_limits": { 00:19:08.901 "rw_ios_per_sec": 0, 00:19:08.901 "rw_mbytes_per_sec": 0, 00:19:08.901 "r_mbytes_per_sec": 0, 00:19:08.901 "w_mbytes_per_sec": 0 00:19:08.901 }, 00:19:08.901 "claimed": false, 00:19:08.901 "zoned": false, 00:19:08.901 "supported_io_types": { 00:19:08.901 "read": true, 00:19:08.901 "write": true, 00:19:08.901 "unmap": false, 
00:19:08.901 "flush": false, 00:19:08.901 "reset": true, 00:19:08.901 "nvme_admin": false, 00:19:08.901 "nvme_io": false, 00:19:08.901 "nvme_io_md": false, 00:19:08.901 "write_zeroes": true, 00:19:08.901 "zcopy": false, 00:19:08.901 "get_zone_info": false, 00:19:08.901 "zone_management": false, 00:19:08.901 "zone_append": false, 00:19:08.901 "compare": false, 00:19:08.901 "compare_and_write": false, 00:19:08.901 "abort": false, 00:19:08.901 "seek_hole": false, 00:19:08.901 "seek_data": false, 00:19:08.901 "copy": false, 00:19:08.901 "nvme_iov_md": false 00:19:08.901 }, 00:19:08.901 "memory_domains": [ 00:19:08.901 { 00:19:08.901 "dma_device_id": "system", 00:19:08.901 "dma_device_type": 1 00:19:08.902 }, 00:19:08.902 { 00:19:08.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:08.902 "dma_device_type": 2 00:19:08.902 }, 00:19:08.902 { 00:19:08.902 "dma_device_id": "system", 00:19:08.902 "dma_device_type": 1 00:19:08.902 }, 00:19:08.902 { 00:19:08.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:08.902 "dma_device_type": 2 00:19:08.902 }, 00:19:08.902 { 00:19:08.902 "dma_device_id": "system", 00:19:08.902 "dma_device_type": 1 00:19:08.902 }, 00:19:08.902 { 00:19:08.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:08.902 "dma_device_type": 2 00:19:08.902 }, 00:19:08.902 { 00:19:08.902 "dma_device_id": "system", 00:19:08.902 "dma_device_type": 1 00:19:08.902 }, 00:19:08.902 { 00:19:08.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:08.902 "dma_device_type": 2 00:19:08.902 } 00:19:08.902 ], 00:19:08.902 "driver_specific": { 00:19:08.902 "raid": { 00:19:08.902 "uuid": "7d926878-6bd1-476a-856e-5c784e4e3c60", 00:19:08.902 "strip_size_kb": 0, 00:19:08.902 "state": "online", 00:19:08.902 "raid_level": "raid1", 00:19:08.902 "superblock": false, 00:19:08.902 "num_base_bdevs": 4, 00:19:08.902 "num_base_bdevs_discovered": 4, 00:19:08.902 "num_base_bdevs_operational": 4, 00:19:08.902 "base_bdevs_list": [ 00:19:08.902 { 00:19:08.902 "name": "NewBaseBdev", 00:19:08.902 "uuid": "d4099ba4-3a70-4f19-b71f-78165805f2e1", 00:19:08.902 "is_configured": true, 00:19:08.902 "data_offset": 0, 00:19:08.902 "data_size": 65536 00:19:08.902 }, 00:19:08.902 { 00:19:08.902 "name": "BaseBdev2", 00:19:08.902 "uuid": "45448188-31d5-4600-bf3a-87679855a487", 00:19:08.902 "is_configured": true, 00:19:08.902 "data_offset": 0, 00:19:08.902 "data_size": 65536 00:19:08.902 }, 00:19:08.902 { 00:19:08.902 "name": "BaseBdev3", 00:19:08.902 "uuid": "25bb37a0-b26d-43d2-be70-acb4a133bf0f", 00:19:08.902 "is_configured": true, 00:19:08.902 "data_offset": 0, 00:19:08.902 "data_size": 65536 00:19:08.902 }, 00:19:08.902 { 00:19:08.902 "name": "BaseBdev4", 00:19:08.902 "uuid": "d1972e16-2180-41d3-b7db-e17436299817", 00:19:08.902 "is_configured": true, 00:19:08.902 "data_offset": 0, 00:19:08.902 "data_size": 65536 00:19:08.902 } 00:19:08.902 ] 00:19:08.902 } 00:19:08.902 } 00:19:08.902 }' 00:19:08.902 17:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:08.902 17:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:08.902 BaseBdev2 00:19:08.902 BaseBdev3 00:19:08.902 BaseBdev4' 00:19:08.902 17:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:08.902 17:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b NewBaseBdev 00:19:08.902 17:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:09.163 17:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:09.163 "name": "NewBaseBdev", 00:19:09.163 "aliases": [ 00:19:09.163 "d4099ba4-3a70-4f19-b71f-78165805f2e1" 00:19:09.163 ], 00:19:09.163 "product_name": "Malloc disk", 00:19:09.163 "block_size": 512, 00:19:09.163 "num_blocks": 65536, 00:19:09.163 "uuid": "d4099ba4-3a70-4f19-b71f-78165805f2e1", 00:19:09.163 "assigned_rate_limits": { 00:19:09.163 "rw_ios_per_sec": 0, 00:19:09.163 "rw_mbytes_per_sec": 0, 00:19:09.163 "r_mbytes_per_sec": 0, 00:19:09.163 "w_mbytes_per_sec": 0 00:19:09.163 }, 00:19:09.163 "claimed": true, 00:19:09.163 "claim_type": "exclusive_write", 00:19:09.163 "zoned": false, 00:19:09.163 "supported_io_types": { 00:19:09.163 "read": true, 00:19:09.163 "write": true, 00:19:09.163 "unmap": true, 00:19:09.163 "flush": true, 00:19:09.163 "reset": true, 00:19:09.163 "nvme_admin": false, 00:19:09.163 "nvme_io": false, 00:19:09.163 "nvme_io_md": false, 00:19:09.163 "write_zeroes": true, 00:19:09.163 "zcopy": true, 00:19:09.163 "get_zone_info": false, 00:19:09.163 "zone_management": false, 00:19:09.163 "zone_append": false, 00:19:09.163 "compare": false, 00:19:09.163 "compare_and_write": false, 00:19:09.163 "abort": true, 00:19:09.163 "seek_hole": false, 00:19:09.163 "seek_data": false, 00:19:09.163 "copy": true, 00:19:09.163 "nvme_iov_md": false 00:19:09.163 }, 00:19:09.163 "memory_domains": [ 00:19:09.163 { 00:19:09.163 "dma_device_id": "system", 00:19:09.163 "dma_device_type": 1 00:19:09.163 }, 00:19:09.163 { 00:19:09.163 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:09.163 "dma_device_type": 2 00:19:09.163 } 00:19:09.163 ], 00:19:09.163 "driver_specific": {} 00:19:09.163 }' 00:19:09.163 17:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:09.163 17:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:09.163 17:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:09.163 17:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:09.422 17:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:09.422 17:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:09.422 17:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:09.422 17:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:09.422 17:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:09.422 17:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:09.422 17:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:09.422 17:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:09.422 17:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:09.422 17:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:09.422 17:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:09.682 17:31:20 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:09.682 "name": "BaseBdev2", 00:19:09.682 "aliases": [ 00:19:09.682 "45448188-31d5-4600-bf3a-87679855a487" 00:19:09.682 ], 00:19:09.682 "product_name": "Malloc disk", 00:19:09.682 "block_size": 512, 00:19:09.682 "num_blocks": 65536, 00:19:09.682 "uuid": "45448188-31d5-4600-bf3a-87679855a487", 00:19:09.682 "assigned_rate_limits": { 00:19:09.682 "rw_ios_per_sec": 0, 00:19:09.682 "rw_mbytes_per_sec": 0, 00:19:09.682 "r_mbytes_per_sec": 0, 00:19:09.682 "w_mbytes_per_sec": 0 00:19:09.682 }, 00:19:09.682 "claimed": true, 00:19:09.682 "claim_type": "exclusive_write", 00:19:09.682 "zoned": false, 00:19:09.682 "supported_io_types": { 00:19:09.682 "read": true, 00:19:09.682 "write": true, 00:19:09.682 "unmap": true, 00:19:09.682 "flush": true, 00:19:09.682 "reset": true, 00:19:09.682 "nvme_admin": false, 00:19:09.682 "nvme_io": false, 00:19:09.682 "nvme_io_md": false, 00:19:09.682 "write_zeroes": true, 00:19:09.682 "zcopy": true, 00:19:09.682 "get_zone_info": false, 00:19:09.682 "zone_management": false, 00:19:09.682 "zone_append": false, 00:19:09.682 "compare": false, 00:19:09.682 "compare_and_write": false, 00:19:09.682 "abort": true, 00:19:09.682 "seek_hole": false, 00:19:09.682 "seek_data": false, 00:19:09.682 "copy": true, 00:19:09.682 "nvme_iov_md": false 00:19:09.682 }, 00:19:09.682 "memory_domains": [ 00:19:09.682 { 00:19:09.682 "dma_device_id": "system", 00:19:09.682 "dma_device_type": 1 00:19:09.682 }, 00:19:09.682 { 00:19:09.682 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:09.682 "dma_device_type": 2 00:19:09.682 } 00:19:09.682 ], 00:19:09.682 "driver_specific": {} 00:19:09.682 }' 00:19:09.682 17:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:09.682 17:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:09.682 17:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:09.682 17:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:09.942 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:09.942 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:09.942 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:09.942 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:09.942 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:09.942 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:09.942 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:10.203 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:10.203 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:10.203 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:10.203 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:10.203 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:10.203 "name": "BaseBdev3", 00:19:10.203 "aliases": [ 00:19:10.203 
"25bb37a0-b26d-43d2-be70-acb4a133bf0f" 00:19:10.203 ], 00:19:10.203 "product_name": "Malloc disk", 00:19:10.203 "block_size": 512, 00:19:10.203 "num_blocks": 65536, 00:19:10.203 "uuid": "25bb37a0-b26d-43d2-be70-acb4a133bf0f", 00:19:10.203 "assigned_rate_limits": { 00:19:10.203 "rw_ios_per_sec": 0, 00:19:10.203 "rw_mbytes_per_sec": 0, 00:19:10.203 "r_mbytes_per_sec": 0, 00:19:10.203 "w_mbytes_per_sec": 0 00:19:10.203 }, 00:19:10.203 "claimed": true, 00:19:10.203 "claim_type": "exclusive_write", 00:19:10.203 "zoned": false, 00:19:10.203 "supported_io_types": { 00:19:10.203 "read": true, 00:19:10.203 "write": true, 00:19:10.203 "unmap": true, 00:19:10.203 "flush": true, 00:19:10.203 "reset": true, 00:19:10.203 "nvme_admin": false, 00:19:10.203 "nvme_io": false, 00:19:10.203 "nvme_io_md": false, 00:19:10.203 "write_zeroes": true, 00:19:10.203 "zcopy": true, 00:19:10.203 "get_zone_info": false, 00:19:10.203 "zone_management": false, 00:19:10.203 "zone_append": false, 00:19:10.203 "compare": false, 00:19:10.203 "compare_and_write": false, 00:19:10.203 "abort": true, 00:19:10.203 "seek_hole": false, 00:19:10.203 "seek_data": false, 00:19:10.203 "copy": true, 00:19:10.203 "nvme_iov_md": false 00:19:10.203 }, 00:19:10.203 "memory_domains": [ 00:19:10.203 { 00:19:10.203 "dma_device_id": "system", 00:19:10.203 "dma_device_type": 1 00:19:10.203 }, 00:19:10.203 { 00:19:10.203 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:10.203 "dma_device_type": 2 00:19:10.203 } 00:19:10.203 ], 00:19:10.203 "driver_specific": {} 00:19:10.203 }' 00:19:10.203 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:10.203 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:10.465 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:10.465 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:10.465 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:10.465 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:10.465 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:10.465 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:10.465 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:10.465 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:10.465 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:10.726 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:10.726 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:10.726 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:10.726 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:10.726 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:10.726 "name": "BaseBdev4", 00:19:10.726 "aliases": [ 00:19:10.726 "d1972e16-2180-41d3-b7db-e17436299817" 00:19:10.726 ], 00:19:10.726 "product_name": "Malloc disk", 00:19:10.726 "block_size": 512, 00:19:10.726 "num_blocks": 65536, 00:19:10.726 
"uuid": "d1972e16-2180-41d3-b7db-e17436299817", 00:19:10.726 "assigned_rate_limits": { 00:19:10.726 "rw_ios_per_sec": 0, 00:19:10.726 "rw_mbytes_per_sec": 0, 00:19:10.726 "r_mbytes_per_sec": 0, 00:19:10.726 "w_mbytes_per_sec": 0 00:19:10.726 }, 00:19:10.726 "claimed": true, 00:19:10.726 "claim_type": "exclusive_write", 00:19:10.726 "zoned": false, 00:19:10.726 "supported_io_types": { 00:19:10.726 "read": true, 00:19:10.726 "write": true, 00:19:10.726 "unmap": true, 00:19:10.726 "flush": true, 00:19:10.726 "reset": true, 00:19:10.726 "nvme_admin": false, 00:19:10.726 "nvme_io": false, 00:19:10.726 "nvme_io_md": false, 00:19:10.727 "write_zeroes": true, 00:19:10.727 "zcopy": true, 00:19:10.727 "get_zone_info": false, 00:19:10.727 "zone_management": false, 00:19:10.727 "zone_append": false, 00:19:10.727 "compare": false, 00:19:10.727 "compare_and_write": false, 00:19:10.727 "abort": true, 00:19:10.727 "seek_hole": false, 00:19:10.727 "seek_data": false, 00:19:10.727 "copy": true, 00:19:10.727 "nvme_iov_md": false 00:19:10.727 }, 00:19:10.727 "memory_domains": [ 00:19:10.727 { 00:19:10.727 "dma_device_id": "system", 00:19:10.727 "dma_device_type": 1 00:19:10.727 }, 00:19:10.727 { 00:19:10.727 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:10.727 "dma_device_type": 2 00:19:10.727 } 00:19:10.727 ], 00:19:10.727 "driver_specific": {} 00:19:10.727 }' 00:19:10.727 17:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:10.987 17:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:10.987 17:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:10.987 17:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:10.987 17:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:10.987 17:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:10.987 17:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:10.987 17:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:10.987 17:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:10.987 17:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:11.247 17:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:11.247 17:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:11.247 17:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:11.247 [2024-07-15 17:31:22.506582] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:11.247 [2024-07-15 17:31:22.506601] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:11.247 [2024-07-15 17:31:22.506640] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:11.248 [2024-07-15 17:31:22.506850] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:11.248 [2024-07-15 17:31:22.506859] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f1a1a0 name Existed_Raid, state offline 00:19:11.248 17:31:22 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@341 -- # killprocess 2838372 00:19:11.248 17:31:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2838372 ']' 00:19:11.248 17:31:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2838372 00:19:11.248 17:31:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:19:11.248 17:31:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:11.248 17:31:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2838372 00:19:11.508 17:31:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:11.508 17:31:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:11.508 17:31:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2838372' 00:19:11.509 killing process with pid 2838372 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2838372 00:19:11.509 [2024-07-15 17:31:22.576322] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2838372 00:19:11.509 [2024-07-15 17:31:22.596681] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:19:11.509 00:19:11.509 real 0m27.478s 00:19:11.509 user 0m52.014s 00:19:11.509 sys 0m4.062s 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:11.509 ************************************ 00:19:11.509 END TEST raid_state_function_test 00:19:11.509 ************************************ 00:19:11.509 17:31:22 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:11.509 17:31:22 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:19:11.509 17:31:22 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:11.509 17:31:22 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:11.509 17:31:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:11.509 ************************************ 00:19:11.509 START TEST raid_state_function_test_sb 00:19:11.509 ************************************ 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 true 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:11.509 
17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:19:11.509 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:19:11.769 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2843582 00:19:11.769 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2843582' 00:19:11.769 Process raid pid: 2843582 00:19:11.769 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2843582 /var/tmp/spdk-raid.sock 00:19:11.769 17:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:11.769 17:31:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2843582 ']' 00:19:11.769 17:31:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:11.769 17:31:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:11.769 17:31:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:19:11.769 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:11.769 17:31:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:11.769 17:31:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:11.769 [2024-07-15 17:31:22.860601] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:19:11.769 [2024-07-15 17:31:22.860659] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:11.769 [2024-07-15 17:31:22.953885] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:11.769 [2024-07-15 17:31:23.021469] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:11.769 [2024-07-15 17:31:23.066691] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:11.769 [2024-07-15 17:31:23.066730] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:12.709 17:31:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:12.709 17:31:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:19:12.709 17:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:12.709 [2024-07-15 17:31:23.870293] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:12.709 [2024-07-15 17:31:23.870324] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:12.709 [2024-07-15 17:31:23.870330] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:12.709 [2024-07-15 17:31:23.870336] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:12.710 [2024-07-15 17:31:23.870340] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:12.710 [2024-07-15 17:31:23.870349] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:12.710 [2024-07-15 17:31:23.870354] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:12.710 [2024-07-15 17:31:23.870359] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:12.710 17:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:12.710 17:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:12.710 17:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:12.710 17:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:12.710 17:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:12.710 17:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:12.710 17:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:19:12.710 17:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:12.710 17:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:12.710 17:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:12.710 17:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:12.710 17:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:12.972 17:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:12.972 "name": "Existed_Raid", 00:19:12.972 "uuid": "174c81f0-a813-459b-b210-56f94ffb5b16", 00:19:12.972 "strip_size_kb": 0, 00:19:12.972 "state": "configuring", 00:19:12.972 "raid_level": "raid1", 00:19:12.972 "superblock": true, 00:19:12.972 "num_base_bdevs": 4, 00:19:12.972 "num_base_bdevs_discovered": 0, 00:19:12.972 "num_base_bdevs_operational": 4, 00:19:12.972 "base_bdevs_list": [ 00:19:12.972 { 00:19:12.972 "name": "BaseBdev1", 00:19:12.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:12.972 "is_configured": false, 00:19:12.972 "data_offset": 0, 00:19:12.972 "data_size": 0 00:19:12.972 }, 00:19:12.972 { 00:19:12.972 "name": "BaseBdev2", 00:19:12.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:12.972 "is_configured": false, 00:19:12.972 "data_offset": 0, 00:19:12.972 "data_size": 0 00:19:12.972 }, 00:19:12.972 { 00:19:12.972 "name": "BaseBdev3", 00:19:12.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:12.972 "is_configured": false, 00:19:12.972 "data_offset": 0, 00:19:12.972 "data_size": 0 00:19:12.972 }, 00:19:12.972 { 00:19:12.972 "name": "BaseBdev4", 00:19:12.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:12.972 "is_configured": false, 00:19:12.972 "data_offset": 0, 00:19:12.972 "data_size": 0 00:19:12.972 } 00:19:12.972 ] 00:19:12.972 }' 00:19:12.972 17:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:12.972 17:31:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:13.543 17:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:13.543 [2024-07-15 17:31:24.820587] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:13.543 [2024-07-15 17:31:24.820606] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10e76f0 name Existed_Raid, state configuring 00:19:13.543 17:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:13.803 [2024-07-15 17:31:25.017122] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:13.803 [2024-07-15 17:31:25.017150] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:13.803 [2024-07-15 17:31:25.017156] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:13.804 [2024-07-15 17:31:25.017161] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev 
BaseBdev2 doesn't exist now 00:19:13.804 [2024-07-15 17:31:25.017170] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:13.804 [2024-07-15 17:31:25.017176] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:13.804 [2024-07-15 17:31:25.017180] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:13.804 [2024-07-15 17:31:25.017186] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:13.804 17:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:14.063 [2024-07-15 17:31:25.216137] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:14.063 BaseBdev1 00:19:14.063 17:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:14.063 17:31:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:14.063 17:31:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:14.063 17:31:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:14.063 17:31:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:14.063 17:31:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:14.063 17:31:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:14.323 17:31:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:14.323 [ 00:19:14.323 { 00:19:14.323 "name": "BaseBdev1", 00:19:14.323 "aliases": [ 00:19:14.323 "6441375a-79c4-4591-bb24-9cf3b0567d3b" 00:19:14.323 ], 00:19:14.323 "product_name": "Malloc disk", 00:19:14.323 "block_size": 512, 00:19:14.323 "num_blocks": 65536, 00:19:14.323 "uuid": "6441375a-79c4-4591-bb24-9cf3b0567d3b", 00:19:14.323 "assigned_rate_limits": { 00:19:14.323 "rw_ios_per_sec": 0, 00:19:14.323 "rw_mbytes_per_sec": 0, 00:19:14.323 "r_mbytes_per_sec": 0, 00:19:14.323 "w_mbytes_per_sec": 0 00:19:14.323 }, 00:19:14.323 "claimed": true, 00:19:14.323 "claim_type": "exclusive_write", 00:19:14.323 "zoned": false, 00:19:14.323 "supported_io_types": { 00:19:14.323 "read": true, 00:19:14.323 "write": true, 00:19:14.323 "unmap": true, 00:19:14.323 "flush": true, 00:19:14.323 "reset": true, 00:19:14.323 "nvme_admin": false, 00:19:14.323 "nvme_io": false, 00:19:14.323 "nvme_io_md": false, 00:19:14.323 "write_zeroes": true, 00:19:14.323 "zcopy": true, 00:19:14.323 "get_zone_info": false, 00:19:14.323 "zone_management": false, 00:19:14.323 "zone_append": false, 00:19:14.323 "compare": false, 00:19:14.323 "compare_and_write": false, 00:19:14.323 "abort": true, 00:19:14.323 "seek_hole": false, 00:19:14.323 "seek_data": false, 00:19:14.323 "copy": true, 00:19:14.323 "nvme_iov_md": false 00:19:14.323 }, 00:19:14.323 "memory_domains": [ 00:19:14.323 { 00:19:14.323 "dma_device_id": "system", 00:19:14.323 "dma_device_type": 1 00:19:14.323 }, 00:19:14.323 { 00:19:14.323 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:19:14.323 "dma_device_type": 2 00:19:14.323 } 00:19:14.323 ], 00:19:14.323 "driver_specific": {} 00:19:14.323 } 00:19:14.323 ] 00:19:14.323 17:31:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:14.323 17:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:14.323 17:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:14.323 17:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:14.323 17:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:14.323 17:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:14.323 17:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:14.323 17:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:14.323 17:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:14.323 17:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:14.323 17:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:14.323 17:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:14.323 17:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:14.583 17:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:14.583 "name": "Existed_Raid", 00:19:14.583 "uuid": "5955a487-c97a-48b6-93a6-21f01604a957", 00:19:14.583 "strip_size_kb": 0, 00:19:14.583 "state": "configuring", 00:19:14.583 "raid_level": "raid1", 00:19:14.583 "superblock": true, 00:19:14.583 "num_base_bdevs": 4, 00:19:14.583 "num_base_bdevs_discovered": 1, 00:19:14.583 "num_base_bdevs_operational": 4, 00:19:14.583 "base_bdevs_list": [ 00:19:14.583 { 00:19:14.583 "name": "BaseBdev1", 00:19:14.583 "uuid": "6441375a-79c4-4591-bb24-9cf3b0567d3b", 00:19:14.583 "is_configured": true, 00:19:14.583 "data_offset": 2048, 00:19:14.583 "data_size": 63488 00:19:14.583 }, 00:19:14.583 { 00:19:14.583 "name": "BaseBdev2", 00:19:14.583 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:14.583 "is_configured": false, 00:19:14.583 "data_offset": 0, 00:19:14.583 "data_size": 0 00:19:14.583 }, 00:19:14.583 { 00:19:14.583 "name": "BaseBdev3", 00:19:14.583 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:14.583 "is_configured": false, 00:19:14.583 "data_offset": 0, 00:19:14.583 "data_size": 0 00:19:14.583 }, 00:19:14.583 { 00:19:14.583 "name": "BaseBdev4", 00:19:14.583 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:14.583 "is_configured": false, 00:19:14.583 "data_offset": 0, 00:19:14.583 "data_size": 0 00:19:14.583 } 00:19:14.583 ] 00:19:14.583 }' 00:19:14.583 17:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:14.583 17:31:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:15.186 17:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:15.447 [2024-07-15 17:31:26.527450] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:15.447 [2024-07-15 17:31:26.527482] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10e6f60 name Existed_Raid, state configuring 00:19:15.447 17:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:15.447 [2024-07-15 17:31:26.723983] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:15.447 [2024-07-15 17:31:26.725079] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:15.447 [2024-07-15 17:31:26.725103] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:15.447 [2024-07-15 17:31:26.725110] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:15.447 [2024-07-15 17:31:26.725115] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:15.447 [2024-07-15 17:31:26.725120] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:15.447 [2024-07-15 17:31:26.725125] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:15.447 17:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:15.447 17:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:15.447 17:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:15.447 17:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:15.447 17:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:15.447 17:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:15.447 17:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:15.707 17:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:15.707 17:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:15.707 17:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:15.707 17:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:15.707 17:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:15.707 17:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.707 17:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:15.707 17:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:15.707 "name": "Existed_Raid", 00:19:15.707 "uuid": "f8e81831-a46d-42fe-9524-48c1c822395d", 00:19:15.707 "strip_size_kb": 0, 
00:19:15.707 "state": "configuring", 00:19:15.707 "raid_level": "raid1", 00:19:15.707 "superblock": true, 00:19:15.707 "num_base_bdevs": 4, 00:19:15.707 "num_base_bdevs_discovered": 1, 00:19:15.707 "num_base_bdevs_operational": 4, 00:19:15.707 "base_bdevs_list": [ 00:19:15.707 { 00:19:15.707 "name": "BaseBdev1", 00:19:15.707 "uuid": "6441375a-79c4-4591-bb24-9cf3b0567d3b", 00:19:15.707 "is_configured": true, 00:19:15.707 "data_offset": 2048, 00:19:15.707 "data_size": 63488 00:19:15.707 }, 00:19:15.707 { 00:19:15.707 "name": "BaseBdev2", 00:19:15.707 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:15.707 "is_configured": false, 00:19:15.707 "data_offset": 0, 00:19:15.707 "data_size": 0 00:19:15.707 }, 00:19:15.707 { 00:19:15.707 "name": "BaseBdev3", 00:19:15.707 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:15.707 "is_configured": false, 00:19:15.707 "data_offset": 0, 00:19:15.707 "data_size": 0 00:19:15.707 }, 00:19:15.707 { 00:19:15.707 "name": "BaseBdev4", 00:19:15.707 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:15.707 "is_configured": false, 00:19:15.707 "data_offset": 0, 00:19:15.707 "data_size": 0 00:19:15.707 } 00:19:15.707 ] 00:19:15.707 }' 00:19:15.707 17:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:15.707 17:31:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:16.277 17:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:16.538 [2024-07-15 17:31:27.655118] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:16.538 BaseBdev2 00:19:16.538 17:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:16.538 17:31:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:16.538 17:31:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:16.538 17:31:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:16.538 17:31:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:16.538 17:31:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:16.538 17:31:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:16.799 17:31:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:16.799 [ 00:19:16.799 { 00:19:16.799 "name": "BaseBdev2", 00:19:16.799 "aliases": [ 00:19:16.799 "b2945200-02ec-499a-ae88-d6ce7373fa1d" 00:19:16.799 ], 00:19:16.799 "product_name": "Malloc disk", 00:19:16.799 "block_size": 512, 00:19:16.799 "num_blocks": 65536, 00:19:16.799 "uuid": "b2945200-02ec-499a-ae88-d6ce7373fa1d", 00:19:16.799 "assigned_rate_limits": { 00:19:16.799 "rw_ios_per_sec": 0, 00:19:16.799 "rw_mbytes_per_sec": 0, 00:19:16.799 "r_mbytes_per_sec": 0, 00:19:16.799 "w_mbytes_per_sec": 0 00:19:16.799 }, 00:19:16.799 "claimed": true, 00:19:16.799 "claim_type": "exclusive_write", 00:19:16.799 "zoned": false, 00:19:16.799 "supported_io_types": { 
00:19:16.799 "read": true, 00:19:16.799 "write": true, 00:19:16.799 "unmap": true, 00:19:16.799 "flush": true, 00:19:16.799 "reset": true, 00:19:16.799 "nvme_admin": false, 00:19:16.799 "nvme_io": false, 00:19:16.799 "nvme_io_md": false, 00:19:16.799 "write_zeroes": true, 00:19:16.799 "zcopy": true, 00:19:16.799 "get_zone_info": false, 00:19:16.799 "zone_management": false, 00:19:16.799 "zone_append": false, 00:19:16.799 "compare": false, 00:19:16.799 "compare_and_write": false, 00:19:16.799 "abort": true, 00:19:16.799 "seek_hole": false, 00:19:16.799 "seek_data": false, 00:19:16.799 "copy": true, 00:19:16.799 "nvme_iov_md": false 00:19:16.799 }, 00:19:16.799 "memory_domains": [ 00:19:16.799 { 00:19:16.799 "dma_device_id": "system", 00:19:16.799 "dma_device_type": 1 00:19:16.799 }, 00:19:16.799 { 00:19:16.799 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:16.799 "dma_device_type": 2 00:19:16.799 } 00:19:16.799 ], 00:19:16.799 "driver_specific": {} 00:19:16.799 } 00:19:16.799 ] 00:19:16.799 17:31:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:16.799 17:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:16.799 17:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:16.799 17:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:16.799 17:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:16.799 17:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:16.799 17:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:16.799 17:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:16.799 17:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:16.799 17:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:16.799 17:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:16.799 17:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:16.799 17:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:16.799 17:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:16.799 17:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:17.058 17:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:17.058 "name": "Existed_Raid", 00:19:17.059 "uuid": "f8e81831-a46d-42fe-9524-48c1c822395d", 00:19:17.059 "strip_size_kb": 0, 00:19:17.059 "state": "configuring", 00:19:17.059 "raid_level": "raid1", 00:19:17.059 "superblock": true, 00:19:17.059 "num_base_bdevs": 4, 00:19:17.059 "num_base_bdevs_discovered": 2, 00:19:17.059 "num_base_bdevs_operational": 4, 00:19:17.059 "base_bdevs_list": [ 00:19:17.059 { 00:19:17.059 "name": "BaseBdev1", 00:19:17.059 "uuid": "6441375a-79c4-4591-bb24-9cf3b0567d3b", 00:19:17.059 "is_configured": true, 00:19:17.059 "data_offset": 2048, 00:19:17.059 "data_size": 
63488 00:19:17.059 }, 00:19:17.059 { 00:19:17.059 "name": "BaseBdev2", 00:19:17.059 "uuid": "b2945200-02ec-499a-ae88-d6ce7373fa1d", 00:19:17.059 "is_configured": true, 00:19:17.059 "data_offset": 2048, 00:19:17.059 "data_size": 63488 00:19:17.059 }, 00:19:17.059 { 00:19:17.059 "name": "BaseBdev3", 00:19:17.059 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:17.059 "is_configured": false, 00:19:17.059 "data_offset": 0, 00:19:17.059 "data_size": 0 00:19:17.059 }, 00:19:17.059 { 00:19:17.059 "name": "BaseBdev4", 00:19:17.059 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:17.059 "is_configured": false, 00:19:17.059 "data_offset": 0, 00:19:17.059 "data_size": 0 00:19:17.059 } 00:19:17.059 ] 00:19:17.059 }' 00:19:17.059 17:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:17.059 17:31:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:17.628 17:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:17.889 [2024-07-15 17:31:29.011373] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:17.889 BaseBdev3 00:19:17.889 17:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:17.889 17:31:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:17.889 17:31:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:17.889 17:31:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:17.889 17:31:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:17.889 17:31:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:17.889 17:31:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:18.148 17:31:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:18.148 [ 00:19:18.148 { 00:19:18.148 "name": "BaseBdev3", 00:19:18.148 "aliases": [ 00:19:18.148 "b07714c8-7933-4180-b9bd-d46847eec158" 00:19:18.148 ], 00:19:18.148 "product_name": "Malloc disk", 00:19:18.149 "block_size": 512, 00:19:18.149 "num_blocks": 65536, 00:19:18.149 "uuid": "b07714c8-7933-4180-b9bd-d46847eec158", 00:19:18.149 "assigned_rate_limits": { 00:19:18.149 "rw_ios_per_sec": 0, 00:19:18.149 "rw_mbytes_per_sec": 0, 00:19:18.149 "r_mbytes_per_sec": 0, 00:19:18.149 "w_mbytes_per_sec": 0 00:19:18.149 }, 00:19:18.149 "claimed": true, 00:19:18.149 "claim_type": "exclusive_write", 00:19:18.149 "zoned": false, 00:19:18.149 "supported_io_types": { 00:19:18.149 "read": true, 00:19:18.149 "write": true, 00:19:18.149 "unmap": true, 00:19:18.149 "flush": true, 00:19:18.149 "reset": true, 00:19:18.149 "nvme_admin": false, 00:19:18.149 "nvme_io": false, 00:19:18.149 "nvme_io_md": false, 00:19:18.149 "write_zeroes": true, 00:19:18.149 "zcopy": true, 00:19:18.149 "get_zone_info": false, 00:19:18.149 "zone_management": false, 00:19:18.149 "zone_append": false, 00:19:18.149 "compare": false, 00:19:18.149 
"compare_and_write": false, 00:19:18.149 "abort": true, 00:19:18.149 "seek_hole": false, 00:19:18.149 "seek_data": false, 00:19:18.149 "copy": true, 00:19:18.149 "nvme_iov_md": false 00:19:18.149 }, 00:19:18.149 "memory_domains": [ 00:19:18.149 { 00:19:18.149 "dma_device_id": "system", 00:19:18.149 "dma_device_type": 1 00:19:18.149 }, 00:19:18.149 { 00:19:18.149 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:18.149 "dma_device_type": 2 00:19:18.149 } 00:19:18.149 ], 00:19:18.149 "driver_specific": {} 00:19:18.149 } 00:19:18.149 ] 00:19:18.149 17:31:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:18.149 17:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:18.149 17:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:18.149 17:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:18.149 17:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:18.149 17:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:18.149 17:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:18.149 17:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:18.149 17:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:18.149 17:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:18.149 17:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:18.149 17:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:18.149 17:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:18.149 17:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:18.149 17:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:18.409 17:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:18.409 "name": "Existed_Raid", 00:19:18.409 "uuid": "f8e81831-a46d-42fe-9524-48c1c822395d", 00:19:18.409 "strip_size_kb": 0, 00:19:18.409 "state": "configuring", 00:19:18.409 "raid_level": "raid1", 00:19:18.409 "superblock": true, 00:19:18.409 "num_base_bdevs": 4, 00:19:18.409 "num_base_bdevs_discovered": 3, 00:19:18.409 "num_base_bdevs_operational": 4, 00:19:18.409 "base_bdevs_list": [ 00:19:18.409 { 00:19:18.409 "name": "BaseBdev1", 00:19:18.409 "uuid": "6441375a-79c4-4591-bb24-9cf3b0567d3b", 00:19:18.409 "is_configured": true, 00:19:18.409 "data_offset": 2048, 00:19:18.409 "data_size": 63488 00:19:18.409 }, 00:19:18.409 { 00:19:18.409 "name": "BaseBdev2", 00:19:18.409 "uuid": "b2945200-02ec-499a-ae88-d6ce7373fa1d", 00:19:18.409 "is_configured": true, 00:19:18.409 "data_offset": 2048, 00:19:18.410 "data_size": 63488 00:19:18.410 }, 00:19:18.410 { 00:19:18.410 "name": "BaseBdev3", 00:19:18.410 "uuid": "b07714c8-7933-4180-b9bd-d46847eec158", 00:19:18.410 "is_configured": true, 00:19:18.410 "data_offset": 2048, 00:19:18.410 "data_size": 
63488 00:19:18.410 }, 00:19:18.410 { 00:19:18.410 "name": "BaseBdev4", 00:19:18.410 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:18.410 "is_configured": false, 00:19:18.410 "data_offset": 0, 00:19:18.410 "data_size": 0 00:19:18.410 } 00:19:18.410 ] 00:19:18.410 }' 00:19:18.410 17:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:18.410 17:31:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:18.979 17:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:19.240 [2024-07-15 17:31:30.311430] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:19.240 [2024-07-15 17:31:30.311565] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x10e7fc0 00:19:19.240 [2024-07-15 17:31:30.311573] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:19.240 [2024-07-15 17:31:30.311708] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10e7c00 00:19:19.240 [2024-07-15 17:31:30.311820] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10e7fc0 00:19:19.240 [2024-07-15 17:31:30.311826] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x10e7fc0 00:19:19.240 [2024-07-15 17:31:30.311895] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:19.240 BaseBdev4 00:19:19.240 17:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:19.240 17:31:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:19.240 17:31:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:19.240 17:31:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:19.240 17:31:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:19.240 17:31:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:19.240 17:31:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:19.240 17:31:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:19.500 [ 00:19:19.500 { 00:19:19.500 "name": "BaseBdev4", 00:19:19.500 "aliases": [ 00:19:19.500 "f069a971-2fd9-4e64-9176-d9e8434fde68" 00:19:19.500 ], 00:19:19.500 "product_name": "Malloc disk", 00:19:19.500 "block_size": 512, 00:19:19.500 "num_blocks": 65536, 00:19:19.500 "uuid": "f069a971-2fd9-4e64-9176-d9e8434fde68", 00:19:19.500 "assigned_rate_limits": { 00:19:19.500 "rw_ios_per_sec": 0, 00:19:19.500 "rw_mbytes_per_sec": 0, 00:19:19.500 "r_mbytes_per_sec": 0, 00:19:19.500 "w_mbytes_per_sec": 0 00:19:19.500 }, 00:19:19.500 "claimed": true, 00:19:19.500 "claim_type": "exclusive_write", 00:19:19.500 "zoned": false, 00:19:19.500 "supported_io_types": { 00:19:19.500 "read": true, 00:19:19.500 "write": true, 00:19:19.500 "unmap": true, 00:19:19.500 "flush": true, 00:19:19.500 "reset": true, 00:19:19.500 
"nvme_admin": false, 00:19:19.500 "nvme_io": false, 00:19:19.500 "nvme_io_md": false, 00:19:19.500 "write_zeroes": true, 00:19:19.500 "zcopy": true, 00:19:19.500 "get_zone_info": false, 00:19:19.500 "zone_management": false, 00:19:19.500 "zone_append": false, 00:19:19.500 "compare": false, 00:19:19.500 "compare_and_write": false, 00:19:19.500 "abort": true, 00:19:19.500 "seek_hole": false, 00:19:19.500 "seek_data": false, 00:19:19.500 "copy": true, 00:19:19.500 "nvme_iov_md": false 00:19:19.500 }, 00:19:19.500 "memory_domains": [ 00:19:19.500 { 00:19:19.500 "dma_device_id": "system", 00:19:19.500 "dma_device_type": 1 00:19:19.500 }, 00:19:19.500 { 00:19:19.500 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:19.500 "dma_device_type": 2 00:19:19.500 } 00:19:19.500 ], 00:19:19.500 "driver_specific": {} 00:19:19.500 } 00:19:19.500 ] 00:19:19.500 17:31:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:19.500 17:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:19.500 17:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:19.500 17:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:19:19.500 17:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:19.500 17:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:19.500 17:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:19.500 17:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:19.500 17:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:19.500 17:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:19.500 17:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:19.500 17:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:19.500 17:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:19.500 17:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:19.500 17:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:19.761 17:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:19.761 "name": "Existed_Raid", 00:19:19.761 "uuid": "f8e81831-a46d-42fe-9524-48c1c822395d", 00:19:19.761 "strip_size_kb": 0, 00:19:19.761 "state": "online", 00:19:19.761 "raid_level": "raid1", 00:19:19.761 "superblock": true, 00:19:19.761 "num_base_bdevs": 4, 00:19:19.761 "num_base_bdevs_discovered": 4, 00:19:19.761 "num_base_bdevs_operational": 4, 00:19:19.761 "base_bdevs_list": [ 00:19:19.761 { 00:19:19.761 "name": "BaseBdev1", 00:19:19.761 "uuid": "6441375a-79c4-4591-bb24-9cf3b0567d3b", 00:19:19.761 "is_configured": true, 00:19:19.761 "data_offset": 2048, 00:19:19.761 "data_size": 63488 00:19:19.761 }, 00:19:19.761 { 00:19:19.761 "name": "BaseBdev2", 00:19:19.761 "uuid": "b2945200-02ec-499a-ae88-d6ce7373fa1d", 00:19:19.761 "is_configured": true, 
00:19:19.761 "data_offset": 2048, 00:19:19.761 "data_size": 63488 00:19:19.761 }, 00:19:19.761 { 00:19:19.761 "name": "BaseBdev3", 00:19:19.761 "uuid": "b07714c8-7933-4180-b9bd-d46847eec158", 00:19:19.761 "is_configured": true, 00:19:19.761 "data_offset": 2048, 00:19:19.761 "data_size": 63488 00:19:19.761 }, 00:19:19.761 { 00:19:19.761 "name": "BaseBdev4", 00:19:19.761 "uuid": "f069a971-2fd9-4e64-9176-d9e8434fde68", 00:19:19.761 "is_configured": true, 00:19:19.761 "data_offset": 2048, 00:19:19.761 "data_size": 63488 00:19:19.761 } 00:19:19.761 ] 00:19:19.761 }' 00:19:19.761 17:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:19.761 17:31:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:20.332 17:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:20.332 17:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:20.332 17:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:20.332 17:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:20.332 17:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:20.332 17:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:20.332 17:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:20.332 17:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:20.332 [2024-07-15 17:31:31.606977] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:20.332 17:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:20.332 "name": "Existed_Raid", 00:19:20.332 "aliases": [ 00:19:20.332 "f8e81831-a46d-42fe-9524-48c1c822395d" 00:19:20.332 ], 00:19:20.332 "product_name": "Raid Volume", 00:19:20.332 "block_size": 512, 00:19:20.332 "num_blocks": 63488, 00:19:20.332 "uuid": "f8e81831-a46d-42fe-9524-48c1c822395d", 00:19:20.332 "assigned_rate_limits": { 00:19:20.332 "rw_ios_per_sec": 0, 00:19:20.332 "rw_mbytes_per_sec": 0, 00:19:20.332 "r_mbytes_per_sec": 0, 00:19:20.332 "w_mbytes_per_sec": 0 00:19:20.332 }, 00:19:20.332 "claimed": false, 00:19:20.332 "zoned": false, 00:19:20.332 "supported_io_types": { 00:19:20.332 "read": true, 00:19:20.332 "write": true, 00:19:20.332 "unmap": false, 00:19:20.332 "flush": false, 00:19:20.332 "reset": true, 00:19:20.332 "nvme_admin": false, 00:19:20.332 "nvme_io": false, 00:19:20.332 "nvme_io_md": false, 00:19:20.332 "write_zeroes": true, 00:19:20.332 "zcopy": false, 00:19:20.332 "get_zone_info": false, 00:19:20.332 "zone_management": false, 00:19:20.332 "zone_append": false, 00:19:20.332 "compare": false, 00:19:20.332 "compare_and_write": false, 00:19:20.332 "abort": false, 00:19:20.332 "seek_hole": false, 00:19:20.332 "seek_data": false, 00:19:20.332 "copy": false, 00:19:20.332 "nvme_iov_md": false 00:19:20.332 }, 00:19:20.332 "memory_domains": [ 00:19:20.332 { 00:19:20.332 "dma_device_id": "system", 00:19:20.332 "dma_device_type": 1 00:19:20.332 }, 00:19:20.332 { 00:19:20.332 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:20.332 "dma_device_type": 2 00:19:20.332 }, 00:19:20.332 { 00:19:20.332 
"dma_device_id": "system", 00:19:20.332 "dma_device_type": 1 00:19:20.332 }, 00:19:20.332 { 00:19:20.332 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:20.332 "dma_device_type": 2 00:19:20.332 }, 00:19:20.332 { 00:19:20.332 "dma_device_id": "system", 00:19:20.332 "dma_device_type": 1 00:19:20.332 }, 00:19:20.332 { 00:19:20.332 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:20.332 "dma_device_type": 2 00:19:20.333 }, 00:19:20.333 { 00:19:20.333 "dma_device_id": "system", 00:19:20.333 "dma_device_type": 1 00:19:20.333 }, 00:19:20.333 { 00:19:20.333 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:20.333 "dma_device_type": 2 00:19:20.333 } 00:19:20.333 ], 00:19:20.333 "driver_specific": { 00:19:20.333 "raid": { 00:19:20.333 "uuid": "f8e81831-a46d-42fe-9524-48c1c822395d", 00:19:20.333 "strip_size_kb": 0, 00:19:20.333 "state": "online", 00:19:20.333 "raid_level": "raid1", 00:19:20.333 "superblock": true, 00:19:20.333 "num_base_bdevs": 4, 00:19:20.333 "num_base_bdevs_discovered": 4, 00:19:20.333 "num_base_bdevs_operational": 4, 00:19:20.333 "base_bdevs_list": [ 00:19:20.333 { 00:19:20.333 "name": "BaseBdev1", 00:19:20.333 "uuid": "6441375a-79c4-4591-bb24-9cf3b0567d3b", 00:19:20.333 "is_configured": true, 00:19:20.333 "data_offset": 2048, 00:19:20.333 "data_size": 63488 00:19:20.333 }, 00:19:20.333 { 00:19:20.333 "name": "BaseBdev2", 00:19:20.333 "uuid": "b2945200-02ec-499a-ae88-d6ce7373fa1d", 00:19:20.333 "is_configured": true, 00:19:20.333 "data_offset": 2048, 00:19:20.333 "data_size": 63488 00:19:20.333 }, 00:19:20.333 { 00:19:20.333 "name": "BaseBdev3", 00:19:20.333 "uuid": "b07714c8-7933-4180-b9bd-d46847eec158", 00:19:20.333 "is_configured": true, 00:19:20.333 "data_offset": 2048, 00:19:20.333 "data_size": 63488 00:19:20.333 }, 00:19:20.333 { 00:19:20.333 "name": "BaseBdev4", 00:19:20.333 "uuid": "f069a971-2fd9-4e64-9176-d9e8434fde68", 00:19:20.333 "is_configured": true, 00:19:20.333 "data_offset": 2048, 00:19:20.333 "data_size": 63488 00:19:20.333 } 00:19:20.333 ] 00:19:20.333 } 00:19:20.333 } 00:19:20.333 }' 00:19:20.333 17:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:20.593 17:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:20.593 BaseBdev2 00:19:20.593 BaseBdev3 00:19:20.594 BaseBdev4' 00:19:20.594 17:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:20.594 17:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:20.594 17:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:20.854 17:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:20.854 "name": "BaseBdev1", 00:19:20.854 "aliases": [ 00:19:20.854 "6441375a-79c4-4591-bb24-9cf3b0567d3b" 00:19:20.854 ], 00:19:20.854 "product_name": "Malloc disk", 00:19:20.854 "block_size": 512, 00:19:20.854 "num_blocks": 65536, 00:19:20.854 "uuid": "6441375a-79c4-4591-bb24-9cf3b0567d3b", 00:19:20.854 "assigned_rate_limits": { 00:19:20.854 "rw_ios_per_sec": 0, 00:19:20.854 "rw_mbytes_per_sec": 0, 00:19:20.854 "r_mbytes_per_sec": 0, 00:19:20.854 "w_mbytes_per_sec": 0 00:19:20.854 }, 00:19:20.854 "claimed": true, 00:19:20.854 "claim_type": "exclusive_write", 00:19:20.854 "zoned": false, 
00:19:20.854 "supported_io_types": { 00:19:20.854 "read": true, 00:19:20.854 "write": true, 00:19:20.854 "unmap": true, 00:19:20.854 "flush": true, 00:19:20.854 "reset": true, 00:19:20.854 "nvme_admin": false, 00:19:20.854 "nvme_io": false, 00:19:20.854 "nvme_io_md": false, 00:19:20.854 "write_zeroes": true, 00:19:20.854 "zcopy": true, 00:19:20.854 "get_zone_info": false, 00:19:20.854 "zone_management": false, 00:19:20.854 "zone_append": false, 00:19:20.854 "compare": false, 00:19:20.854 "compare_and_write": false, 00:19:20.854 "abort": true, 00:19:20.854 "seek_hole": false, 00:19:20.854 "seek_data": false, 00:19:20.854 "copy": true, 00:19:20.854 "nvme_iov_md": false 00:19:20.854 }, 00:19:20.854 "memory_domains": [ 00:19:20.854 { 00:19:20.854 "dma_device_id": "system", 00:19:20.854 "dma_device_type": 1 00:19:20.854 }, 00:19:20.854 { 00:19:20.854 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:20.854 "dma_device_type": 2 00:19:20.854 } 00:19:20.854 ], 00:19:20.854 "driver_specific": {} 00:19:20.854 }' 00:19:20.854 17:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:20.854 17:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:20.854 17:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:20.854 17:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:20.854 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:20.854 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:20.854 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:20.854 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:21.115 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:21.115 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:21.115 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:21.115 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:21.115 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:21.115 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:21.115 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:21.375 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:21.375 "name": "BaseBdev2", 00:19:21.375 "aliases": [ 00:19:21.375 "b2945200-02ec-499a-ae88-d6ce7373fa1d" 00:19:21.375 ], 00:19:21.375 "product_name": "Malloc disk", 00:19:21.375 "block_size": 512, 00:19:21.375 "num_blocks": 65536, 00:19:21.375 "uuid": "b2945200-02ec-499a-ae88-d6ce7373fa1d", 00:19:21.375 "assigned_rate_limits": { 00:19:21.375 "rw_ios_per_sec": 0, 00:19:21.375 "rw_mbytes_per_sec": 0, 00:19:21.375 "r_mbytes_per_sec": 0, 00:19:21.375 "w_mbytes_per_sec": 0 00:19:21.375 }, 00:19:21.375 "claimed": true, 00:19:21.375 "claim_type": "exclusive_write", 00:19:21.375 "zoned": false, 00:19:21.375 "supported_io_types": { 00:19:21.375 "read": true, 00:19:21.375 "write": true, 00:19:21.375 "unmap": true, 00:19:21.375 
"flush": true, 00:19:21.375 "reset": true, 00:19:21.375 "nvme_admin": false, 00:19:21.375 "nvme_io": false, 00:19:21.375 "nvme_io_md": false, 00:19:21.375 "write_zeroes": true, 00:19:21.375 "zcopy": true, 00:19:21.375 "get_zone_info": false, 00:19:21.375 "zone_management": false, 00:19:21.375 "zone_append": false, 00:19:21.375 "compare": false, 00:19:21.375 "compare_and_write": false, 00:19:21.375 "abort": true, 00:19:21.375 "seek_hole": false, 00:19:21.375 "seek_data": false, 00:19:21.375 "copy": true, 00:19:21.375 "nvme_iov_md": false 00:19:21.375 }, 00:19:21.375 "memory_domains": [ 00:19:21.375 { 00:19:21.375 "dma_device_id": "system", 00:19:21.375 "dma_device_type": 1 00:19:21.375 }, 00:19:21.375 { 00:19:21.375 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:21.375 "dma_device_type": 2 00:19:21.375 } 00:19:21.375 ], 00:19:21.375 "driver_specific": {} 00:19:21.375 }' 00:19:21.375 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:21.375 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:21.375 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:21.375 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:21.375 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:21.375 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:21.375 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:21.375 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:21.634 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:21.634 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:21.634 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:21.634 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:21.634 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:21.634 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:21.634 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:21.905 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:21.905 "name": "BaseBdev3", 00:19:21.905 "aliases": [ 00:19:21.905 "b07714c8-7933-4180-b9bd-d46847eec158" 00:19:21.905 ], 00:19:21.905 "product_name": "Malloc disk", 00:19:21.905 "block_size": 512, 00:19:21.905 "num_blocks": 65536, 00:19:21.905 "uuid": "b07714c8-7933-4180-b9bd-d46847eec158", 00:19:21.905 "assigned_rate_limits": { 00:19:21.905 "rw_ios_per_sec": 0, 00:19:21.905 "rw_mbytes_per_sec": 0, 00:19:21.905 "r_mbytes_per_sec": 0, 00:19:21.905 "w_mbytes_per_sec": 0 00:19:21.905 }, 00:19:21.905 "claimed": true, 00:19:21.905 "claim_type": "exclusive_write", 00:19:21.905 "zoned": false, 00:19:21.905 "supported_io_types": { 00:19:21.905 "read": true, 00:19:21.905 "write": true, 00:19:21.905 "unmap": true, 00:19:21.905 "flush": true, 00:19:21.905 "reset": true, 00:19:21.905 "nvme_admin": false, 00:19:21.905 "nvme_io": false, 00:19:21.905 
"nvme_io_md": false, 00:19:21.905 "write_zeroes": true, 00:19:21.905 "zcopy": true, 00:19:21.905 "get_zone_info": false, 00:19:21.905 "zone_management": false, 00:19:21.905 "zone_append": false, 00:19:21.905 "compare": false, 00:19:21.905 "compare_and_write": false, 00:19:21.905 "abort": true, 00:19:21.905 "seek_hole": false, 00:19:21.905 "seek_data": false, 00:19:21.905 "copy": true, 00:19:21.905 "nvme_iov_md": false 00:19:21.905 }, 00:19:21.905 "memory_domains": [ 00:19:21.905 { 00:19:21.905 "dma_device_id": "system", 00:19:21.905 "dma_device_type": 1 00:19:21.905 }, 00:19:21.905 { 00:19:21.905 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:21.905 "dma_device_type": 2 00:19:21.905 } 00:19:21.905 ], 00:19:21.905 "driver_specific": {} 00:19:21.905 }' 00:19:21.905 17:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:21.905 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:21.905 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:21.905 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:21.905 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:21.905 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:21.905 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:21.905 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:22.166 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:22.166 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:22.166 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:22.166 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:22.166 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:22.166 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:22.166 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:22.427 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:22.427 "name": "BaseBdev4", 00:19:22.427 "aliases": [ 00:19:22.427 "f069a971-2fd9-4e64-9176-d9e8434fde68" 00:19:22.427 ], 00:19:22.427 "product_name": "Malloc disk", 00:19:22.427 "block_size": 512, 00:19:22.427 "num_blocks": 65536, 00:19:22.427 "uuid": "f069a971-2fd9-4e64-9176-d9e8434fde68", 00:19:22.427 "assigned_rate_limits": { 00:19:22.427 "rw_ios_per_sec": 0, 00:19:22.427 "rw_mbytes_per_sec": 0, 00:19:22.427 "r_mbytes_per_sec": 0, 00:19:22.427 "w_mbytes_per_sec": 0 00:19:22.427 }, 00:19:22.427 "claimed": true, 00:19:22.427 "claim_type": "exclusive_write", 00:19:22.427 "zoned": false, 00:19:22.427 "supported_io_types": { 00:19:22.427 "read": true, 00:19:22.427 "write": true, 00:19:22.427 "unmap": true, 00:19:22.427 "flush": true, 00:19:22.427 "reset": true, 00:19:22.427 "nvme_admin": false, 00:19:22.427 "nvme_io": false, 00:19:22.427 "nvme_io_md": false, 00:19:22.427 "write_zeroes": true, 00:19:22.427 "zcopy": true, 00:19:22.427 "get_zone_info": false, 00:19:22.427 
"zone_management": false, 00:19:22.427 "zone_append": false, 00:19:22.427 "compare": false, 00:19:22.427 "compare_and_write": false, 00:19:22.427 "abort": true, 00:19:22.427 "seek_hole": false, 00:19:22.427 "seek_data": false, 00:19:22.427 "copy": true, 00:19:22.427 "nvme_iov_md": false 00:19:22.427 }, 00:19:22.427 "memory_domains": [ 00:19:22.427 { 00:19:22.427 "dma_device_id": "system", 00:19:22.427 "dma_device_type": 1 00:19:22.427 }, 00:19:22.427 { 00:19:22.427 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:22.427 "dma_device_type": 2 00:19:22.427 } 00:19:22.427 ], 00:19:22.427 "driver_specific": {} 00:19:22.427 }' 00:19:22.427 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:22.427 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:22.427 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:22.427 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:22.427 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:22.427 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:22.427 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:22.687 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:22.687 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:22.687 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:22.687 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:22.687 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:22.687 17:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:22.947 [2024-07-15 17:31:34.064998] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:22.947 17:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:22.947 17:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:19:22.947 17:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:22.947 17:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:19:22.947 17:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:19:22.947 17:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:19:22.947 17:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:22.947 17:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:22.947 17:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:22.947 17:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:22.947 17:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:22.947 17:31:34 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:22.947 17:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:22.947 17:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:22.947 17:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:22.947 17:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:22.947 17:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:23.208 17:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:23.208 "name": "Existed_Raid", 00:19:23.208 "uuid": "f8e81831-a46d-42fe-9524-48c1c822395d", 00:19:23.208 "strip_size_kb": 0, 00:19:23.208 "state": "online", 00:19:23.208 "raid_level": "raid1", 00:19:23.208 "superblock": true, 00:19:23.208 "num_base_bdevs": 4, 00:19:23.208 "num_base_bdevs_discovered": 3, 00:19:23.208 "num_base_bdevs_operational": 3, 00:19:23.208 "base_bdevs_list": [ 00:19:23.208 { 00:19:23.208 "name": null, 00:19:23.208 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:23.208 "is_configured": false, 00:19:23.208 "data_offset": 2048, 00:19:23.208 "data_size": 63488 00:19:23.208 }, 00:19:23.208 { 00:19:23.208 "name": "BaseBdev2", 00:19:23.208 "uuid": "b2945200-02ec-499a-ae88-d6ce7373fa1d", 00:19:23.208 "is_configured": true, 00:19:23.208 "data_offset": 2048, 00:19:23.208 "data_size": 63488 00:19:23.208 }, 00:19:23.208 { 00:19:23.208 "name": "BaseBdev3", 00:19:23.208 "uuid": "b07714c8-7933-4180-b9bd-d46847eec158", 00:19:23.208 "is_configured": true, 00:19:23.208 "data_offset": 2048, 00:19:23.208 "data_size": 63488 00:19:23.208 }, 00:19:23.208 { 00:19:23.208 "name": "BaseBdev4", 00:19:23.208 "uuid": "f069a971-2fd9-4e64-9176-d9e8434fde68", 00:19:23.208 "is_configured": true, 00:19:23.209 "data_offset": 2048, 00:19:23.209 "data_size": 63488 00:19:23.209 } 00:19:23.209 ] 00:19:23.209 }' 00:19:23.209 17:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:23.209 17:31:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:23.780 17:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:23.780 17:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:23.780 17:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:23.780 17:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:23.780 17:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:23.780 17:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:23.780 17:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:24.040 [2024-07-15 17:31:35.187848] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:24.040 17:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 
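
The loop traced at bdev_raid.sh@285 above deletes the backing malloc bdevs one at a time; because raid1 keeps redundancy, the test expects Existed_Raid to stay online with one fewer discovered member after each removal. A minimal stand-alone sketch of that check, using only the rpc.py calls, socket and names that appear in this trace (rpc and sock below are just shorthand for the rpc.py path and the /var/tmp/spdk-raid.sock socket used throughout this log; the real verify_raid_bdev_state helper in bdev_raid.sh compares more fields than shown here):

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # Remove one backing bdev; the raid module traces _raid_bdev_remove_base_bdev for it.
  $rpc -s $sock bdev_malloc_delete BaseBdev2
  # Read the raid bdev back and confirm it survived the loss of a single member.
  info=$($rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
  echo "$info" | jq -r '.state'                      # raid1 has redundancy: expect "online"
  echo "$info" | jq -r '.num_base_bdevs_discovered'  # expect one less than before the delete
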
00:19:24.040 17:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:24.040 17:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:24.040 17:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:24.300 17:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:24.300 17:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:24.300 17:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:24.300 [2024-07-15 17:31:35.574542] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:24.300 17:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:24.300 17:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:24.300 17:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:24.300 17:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:24.561 17:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:24.561 17:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:24.561 17:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:19:24.821 [2024-07-15 17:31:35.949347] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:19:24.821 [2024-07-15 17:31:35.949407] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:24.821 [2024-07-15 17:31:35.955368] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:24.821 [2024-07-15 17:31:35.955399] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:24.821 [2024-07-15 17:31:35.955405] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10e7fc0 name Existed_Raid, state offline 00:19:24.821 17:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:24.821 17:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:24.821 17:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:24.821 17:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:25.082 17:31:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:25.082 17:31:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:25.082 17:31:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:19:25.082 17:31:36 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:19:25.082 17:31:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:25.082 17:31:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:25.082 BaseBdev2 00:19:25.082 17:31:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:19:25.082 17:31:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:25.082 17:31:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:25.082 17:31:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:25.082 17:31:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:25.082 17:31:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:25.082 17:31:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:25.342 17:31:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:25.603 [ 00:19:25.603 { 00:19:25.603 "name": "BaseBdev2", 00:19:25.603 "aliases": [ 00:19:25.603 "53050721-cc41-4fdd-917a-6215f8041ffd" 00:19:25.603 ], 00:19:25.603 "product_name": "Malloc disk", 00:19:25.603 "block_size": 512, 00:19:25.603 "num_blocks": 65536, 00:19:25.603 "uuid": "53050721-cc41-4fdd-917a-6215f8041ffd", 00:19:25.603 "assigned_rate_limits": { 00:19:25.603 "rw_ios_per_sec": 0, 00:19:25.603 "rw_mbytes_per_sec": 0, 00:19:25.603 "r_mbytes_per_sec": 0, 00:19:25.603 "w_mbytes_per_sec": 0 00:19:25.603 }, 00:19:25.603 "claimed": false, 00:19:25.603 "zoned": false, 00:19:25.603 "supported_io_types": { 00:19:25.603 "read": true, 00:19:25.603 "write": true, 00:19:25.603 "unmap": true, 00:19:25.603 "flush": true, 00:19:25.603 "reset": true, 00:19:25.603 "nvme_admin": false, 00:19:25.603 "nvme_io": false, 00:19:25.603 "nvme_io_md": false, 00:19:25.603 "write_zeroes": true, 00:19:25.603 "zcopy": true, 00:19:25.603 "get_zone_info": false, 00:19:25.603 "zone_management": false, 00:19:25.603 "zone_append": false, 00:19:25.603 "compare": false, 00:19:25.603 "compare_and_write": false, 00:19:25.603 "abort": true, 00:19:25.603 "seek_hole": false, 00:19:25.603 "seek_data": false, 00:19:25.603 "copy": true, 00:19:25.603 "nvme_iov_md": false 00:19:25.603 }, 00:19:25.603 "memory_domains": [ 00:19:25.603 { 00:19:25.603 "dma_device_id": "system", 00:19:25.603 "dma_device_type": 1 00:19:25.603 }, 00:19:25.603 { 00:19:25.603 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:25.603 "dma_device_type": 2 00:19:25.603 } 00:19:25.603 ], 00:19:25.603 "driver_specific": {} 00:19:25.603 } 00:19:25.603 ] 00:19:25.603 17:31:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:25.603 17:31:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:25.603 17:31:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:25.603 17:31:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:25.603 BaseBdev3 00:19:25.864 17:31:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:19:25.864 17:31:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:25.864 17:31:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:25.864 17:31:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:25.864 17:31:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:25.864 17:31:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:25.864 17:31:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:25.864 17:31:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:26.124 [ 00:19:26.124 { 00:19:26.124 "name": "BaseBdev3", 00:19:26.124 "aliases": [ 00:19:26.124 "0f15ee6a-94c1-4479-8798-0929f98bedf2" 00:19:26.124 ], 00:19:26.124 "product_name": "Malloc disk", 00:19:26.124 "block_size": 512, 00:19:26.124 "num_blocks": 65536, 00:19:26.124 "uuid": "0f15ee6a-94c1-4479-8798-0929f98bedf2", 00:19:26.124 "assigned_rate_limits": { 00:19:26.124 "rw_ios_per_sec": 0, 00:19:26.124 "rw_mbytes_per_sec": 0, 00:19:26.124 "r_mbytes_per_sec": 0, 00:19:26.124 "w_mbytes_per_sec": 0 00:19:26.124 }, 00:19:26.124 "claimed": false, 00:19:26.124 "zoned": false, 00:19:26.124 "supported_io_types": { 00:19:26.124 "read": true, 00:19:26.124 "write": true, 00:19:26.124 "unmap": true, 00:19:26.124 "flush": true, 00:19:26.124 "reset": true, 00:19:26.124 "nvme_admin": false, 00:19:26.124 "nvme_io": false, 00:19:26.124 "nvme_io_md": false, 00:19:26.124 "write_zeroes": true, 00:19:26.124 "zcopy": true, 00:19:26.124 "get_zone_info": false, 00:19:26.124 "zone_management": false, 00:19:26.124 "zone_append": false, 00:19:26.124 "compare": false, 00:19:26.124 "compare_and_write": false, 00:19:26.124 "abort": true, 00:19:26.124 "seek_hole": false, 00:19:26.124 "seek_data": false, 00:19:26.124 "copy": true, 00:19:26.124 "nvme_iov_md": false 00:19:26.124 }, 00:19:26.124 "memory_domains": [ 00:19:26.124 { 00:19:26.124 "dma_device_id": "system", 00:19:26.124 "dma_device_type": 1 00:19:26.124 }, 00:19:26.124 { 00:19:26.124 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:26.124 "dma_device_type": 2 00:19:26.124 } 00:19:26.124 ], 00:19:26.124 "driver_specific": {} 00:19:26.124 } 00:19:26.124 ] 00:19:26.124 17:31:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:26.124 17:31:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:26.124 17:31:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:26.124 17:31:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:26.384 BaseBdev4 00:19:26.384 17:31:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev 
BaseBdev4 00:19:26.384 17:31:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:26.384 17:31:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:26.384 17:31:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:26.384 17:31:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:26.384 17:31:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:26.384 17:31:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:26.643 17:31:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:26.643 [ 00:19:26.643 { 00:19:26.643 "name": "BaseBdev4", 00:19:26.643 "aliases": [ 00:19:26.643 "97fda5d0-da90-492a-a51a-5ab6c005825e" 00:19:26.643 ], 00:19:26.643 "product_name": "Malloc disk", 00:19:26.643 "block_size": 512, 00:19:26.643 "num_blocks": 65536, 00:19:26.643 "uuid": "97fda5d0-da90-492a-a51a-5ab6c005825e", 00:19:26.643 "assigned_rate_limits": { 00:19:26.643 "rw_ios_per_sec": 0, 00:19:26.643 "rw_mbytes_per_sec": 0, 00:19:26.643 "r_mbytes_per_sec": 0, 00:19:26.643 "w_mbytes_per_sec": 0 00:19:26.643 }, 00:19:26.643 "claimed": false, 00:19:26.643 "zoned": false, 00:19:26.643 "supported_io_types": { 00:19:26.643 "read": true, 00:19:26.643 "write": true, 00:19:26.643 "unmap": true, 00:19:26.643 "flush": true, 00:19:26.643 "reset": true, 00:19:26.643 "nvme_admin": false, 00:19:26.643 "nvme_io": false, 00:19:26.643 "nvme_io_md": false, 00:19:26.643 "write_zeroes": true, 00:19:26.643 "zcopy": true, 00:19:26.643 "get_zone_info": false, 00:19:26.643 "zone_management": false, 00:19:26.643 "zone_append": false, 00:19:26.643 "compare": false, 00:19:26.643 "compare_and_write": false, 00:19:26.643 "abort": true, 00:19:26.643 "seek_hole": false, 00:19:26.643 "seek_data": false, 00:19:26.643 "copy": true, 00:19:26.643 "nvme_iov_md": false 00:19:26.643 }, 00:19:26.643 "memory_domains": [ 00:19:26.643 { 00:19:26.643 "dma_device_id": "system", 00:19:26.643 "dma_device_type": 1 00:19:26.643 }, 00:19:26.643 { 00:19:26.643 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:26.643 "dma_device_type": 2 00:19:26.643 } 00:19:26.643 ], 00:19:26.643 "driver_specific": {} 00:19:26.643 } 00:19:26.643 ] 00:19:26.643 17:31:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:26.643 17:31:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:26.643 17:31:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:26.643 17:31:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:26.903 [2024-07-15 17:31:38.064306] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:26.903 [2024-07-15 17:31:38.064338] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:26.903 [2024-07-15 17:31:38.064353] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:26.903 [2024-07-15 17:31:38.065408] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:26.903 [2024-07-15 17:31:38.065444] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:26.903 17:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:26.903 17:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:26.903 17:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:26.903 17:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:26.903 17:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:26.903 17:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:26.903 17:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:26.903 17:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:26.903 17:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:26.903 17:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:26.903 17:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.903 17:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:27.163 17:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:27.163 "name": "Existed_Raid", 00:19:27.163 "uuid": "bd65ba4a-b3d5-434b-aab6-c616e166eb6d", 00:19:27.163 "strip_size_kb": 0, 00:19:27.163 "state": "configuring", 00:19:27.163 "raid_level": "raid1", 00:19:27.163 "superblock": true, 00:19:27.163 "num_base_bdevs": 4, 00:19:27.163 "num_base_bdevs_discovered": 3, 00:19:27.163 "num_base_bdevs_operational": 4, 00:19:27.163 "base_bdevs_list": [ 00:19:27.163 { 00:19:27.163 "name": "BaseBdev1", 00:19:27.163 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:27.163 "is_configured": false, 00:19:27.163 "data_offset": 0, 00:19:27.163 "data_size": 0 00:19:27.163 }, 00:19:27.163 { 00:19:27.163 "name": "BaseBdev2", 00:19:27.163 "uuid": "53050721-cc41-4fdd-917a-6215f8041ffd", 00:19:27.163 "is_configured": true, 00:19:27.163 "data_offset": 2048, 00:19:27.163 "data_size": 63488 00:19:27.163 }, 00:19:27.163 { 00:19:27.163 "name": "BaseBdev3", 00:19:27.163 "uuid": "0f15ee6a-94c1-4479-8798-0929f98bedf2", 00:19:27.163 "is_configured": true, 00:19:27.163 "data_offset": 2048, 00:19:27.163 "data_size": 63488 00:19:27.163 }, 00:19:27.163 { 00:19:27.163 "name": "BaseBdev4", 00:19:27.163 "uuid": "97fda5d0-da90-492a-a51a-5ab6c005825e", 00:19:27.163 "is_configured": true, 00:19:27.163 "data_offset": 2048, 00:19:27.163 "data_size": 63488 00:19:27.163 } 00:19:27.163 ] 00:19:27.163 }' 00:19:27.163 17:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:27.163 17:31:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:27.732 17:31:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:27.732 [2024-07-15 17:31:38.982600] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:27.732 17:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:27.732 17:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:27.732 17:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:27.732 17:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:27.732 17:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:27.732 17:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:27.732 17:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:27.732 17:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:27.732 17:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:27.732 17:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:27.732 17:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:27.732 17:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.992 17:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:27.992 "name": "Existed_Raid", 00:19:27.992 "uuid": "bd65ba4a-b3d5-434b-aab6-c616e166eb6d", 00:19:27.992 "strip_size_kb": 0, 00:19:27.992 "state": "configuring", 00:19:27.992 "raid_level": "raid1", 00:19:27.992 "superblock": true, 00:19:27.992 "num_base_bdevs": 4, 00:19:27.992 "num_base_bdevs_discovered": 2, 00:19:27.992 "num_base_bdevs_operational": 4, 00:19:27.992 "base_bdevs_list": [ 00:19:27.992 { 00:19:27.992 "name": "BaseBdev1", 00:19:27.992 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:27.992 "is_configured": false, 00:19:27.992 "data_offset": 0, 00:19:27.992 "data_size": 0 00:19:27.992 }, 00:19:27.992 { 00:19:27.992 "name": null, 00:19:27.992 "uuid": "53050721-cc41-4fdd-917a-6215f8041ffd", 00:19:27.992 "is_configured": false, 00:19:27.992 "data_offset": 2048, 00:19:27.992 "data_size": 63488 00:19:27.992 }, 00:19:27.992 { 00:19:27.992 "name": "BaseBdev3", 00:19:27.992 "uuid": "0f15ee6a-94c1-4479-8798-0929f98bedf2", 00:19:27.992 "is_configured": true, 00:19:27.992 "data_offset": 2048, 00:19:27.992 "data_size": 63488 00:19:27.992 }, 00:19:27.992 { 00:19:27.992 "name": "BaseBdev4", 00:19:27.992 "uuid": "97fda5d0-da90-492a-a51a-5ab6c005825e", 00:19:27.992 "is_configured": true, 00:19:27.992 "data_offset": 2048, 00:19:27.992 "data_size": 63488 00:19:27.992 } 00:19:27.992 ] 00:19:27.992 }' 00:19:27.992 17:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:27.992 17:31:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:28.561 17:31:39 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:28.561 17:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:28.822 17:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:28.822 17:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:28.822 [2024-07-15 17:31:40.114368] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:28.822 BaseBdev1 00:19:29.098 17:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:29.098 17:31:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:29.098 17:31:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:29.098 17:31:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:29.098 17:31:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:29.098 17:31:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:29.098 17:31:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:29.098 17:31:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:29.364 [ 00:19:29.364 { 00:19:29.364 "name": "BaseBdev1", 00:19:29.364 "aliases": [ 00:19:29.364 "019ccb61-3c76-4a0a-a537-a014f418c15c" 00:19:29.364 ], 00:19:29.364 "product_name": "Malloc disk", 00:19:29.364 "block_size": 512, 00:19:29.364 "num_blocks": 65536, 00:19:29.364 "uuid": "019ccb61-3c76-4a0a-a537-a014f418c15c", 00:19:29.364 "assigned_rate_limits": { 00:19:29.364 "rw_ios_per_sec": 0, 00:19:29.364 "rw_mbytes_per_sec": 0, 00:19:29.364 "r_mbytes_per_sec": 0, 00:19:29.364 "w_mbytes_per_sec": 0 00:19:29.364 }, 00:19:29.364 "claimed": true, 00:19:29.365 "claim_type": "exclusive_write", 00:19:29.365 "zoned": false, 00:19:29.365 "supported_io_types": { 00:19:29.365 "read": true, 00:19:29.365 "write": true, 00:19:29.365 "unmap": true, 00:19:29.365 "flush": true, 00:19:29.365 "reset": true, 00:19:29.365 "nvme_admin": false, 00:19:29.365 "nvme_io": false, 00:19:29.365 "nvme_io_md": false, 00:19:29.365 "write_zeroes": true, 00:19:29.365 "zcopy": true, 00:19:29.365 "get_zone_info": false, 00:19:29.365 "zone_management": false, 00:19:29.365 "zone_append": false, 00:19:29.365 "compare": false, 00:19:29.365 "compare_and_write": false, 00:19:29.365 "abort": true, 00:19:29.365 "seek_hole": false, 00:19:29.365 "seek_data": false, 00:19:29.365 "copy": true, 00:19:29.365 "nvme_iov_md": false 00:19:29.365 }, 00:19:29.365 "memory_domains": [ 00:19:29.365 { 00:19:29.365 "dma_device_id": "system", 00:19:29.365 "dma_device_type": 1 00:19:29.365 }, 00:19:29.365 { 00:19:29.365 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.365 "dma_device_type": 2 00:19:29.365 } 00:19:29.365 ], 00:19:29.365 "driver_specific": {} 00:19:29.365 } 00:19:29.365 ] 00:19:29.365 
17:31:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:29.365 17:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:29.365 17:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:29.365 17:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:29.365 17:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:29.365 17:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:29.365 17:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:29.365 17:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:29.365 17:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:29.365 17:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:29.365 17:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:29.365 17:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:29.365 17:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:29.624 17:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:29.624 "name": "Existed_Raid", 00:19:29.624 "uuid": "bd65ba4a-b3d5-434b-aab6-c616e166eb6d", 00:19:29.624 "strip_size_kb": 0, 00:19:29.624 "state": "configuring", 00:19:29.624 "raid_level": "raid1", 00:19:29.624 "superblock": true, 00:19:29.624 "num_base_bdevs": 4, 00:19:29.624 "num_base_bdevs_discovered": 3, 00:19:29.624 "num_base_bdevs_operational": 4, 00:19:29.624 "base_bdevs_list": [ 00:19:29.624 { 00:19:29.624 "name": "BaseBdev1", 00:19:29.624 "uuid": "019ccb61-3c76-4a0a-a537-a014f418c15c", 00:19:29.624 "is_configured": true, 00:19:29.624 "data_offset": 2048, 00:19:29.624 "data_size": 63488 00:19:29.624 }, 00:19:29.624 { 00:19:29.624 "name": null, 00:19:29.624 "uuid": "53050721-cc41-4fdd-917a-6215f8041ffd", 00:19:29.624 "is_configured": false, 00:19:29.624 "data_offset": 2048, 00:19:29.624 "data_size": 63488 00:19:29.624 }, 00:19:29.624 { 00:19:29.624 "name": "BaseBdev3", 00:19:29.624 "uuid": "0f15ee6a-94c1-4479-8798-0929f98bedf2", 00:19:29.624 "is_configured": true, 00:19:29.624 "data_offset": 2048, 00:19:29.624 "data_size": 63488 00:19:29.624 }, 00:19:29.624 { 00:19:29.624 "name": "BaseBdev4", 00:19:29.624 "uuid": "97fda5d0-da90-492a-a51a-5ab6c005825e", 00:19:29.624 "is_configured": true, 00:19:29.624 "data_offset": 2048, 00:19:29.624 "data_size": 63488 00:19:29.624 } 00:19:29.624 ] 00:19:29.624 }' 00:19:29.624 17:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:29.624 17:31:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:30.192 17:31:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:30.192 17:31:41 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:30.192 17:31:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:19:30.192 17:31:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:30.452 [2024-07-15 17:31:41.566056] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:30.452 17:31:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:30.452 17:31:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:30.452 17:31:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:30.452 17:31:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:30.452 17:31:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:30.452 17:31:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:30.452 17:31:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:30.452 17:31:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:30.452 17:31:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:30.452 17:31:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:30.452 17:31:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:30.452 17:31:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:31.017 17:31:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:31.017 "name": "Existed_Raid", 00:19:31.017 "uuid": "bd65ba4a-b3d5-434b-aab6-c616e166eb6d", 00:19:31.017 "strip_size_kb": 0, 00:19:31.017 "state": "configuring", 00:19:31.017 "raid_level": "raid1", 00:19:31.017 "superblock": true, 00:19:31.017 "num_base_bdevs": 4, 00:19:31.017 "num_base_bdevs_discovered": 2, 00:19:31.017 "num_base_bdevs_operational": 4, 00:19:31.017 "base_bdevs_list": [ 00:19:31.017 { 00:19:31.017 "name": "BaseBdev1", 00:19:31.017 "uuid": "019ccb61-3c76-4a0a-a537-a014f418c15c", 00:19:31.017 "is_configured": true, 00:19:31.017 "data_offset": 2048, 00:19:31.017 "data_size": 63488 00:19:31.017 }, 00:19:31.017 { 00:19:31.017 "name": null, 00:19:31.017 "uuid": "53050721-cc41-4fdd-917a-6215f8041ffd", 00:19:31.017 "is_configured": false, 00:19:31.017 "data_offset": 2048, 00:19:31.017 "data_size": 63488 00:19:31.017 }, 00:19:31.017 { 00:19:31.017 "name": null, 00:19:31.017 "uuid": "0f15ee6a-94c1-4479-8798-0929f98bedf2", 00:19:31.017 "is_configured": false, 00:19:31.017 "data_offset": 2048, 00:19:31.017 "data_size": 63488 00:19:31.017 }, 00:19:31.017 { 00:19:31.017 "name": "BaseBdev4", 00:19:31.017 "uuid": "97fda5d0-da90-492a-a51a-5ab6c005825e", 00:19:31.017 "is_configured": true, 00:19:31.017 "data_offset": 2048, 00:19:31.017 "data_size": 63488 00:19:31.017 } 00:19:31.017 ] 00:19:31.017 }' 00:19:31.017 17:31:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- 
# xtrace_disable 00:19:31.017 17:31:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:31.583 17:31:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:31.583 17:31:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:31.583 17:31:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:31.583 17:31:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:31.843 [2024-07-15 17:31:43.049839] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:31.843 17:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:31.843 17:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:31.843 17:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:31.843 17:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:31.843 17:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:31.843 17:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:31.843 17:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:31.843 17:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:31.843 17:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:31.843 17:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:31.843 17:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:31.843 17:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:32.103 17:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:32.103 "name": "Existed_Raid", 00:19:32.103 "uuid": "bd65ba4a-b3d5-434b-aab6-c616e166eb6d", 00:19:32.103 "strip_size_kb": 0, 00:19:32.103 "state": "configuring", 00:19:32.103 "raid_level": "raid1", 00:19:32.103 "superblock": true, 00:19:32.103 "num_base_bdevs": 4, 00:19:32.103 "num_base_bdevs_discovered": 3, 00:19:32.103 "num_base_bdevs_operational": 4, 00:19:32.103 "base_bdevs_list": [ 00:19:32.103 { 00:19:32.103 "name": "BaseBdev1", 00:19:32.103 "uuid": "019ccb61-3c76-4a0a-a537-a014f418c15c", 00:19:32.103 "is_configured": true, 00:19:32.103 "data_offset": 2048, 00:19:32.103 "data_size": 63488 00:19:32.103 }, 00:19:32.103 { 00:19:32.103 "name": null, 00:19:32.103 "uuid": "53050721-cc41-4fdd-917a-6215f8041ffd", 00:19:32.103 "is_configured": false, 00:19:32.103 "data_offset": 2048, 00:19:32.103 "data_size": 63488 00:19:32.103 }, 00:19:32.103 { 00:19:32.103 "name": "BaseBdev3", 00:19:32.103 "uuid": "0f15ee6a-94c1-4479-8798-0929f98bedf2", 00:19:32.103 "is_configured": true, 
00:19:32.103 "data_offset": 2048, 00:19:32.103 "data_size": 63488 00:19:32.103 }, 00:19:32.103 { 00:19:32.103 "name": "BaseBdev4", 00:19:32.103 "uuid": "97fda5d0-da90-492a-a51a-5ab6c005825e", 00:19:32.103 "is_configured": true, 00:19:32.103 "data_offset": 2048, 00:19:32.103 "data_size": 63488 00:19:32.103 } 00:19:32.103 ] 00:19:32.103 }' 00:19:32.103 17:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:32.103 17:31:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:32.673 17:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.673 17:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:32.933 17:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:32.933 17:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:32.933 [2024-07-15 17:31:44.152627] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:32.933 17:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:32.933 17:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:32.933 17:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:32.933 17:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:32.933 17:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:32.933 17:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:32.933 17:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:32.933 17:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:32.933 17:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:32.933 17:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:32.933 17:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.933 17:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:33.194 17:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:33.194 "name": "Existed_Raid", 00:19:33.194 "uuid": "bd65ba4a-b3d5-434b-aab6-c616e166eb6d", 00:19:33.194 "strip_size_kb": 0, 00:19:33.194 "state": "configuring", 00:19:33.194 "raid_level": "raid1", 00:19:33.194 "superblock": true, 00:19:33.194 "num_base_bdevs": 4, 00:19:33.194 "num_base_bdevs_discovered": 2, 00:19:33.194 "num_base_bdevs_operational": 4, 00:19:33.194 "base_bdevs_list": [ 00:19:33.194 { 00:19:33.194 "name": null, 00:19:33.194 "uuid": "019ccb61-3c76-4a0a-a537-a014f418c15c", 00:19:33.194 "is_configured": false, 00:19:33.194 "data_offset": 2048, 00:19:33.194 "data_size": 63488 
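
Steps bdev_raid.sh@317 through @323 above detach a base bdev from the still-configuring array and then re-attach it with bdev_raid_add_base_bdev, using a jq index check to confirm the slot flips back to configured. A stand-alone sketch of that round trip, reusing only commands and names shown in this trace (slot index 2 corresponds to BaseBdev3 in the base_bdevs_list dumps above; rpc and sock are the same shorthands as before):

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # Detach the third member from the configuring raid1 array ...
  $rpc -s $sock bdev_raid_remove_base_bdev BaseBdev3
  $rpc -s $sock bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'   # expect false
  # ... then attach it again and confirm the slot is configured once more.
  $rpc -s $sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3
  $rpc -s $sock bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'   # expect true
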
00:19:33.194 }, 00:19:33.194 { 00:19:33.194 "name": null, 00:19:33.194 "uuid": "53050721-cc41-4fdd-917a-6215f8041ffd", 00:19:33.194 "is_configured": false, 00:19:33.194 "data_offset": 2048, 00:19:33.194 "data_size": 63488 00:19:33.194 }, 00:19:33.194 { 00:19:33.194 "name": "BaseBdev3", 00:19:33.194 "uuid": "0f15ee6a-94c1-4479-8798-0929f98bedf2", 00:19:33.194 "is_configured": true, 00:19:33.194 "data_offset": 2048, 00:19:33.194 "data_size": 63488 00:19:33.194 }, 00:19:33.194 { 00:19:33.194 "name": "BaseBdev4", 00:19:33.194 "uuid": "97fda5d0-da90-492a-a51a-5ab6c005825e", 00:19:33.194 "is_configured": true, 00:19:33.194 "data_offset": 2048, 00:19:33.194 "data_size": 63488 00:19:33.194 } 00:19:33.194 ] 00:19:33.194 }' 00:19:33.194 17:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:33.194 17:31:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:33.763 17:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:33.763 17:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:34.022 17:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:34.022 17:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:34.022 [2024-07-15 17:31:45.225164] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:34.022 17:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:34.022 17:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:34.022 17:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:34.022 17:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:34.022 17:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:34.022 17:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:34.022 17:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:34.022 17:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:34.022 17:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:34.022 17:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:34.022 17:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.023 17:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:34.283 17:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:34.283 "name": "Existed_Raid", 00:19:34.283 "uuid": "bd65ba4a-b3d5-434b-aab6-c616e166eb6d", 00:19:34.283 "strip_size_kb": 0, 00:19:34.283 "state": "configuring", 00:19:34.283 
"raid_level": "raid1", 00:19:34.283 "superblock": true, 00:19:34.283 "num_base_bdevs": 4, 00:19:34.283 "num_base_bdevs_discovered": 3, 00:19:34.283 "num_base_bdevs_operational": 4, 00:19:34.283 "base_bdevs_list": [ 00:19:34.283 { 00:19:34.283 "name": null, 00:19:34.283 "uuid": "019ccb61-3c76-4a0a-a537-a014f418c15c", 00:19:34.283 "is_configured": false, 00:19:34.283 "data_offset": 2048, 00:19:34.283 "data_size": 63488 00:19:34.283 }, 00:19:34.283 { 00:19:34.283 "name": "BaseBdev2", 00:19:34.283 "uuid": "53050721-cc41-4fdd-917a-6215f8041ffd", 00:19:34.283 "is_configured": true, 00:19:34.283 "data_offset": 2048, 00:19:34.283 "data_size": 63488 00:19:34.283 }, 00:19:34.283 { 00:19:34.283 "name": "BaseBdev3", 00:19:34.283 "uuid": "0f15ee6a-94c1-4479-8798-0929f98bedf2", 00:19:34.283 "is_configured": true, 00:19:34.283 "data_offset": 2048, 00:19:34.283 "data_size": 63488 00:19:34.283 }, 00:19:34.283 { 00:19:34.283 "name": "BaseBdev4", 00:19:34.283 "uuid": "97fda5d0-da90-492a-a51a-5ab6c005825e", 00:19:34.283 "is_configured": true, 00:19:34.283 "data_offset": 2048, 00:19:34.283 "data_size": 63488 00:19:34.283 } 00:19:34.283 ] 00:19:34.283 }' 00:19:34.283 17:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:34.283 17:31:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:34.851 17:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:34.851 17:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.851 17:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:34.851 17:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.851 17:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:35.111 17:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 019ccb61-3c76-4a0a-a537-a014f418c15c 00:19:35.372 [2024-07-15 17:31:46.497430] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:35.372 [2024-07-15 17:31:46.497554] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x10ea080 00:19:35.372 [2024-07-15 17:31:46.497562] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:35.372 [2024-07-15 17:31:46.497696] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10df0b0 00:19:35.372 [2024-07-15 17:31:46.497801] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10ea080 00:19:35.372 [2024-07-15 17:31:46.497808] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x10ea080 00:19:35.372 [2024-07-15 17:31:46.497877] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:35.372 NewBaseBdev 00:19:35.372 17:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:35.372 17:31:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 
00:19:35.372 17:31:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:35.372 17:31:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:35.372 17:31:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:35.372 17:31:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:35.372 17:31:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:35.633 17:31:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:35.633 [ 00:19:35.633 { 00:19:35.633 "name": "NewBaseBdev", 00:19:35.633 "aliases": [ 00:19:35.633 "019ccb61-3c76-4a0a-a537-a014f418c15c" 00:19:35.633 ], 00:19:35.633 "product_name": "Malloc disk", 00:19:35.633 "block_size": 512, 00:19:35.633 "num_blocks": 65536, 00:19:35.633 "uuid": "019ccb61-3c76-4a0a-a537-a014f418c15c", 00:19:35.633 "assigned_rate_limits": { 00:19:35.633 "rw_ios_per_sec": 0, 00:19:35.633 "rw_mbytes_per_sec": 0, 00:19:35.633 "r_mbytes_per_sec": 0, 00:19:35.633 "w_mbytes_per_sec": 0 00:19:35.633 }, 00:19:35.633 "claimed": true, 00:19:35.633 "claim_type": "exclusive_write", 00:19:35.633 "zoned": false, 00:19:35.633 "supported_io_types": { 00:19:35.633 "read": true, 00:19:35.633 "write": true, 00:19:35.633 "unmap": true, 00:19:35.633 "flush": true, 00:19:35.633 "reset": true, 00:19:35.633 "nvme_admin": false, 00:19:35.633 "nvme_io": false, 00:19:35.633 "nvme_io_md": false, 00:19:35.633 "write_zeroes": true, 00:19:35.633 "zcopy": true, 00:19:35.633 "get_zone_info": false, 00:19:35.633 "zone_management": false, 00:19:35.633 "zone_append": false, 00:19:35.633 "compare": false, 00:19:35.633 "compare_and_write": false, 00:19:35.633 "abort": true, 00:19:35.633 "seek_hole": false, 00:19:35.633 "seek_data": false, 00:19:35.633 "copy": true, 00:19:35.633 "nvme_iov_md": false 00:19:35.633 }, 00:19:35.633 "memory_domains": [ 00:19:35.633 { 00:19:35.633 "dma_device_id": "system", 00:19:35.633 "dma_device_type": 1 00:19:35.633 }, 00:19:35.633 { 00:19:35.633 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:35.633 "dma_device_type": 2 00:19:35.633 } 00:19:35.633 ], 00:19:35.633 "driver_specific": {} 00:19:35.633 } 00:19:35.633 ] 00:19:35.633 17:31:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:35.633 17:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:19:35.633 17:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:35.633 17:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:35.633 17:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:35.633 17:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:35.633 17:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:35.633 17:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:35.633 17:31:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:35.633 17:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:35.634 17:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:35.634 17:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:35.634 17:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:35.894 17:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:35.894 "name": "Existed_Raid", 00:19:35.894 "uuid": "bd65ba4a-b3d5-434b-aab6-c616e166eb6d", 00:19:35.894 "strip_size_kb": 0, 00:19:35.894 "state": "online", 00:19:35.894 "raid_level": "raid1", 00:19:35.894 "superblock": true, 00:19:35.894 "num_base_bdevs": 4, 00:19:35.894 "num_base_bdevs_discovered": 4, 00:19:35.894 "num_base_bdevs_operational": 4, 00:19:35.894 "base_bdevs_list": [ 00:19:35.894 { 00:19:35.894 "name": "NewBaseBdev", 00:19:35.894 "uuid": "019ccb61-3c76-4a0a-a537-a014f418c15c", 00:19:35.894 "is_configured": true, 00:19:35.894 "data_offset": 2048, 00:19:35.894 "data_size": 63488 00:19:35.894 }, 00:19:35.894 { 00:19:35.894 "name": "BaseBdev2", 00:19:35.894 "uuid": "53050721-cc41-4fdd-917a-6215f8041ffd", 00:19:35.894 "is_configured": true, 00:19:35.894 "data_offset": 2048, 00:19:35.894 "data_size": 63488 00:19:35.894 }, 00:19:35.894 { 00:19:35.894 "name": "BaseBdev3", 00:19:35.894 "uuid": "0f15ee6a-94c1-4479-8798-0929f98bedf2", 00:19:35.894 "is_configured": true, 00:19:35.894 "data_offset": 2048, 00:19:35.894 "data_size": 63488 00:19:35.894 }, 00:19:35.894 { 00:19:35.894 "name": "BaseBdev4", 00:19:35.894 "uuid": "97fda5d0-da90-492a-a51a-5ab6c005825e", 00:19:35.894 "is_configured": true, 00:19:35.894 "data_offset": 2048, 00:19:35.894 "data_size": 63488 00:19:35.894 } 00:19:35.894 ] 00:19:35.894 }' 00:19:35.894 17:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:35.894 17:31:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:36.464 17:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:19:36.464 17:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:36.464 17:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:36.464 17:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:36.464 17:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:36.464 17:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:36.464 17:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:36.464 17:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:36.724 [2024-07-15 17:31:47.792962] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:36.724 17:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:36.724 "name": 
"Existed_Raid", 00:19:36.724 "aliases": [ 00:19:36.724 "bd65ba4a-b3d5-434b-aab6-c616e166eb6d" 00:19:36.724 ], 00:19:36.724 "product_name": "Raid Volume", 00:19:36.724 "block_size": 512, 00:19:36.724 "num_blocks": 63488, 00:19:36.724 "uuid": "bd65ba4a-b3d5-434b-aab6-c616e166eb6d", 00:19:36.724 "assigned_rate_limits": { 00:19:36.724 "rw_ios_per_sec": 0, 00:19:36.724 "rw_mbytes_per_sec": 0, 00:19:36.724 "r_mbytes_per_sec": 0, 00:19:36.724 "w_mbytes_per_sec": 0 00:19:36.724 }, 00:19:36.724 "claimed": false, 00:19:36.724 "zoned": false, 00:19:36.724 "supported_io_types": { 00:19:36.724 "read": true, 00:19:36.724 "write": true, 00:19:36.724 "unmap": false, 00:19:36.724 "flush": false, 00:19:36.724 "reset": true, 00:19:36.724 "nvme_admin": false, 00:19:36.724 "nvme_io": false, 00:19:36.724 "nvme_io_md": false, 00:19:36.724 "write_zeroes": true, 00:19:36.724 "zcopy": false, 00:19:36.724 "get_zone_info": false, 00:19:36.724 "zone_management": false, 00:19:36.724 "zone_append": false, 00:19:36.724 "compare": false, 00:19:36.724 "compare_and_write": false, 00:19:36.724 "abort": false, 00:19:36.724 "seek_hole": false, 00:19:36.724 "seek_data": false, 00:19:36.724 "copy": false, 00:19:36.724 "nvme_iov_md": false 00:19:36.724 }, 00:19:36.724 "memory_domains": [ 00:19:36.724 { 00:19:36.724 "dma_device_id": "system", 00:19:36.724 "dma_device_type": 1 00:19:36.724 }, 00:19:36.724 { 00:19:36.724 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:36.724 "dma_device_type": 2 00:19:36.724 }, 00:19:36.724 { 00:19:36.724 "dma_device_id": "system", 00:19:36.724 "dma_device_type": 1 00:19:36.724 }, 00:19:36.724 { 00:19:36.724 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:36.724 "dma_device_type": 2 00:19:36.724 }, 00:19:36.724 { 00:19:36.724 "dma_device_id": "system", 00:19:36.724 "dma_device_type": 1 00:19:36.724 }, 00:19:36.724 { 00:19:36.724 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:36.724 "dma_device_type": 2 00:19:36.724 }, 00:19:36.724 { 00:19:36.724 "dma_device_id": "system", 00:19:36.724 "dma_device_type": 1 00:19:36.724 }, 00:19:36.724 { 00:19:36.724 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:36.724 "dma_device_type": 2 00:19:36.724 } 00:19:36.724 ], 00:19:36.724 "driver_specific": { 00:19:36.724 "raid": { 00:19:36.724 "uuid": "bd65ba4a-b3d5-434b-aab6-c616e166eb6d", 00:19:36.724 "strip_size_kb": 0, 00:19:36.724 "state": "online", 00:19:36.725 "raid_level": "raid1", 00:19:36.725 "superblock": true, 00:19:36.725 "num_base_bdevs": 4, 00:19:36.725 "num_base_bdevs_discovered": 4, 00:19:36.725 "num_base_bdevs_operational": 4, 00:19:36.725 "base_bdevs_list": [ 00:19:36.725 { 00:19:36.725 "name": "NewBaseBdev", 00:19:36.725 "uuid": "019ccb61-3c76-4a0a-a537-a014f418c15c", 00:19:36.725 "is_configured": true, 00:19:36.725 "data_offset": 2048, 00:19:36.725 "data_size": 63488 00:19:36.725 }, 00:19:36.725 { 00:19:36.725 "name": "BaseBdev2", 00:19:36.725 "uuid": "53050721-cc41-4fdd-917a-6215f8041ffd", 00:19:36.725 "is_configured": true, 00:19:36.725 "data_offset": 2048, 00:19:36.725 "data_size": 63488 00:19:36.725 }, 00:19:36.725 { 00:19:36.725 "name": "BaseBdev3", 00:19:36.725 "uuid": "0f15ee6a-94c1-4479-8798-0929f98bedf2", 00:19:36.725 "is_configured": true, 00:19:36.725 "data_offset": 2048, 00:19:36.725 "data_size": 63488 00:19:36.725 }, 00:19:36.725 { 00:19:36.725 "name": "BaseBdev4", 00:19:36.725 "uuid": "97fda5d0-da90-492a-a51a-5ab6c005825e", 00:19:36.725 "is_configured": true, 00:19:36.725 "data_offset": 2048, 00:19:36.725 "data_size": 63488 00:19:36.725 } 00:19:36.725 ] 00:19:36.725 } 00:19:36.725 } 
00:19:36.725 }' 00:19:36.725 17:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:36.725 17:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:36.725 BaseBdev2 00:19:36.725 BaseBdev3 00:19:36.725 BaseBdev4' 00:19:36.725 17:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:36.725 17:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:36.725 17:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:36.986 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:36.986 "name": "NewBaseBdev", 00:19:36.986 "aliases": [ 00:19:36.986 "019ccb61-3c76-4a0a-a537-a014f418c15c" 00:19:36.986 ], 00:19:36.986 "product_name": "Malloc disk", 00:19:36.986 "block_size": 512, 00:19:36.986 "num_blocks": 65536, 00:19:36.986 "uuid": "019ccb61-3c76-4a0a-a537-a014f418c15c", 00:19:36.986 "assigned_rate_limits": { 00:19:36.986 "rw_ios_per_sec": 0, 00:19:36.986 "rw_mbytes_per_sec": 0, 00:19:36.986 "r_mbytes_per_sec": 0, 00:19:36.986 "w_mbytes_per_sec": 0 00:19:36.986 }, 00:19:36.986 "claimed": true, 00:19:36.986 "claim_type": "exclusive_write", 00:19:36.986 "zoned": false, 00:19:36.986 "supported_io_types": { 00:19:36.986 "read": true, 00:19:36.986 "write": true, 00:19:36.986 "unmap": true, 00:19:36.986 "flush": true, 00:19:36.986 "reset": true, 00:19:36.986 "nvme_admin": false, 00:19:36.986 "nvme_io": false, 00:19:36.986 "nvme_io_md": false, 00:19:36.986 "write_zeroes": true, 00:19:36.986 "zcopy": true, 00:19:36.986 "get_zone_info": false, 00:19:36.986 "zone_management": false, 00:19:36.986 "zone_append": false, 00:19:36.986 "compare": false, 00:19:36.986 "compare_and_write": false, 00:19:36.986 "abort": true, 00:19:36.986 "seek_hole": false, 00:19:36.986 "seek_data": false, 00:19:36.986 "copy": true, 00:19:36.986 "nvme_iov_md": false 00:19:36.986 }, 00:19:36.986 "memory_domains": [ 00:19:36.986 { 00:19:36.986 "dma_device_id": "system", 00:19:36.986 "dma_device_type": 1 00:19:36.986 }, 00:19:36.986 { 00:19:36.986 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:36.986 "dma_device_type": 2 00:19:36.986 } 00:19:36.986 ], 00:19:36.986 "driver_specific": {} 00:19:36.986 }' 00:19:36.986 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:36.986 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:36.986 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:36.986 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:36.986 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:36.986 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:36.986 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:36.986 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:37.247 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:37.247 17:31:48 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:37.247 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:37.247 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:37.247 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:37.247 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:37.247 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:37.508 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:37.508 "name": "BaseBdev2", 00:19:37.508 "aliases": [ 00:19:37.508 "53050721-cc41-4fdd-917a-6215f8041ffd" 00:19:37.508 ], 00:19:37.508 "product_name": "Malloc disk", 00:19:37.508 "block_size": 512, 00:19:37.508 "num_blocks": 65536, 00:19:37.508 "uuid": "53050721-cc41-4fdd-917a-6215f8041ffd", 00:19:37.508 "assigned_rate_limits": { 00:19:37.508 "rw_ios_per_sec": 0, 00:19:37.508 "rw_mbytes_per_sec": 0, 00:19:37.508 "r_mbytes_per_sec": 0, 00:19:37.508 "w_mbytes_per_sec": 0 00:19:37.508 }, 00:19:37.508 "claimed": true, 00:19:37.508 "claim_type": "exclusive_write", 00:19:37.508 "zoned": false, 00:19:37.508 "supported_io_types": { 00:19:37.508 "read": true, 00:19:37.508 "write": true, 00:19:37.508 "unmap": true, 00:19:37.508 "flush": true, 00:19:37.508 "reset": true, 00:19:37.508 "nvme_admin": false, 00:19:37.508 "nvme_io": false, 00:19:37.508 "nvme_io_md": false, 00:19:37.508 "write_zeroes": true, 00:19:37.508 "zcopy": true, 00:19:37.508 "get_zone_info": false, 00:19:37.508 "zone_management": false, 00:19:37.508 "zone_append": false, 00:19:37.508 "compare": false, 00:19:37.508 "compare_and_write": false, 00:19:37.508 "abort": true, 00:19:37.508 "seek_hole": false, 00:19:37.508 "seek_data": false, 00:19:37.508 "copy": true, 00:19:37.508 "nvme_iov_md": false 00:19:37.508 }, 00:19:37.508 "memory_domains": [ 00:19:37.508 { 00:19:37.508 "dma_device_id": "system", 00:19:37.508 "dma_device_type": 1 00:19:37.508 }, 00:19:37.508 { 00:19:37.508 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:37.508 "dma_device_type": 2 00:19:37.508 } 00:19:37.508 ], 00:19:37.508 "driver_specific": {} 00:19:37.508 }' 00:19:37.508 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:37.508 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:37.508 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:37.508 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:37.508 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:37.508 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:37.508 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:37.768 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:37.768 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:37.768 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:37.768 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:19:37.768 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:37.768 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:37.769 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:37.769 17:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:38.029 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:38.029 "name": "BaseBdev3", 00:19:38.029 "aliases": [ 00:19:38.029 "0f15ee6a-94c1-4479-8798-0929f98bedf2" 00:19:38.029 ], 00:19:38.029 "product_name": "Malloc disk", 00:19:38.029 "block_size": 512, 00:19:38.029 "num_blocks": 65536, 00:19:38.029 "uuid": "0f15ee6a-94c1-4479-8798-0929f98bedf2", 00:19:38.029 "assigned_rate_limits": { 00:19:38.029 "rw_ios_per_sec": 0, 00:19:38.029 "rw_mbytes_per_sec": 0, 00:19:38.029 "r_mbytes_per_sec": 0, 00:19:38.029 "w_mbytes_per_sec": 0 00:19:38.029 }, 00:19:38.029 "claimed": true, 00:19:38.029 "claim_type": "exclusive_write", 00:19:38.029 "zoned": false, 00:19:38.029 "supported_io_types": { 00:19:38.029 "read": true, 00:19:38.029 "write": true, 00:19:38.029 "unmap": true, 00:19:38.029 "flush": true, 00:19:38.029 "reset": true, 00:19:38.029 "nvme_admin": false, 00:19:38.029 "nvme_io": false, 00:19:38.029 "nvme_io_md": false, 00:19:38.029 "write_zeroes": true, 00:19:38.029 "zcopy": true, 00:19:38.029 "get_zone_info": false, 00:19:38.029 "zone_management": false, 00:19:38.029 "zone_append": false, 00:19:38.029 "compare": false, 00:19:38.029 "compare_and_write": false, 00:19:38.029 "abort": true, 00:19:38.029 "seek_hole": false, 00:19:38.029 "seek_data": false, 00:19:38.029 "copy": true, 00:19:38.029 "nvme_iov_md": false 00:19:38.029 }, 00:19:38.029 "memory_domains": [ 00:19:38.029 { 00:19:38.029 "dma_device_id": "system", 00:19:38.029 "dma_device_type": 1 00:19:38.029 }, 00:19:38.029 { 00:19:38.029 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:38.029 "dma_device_type": 2 00:19:38.029 } 00:19:38.029 ], 00:19:38.029 "driver_specific": {} 00:19:38.029 }' 00:19:38.029 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:38.029 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:38.029 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:38.029 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:38.029 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:38.029 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:38.029 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:38.289 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:38.289 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:38.289 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:38.289 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:38.289 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:38.289 
17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:38.289 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:38.289 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:38.549 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:38.549 "name": "BaseBdev4", 00:19:38.549 "aliases": [ 00:19:38.549 "97fda5d0-da90-492a-a51a-5ab6c005825e" 00:19:38.549 ], 00:19:38.549 "product_name": "Malloc disk", 00:19:38.549 "block_size": 512, 00:19:38.549 "num_blocks": 65536, 00:19:38.549 "uuid": "97fda5d0-da90-492a-a51a-5ab6c005825e", 00:19:38.549 "assigned_rate_limits": { 00:19:38.549 "rw_ios_per_sec": 0, 00:19:38.549 "rw_mbytes_per_sec": 0, 00:19:38.549 "r_mbytes_per_sec": 0, 00:19:38.549 "w_mbytes_per_sec": 0 00:19:38.549 }, 00:19:38.549 "claimed": true, 00:19:38.549 "claim_type": "exclusive_write", 00:19:38.549 "zoned": false, 00:19:38.549 "supported_io_types": { 00:19:38.549 "read": true, 00:19:38.549 "write": true, 00:19:38.549 "unmap": true, 00:19:38.549 "flush": true, 00:19:38.549 "reset": true, 00:19:38.549 "nvme_admin": false, 00:19:38.549 "nvme_io": false, 00:19:38.549 "nvme_io_md": false, 00:19:38.549 "write_zeroes": true, 00:19:38.549 "zcopy": true, 00:19:38.549 "get_zone_info": false, 00:19:38.549 "zone_management": false, 00:19:38.549 "zone_append": false, 00:19:38.549 "compare": false, 00:19:38.549 "compare_and_write": false, 00:19:38.549 "abort": true, 00:19:38.549 "seek_hole": false, 00:19:38.549 "seek_data": false, 00:19:38.549 "copy": true, 00:19:38.549 "nvme_iov_md": false 00:19:38.550 }, 00:19:38.550 "memory_domains": [ 00:19:38.550 { 00:19:38.550 "dma_device_id": "system", 00:19:38.550 "dma_device_type": 1 00:19:38.550 }, 00:19:38.550 { 00:19:38.550 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:38.550 "dma_device_type": 2 00:19:38.550 } 00:19:38.550 ], 00:19:38.550 "driver_specific": {} 00:19:38.550 }' 00:19:38.550 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:38.550 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:38.550 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:38.550 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:38.550 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:38.810 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:38.810 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:38.810 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:38.810 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:38.810 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:38.810 17:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:38.810 17:31:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:38.810 17:31:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:39.070 [2024-07-15 17:31:50.191023] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:39.070 [2024-07-15 17:31:50.191046] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:39.070 [2024-07-15 17:31:50.191087] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:39.070 [2024-07-15 17:31:50.191296] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:39.070 [2024-07-15 17:31:50.191303] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10ea080 name Existed_Raid, state offline 00:19:39.070 17:31:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2843582 00:19:39.070 17:31:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2843582 ']' 00:19:39.070 17:31:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2843582 00:19:39.070 17:31:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:19:39.070 17:31:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:39.070 17:31:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2843582 00:19:39.070 17:31:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:39.070 17:31:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:39.070 17:31:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2843582' 00:19:39.070 killing process with pid 2843582 00:19:39.071 17:31:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2843582 00:19:39.071 [2024-07-15 17:31:50.273201] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:39.071 17:31:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2843582 00:19:39.071 [2024-07-15 17:31:50.293259] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:39.332 17:31:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:19:39.332 00:19:39.332 real 0m27.620s 00:19:39.332 user 0m51.832s 00:19:39.332 sys 0m4.004s 00:19:39.332 17:31:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:39.332 17:31:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:39.332 ************************************ 00:19:39.332 END TEST raid_state_function_test_sb 00:19:39.332 ************************************ 00:19:39.332 17:31:50 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:39.332 17:31:50 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 4 00:19:39.332 17:31:50 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:19:39.332 17:31:50 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:39.332 17:31:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:39.332 ************************************ 00:19:39.332 START TEST raid_superblock_test 00:19:39.332 ************************************ 00:19:39.332 17:31:50 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 4 00:19:39.332 17:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:19:39.332 17:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:19:39.332 17:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:19:39.332 17:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:19:39.332 17:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:19:39.332 17:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:19:39.332 17:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:19:39.332 17:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:19:39.332 17:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:19:39.332 17:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:19:39.332 17:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:19:39.332 17:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:19:39.332 17:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:19:39.332 17:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:19:39.332 17:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:19:39.332 17:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2848858 00:19:39.332 17:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2848858 /var/tmp/spdk-raid.sock 00:19:39.332 17:31:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2848858 ']' 00:19:39.332 17:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:19:39.332 17:31:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:39.332 17:31:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:39.332 17:31:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:39.332 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:39.332 17:31:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:39.332 17:31:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:39.332 [2024-07-15 17:31:50.547566] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:19:39.332 [2024-07-15 17:31:50.547617] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2848858 ] 00:19:39.593 [2024-07-15 17:31:50.636114] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:39.593 [2024-07-15 17:31:50.702415] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:39.593 [2024-07-15 17:31:50.749769] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:39.593 [2024-07-15 17:31:50.749793] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:40.163 17:31:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:40.163 17:31:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:19:40.163 17:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:19:40.163 17:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:40.163 17:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:19:40.163 17:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:19:40.164 17:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:19:40.164 17:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:40.164 17:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:40.164 17:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:40.164 17:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:19:40.423 malloc1 00:19:40.423 17:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:40.684 [2024-07-15 17:31:51.752953] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:40.684 [2024-07-15 17:31:51.752985] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:40.684 [2024-07-15 17:31:51.752997] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe91a20 00:19:40.684 [2024-07-15 17:31:51.753003] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:40.684 [2024-07-15 17:31:51.754301] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:40.684 [2024-07-15 17:31:51.754325] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:40.684 pt1 00:19:40.684 17:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:40.684 17:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:40.684 17:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:19:40.684 17:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:19:40.684 17:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:19:40.684 17:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:40.684 17:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:40.684 17:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:40.684 17:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:19:40.684 malloc2 00:19:40.684 17:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:40.945 [2024-07-15 17:31:52.136057] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:40.945 [2024-07-15 17:31:52.136088] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:40.945 [2024-07-15 17:31:52.136101] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe92040 00:19:40.945 [2024-07-15 17:31:52.136108] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:40.945 [2024-07-15 17:31:52.137313] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:40.945 [2024-07-15 17:31:52.137331] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:40.945 pt2 00:19:40.945 17:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:40.945 17:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:40.945 17:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:19:40.945 17:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:19:40.945 17:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:19:40.945 17:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:40.945 17:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:40.945 17:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:40.945 17:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:19:41.205 malloc3 00:19:41.205 17:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:41.466 [2024-07-15 17:31:52.519027] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:41.466 [2024-07-15 17:31:52.519058] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:41.466 [2024-07-15 17:31:52.519069] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe92540 00:19:41.466 [2024-07-15 17:31:52.519075] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:41.466 [2024-07-15 17:31:52.520276] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:41.466 [2024-07-15 17:31:52.520294] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:41.466 pt3 00:19:41.466 17:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:41.466 17:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:41.466 17:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:19:41.466 17:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:19:41.466 17:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:19:41.466 17:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:41.466 17:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:41.466 17:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:41.466 17:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:19:41.466 malloc4 00:19:41.466 17:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:41.726 [2024-07-15 17:31:52.873837] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:41.726 [2024-07-15 17:31:52.873870] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:41.726 [2024-07-15 17:31:52.873880] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x103fd60 00:19:41.726 [2024-07-15 17:31:52.873886] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:41.726 [2024-07-15 17:31:52.875061] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:41.726 [2024-07-15 17:31:52.875079] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:41.726 pt4 00:19:41.726 17:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:41.726 17:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:41.726 17:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:19:41.987 [2024-07-15 17:31:53.062330] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:41.987 [2024-07-15 17:31:53.063354] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:41.987 [2024-07-15 17:31:53.063397] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:41.987 [2024-07-15 17:31:53.063430] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:41.987 [2024-07-15 17:31:53.063566] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x103ce20 00:19:41.987 [2024-07-15 17:31:53.063574] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:41.987 [2024-07-15 17:31:53.063737] bdev_raid.c: 251:raid_bdev_create_cb: 
*DEBUG*: raid_bdev_create_cb, 0xe93000 00:19:41.987 [2024-07-15 17:31:53.063858] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x103ce20 00:19:41.987 [2024-07-15 17:31:53.063864] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x103ce20 00:19:41.987 [2024-07-15 17:31:53.063935] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:41.987 17:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:19:41.987 17:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:41.987 17:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:41.987 17:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:41.987 17:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:41.987 17:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:41.987 17:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:41.987 17:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:41.988 17:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:41.988 17:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:41.988 17:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:41.988 17:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:41.988 17:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:41.988 "name": "raid_bdev1", 00:19:41.988 "uuid": "7329e026-e7bf-4455-ae90-6ef2a65a1979", 00:19:41.988 "strip_size_kb": 0, 00:19:41.988 "state": "online", 00:19:41.988 "raid_level": "raid1", 00:19:41.988 "superblock": true, 00:19:41.988 "num_base_bdevs": 4, 00:19:41.988 "num_base_bdevs_discovered": 4, 00:19:41.988 "num_base_bdevs_operational": 4, 00:19:41.988 "base_bdevs_list": [ 00:19:41.988 { 00:19:41.988 "name": "pt1", 00:19:41.988 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:41.988 "is_configured": true, 00:19:41.988 "data_offset": 2048, 00:19:41.988 "data_size": 63488 00:19:41.988 }, 00:19:41.988 { 00:19:41.988 "name": "pt2", 00:19:41.988 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:41.988 "is_configured": true, 00:19:41.988 "data_offset": 2048, 00:19:41.988 "data_size": 63488 00:19:41.988 }, 00:19:41.988 { 00:19:41.988 "name": "pt3", 00:19:41.988 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:41.988 "is_configured": true, 00:19:41.988 "data_offset": 2048, 00:19:41.988 "data_size": 63488 00:19:41.988 }, 00:19:41.988 { 00:19:41.988 "name": "pt4", 00:19:41.988 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:41.988 "is_configured": true, 00:19:41.988 "data_offset": 2048, 00:19:41.988 "data_size": 63488 00:19:41.988 } 00:19:41.988 ] 00:19:41.988 }' 00:19:41.988 17:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:41.988 17:31:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:42.558 17:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # 
verify_raid_bdev_properties raid_bdev1 00:19:42.558 17:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:19:42.558 17:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:42.558 17:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:42.558 17:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:42.558 17:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:42.558 17:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:42.558 17:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:42.818 [2024-07-15 17:31:54.000943] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:42.818 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:42.818 "name": "raid_bdev1", 00:19:42.818 "aliases": [ 00:19:42.818 "7329e026-e7bf-4455-ae90-6ef2a65a1979" 00:19:42.818 ], 00:19:42.818 "product_name": "Raid Volume", 00:19:42.818 "block_size": 512, 00:19:42.818 "num_blocks": 63488, 00:19:42.818 "uuid": "7329e026-e7bf-4455-ae90-6ef2a65a1979", 00:19:42.818 "assigned_rate_limits": { 00:19:42.818 "rw_ios_per_sec": 0, 00:19:42.818 "rw_mbytes_per_sec": 0, 00:19:42.818 "r_mbytes_per_sec": 0, 00:19:42.818 "w_mbytes_per_sec": 0 00:19:42.818 }, 00:19:42.818 "claimed": false, 00:19:42.818 "zoned": false, 00:19:42.818 "supported_io_types": { 00:19:42.818 "read": true, 00:19:42.818 "write": true, 00:19:42.818 "unmap": false, 00:19:42.818 "flush": false, 00:19:42.818 "reset": true, 00:19:42.818 "nvme_admin": false, 00:19:42.818 "nvme_io": false, 00:19:42.818 "nvme_io_md": false, 00:19:42.819 "write_zeroes": true, 00:19:42.819 "zcopy": false, 00:19:42.819 "get_zone_info": false, 00:19:42.819 "zone_management": false, 00:19:42.819 "zone_append": false, 00:19:42.819 "compare": false, 00:19:42.819 "compare_and_write": false, 00:19:42.819 "abort": false, 00:19:42.819 "seek_hole": false, 00:19:42.819 "seek_data": false, 00:19:42.819 "copy": false, 00:19:42.819 "nvme_iov_md": false 00:19:42.819 }, 00:19:42.819 "memory_domains": [ 00:19:42.819 { 00:19:42.819 "dma_device_id": "system", 00:19:42.819 "dma_device_type": 1 00:19:42.819 }, 00:19:42.819 { 00:19:42.819 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.819 "dma_device_type": 2 00:19:42.819 }, 00:19:42.819 { 00:19:42.819 "dma_device_id": "system", 00:19:42.819 "dma_device_type": 1 00:19:42.819 }, 00:19:42.819 { 00:19:42.819 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.819 "dma_device_type": 2 00:19:42.819 }, 00:19:42.819 { 00:19:42.819 "dma_device_id": "system", 00:19:42.819 "dma_device_type": 1 00:19:42.819 }, 00:19:42.819 { 00:19:42.819 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.819 "dma_device_type": 2 00:19:42.819 }, 00:19:42.819 { 00:19:42.819 "dma_device_id": "system", 00:19:42.819 "dma_device_type": 1 00:19:42.819 }, 00:19:42.819 { 00:19:42.819 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.819 "dma_device_type": 2 00:19:42.819 } 00:19:42.819 ], 00:19:42.819 "driver_specific": { 00:19:42.819 "raid": { 00:19:42.819 "uuid": "7329e026-e7bf-4455-ae90-6ef2a65a1979", 00:19:42.819 "strip_size_kb": 0, 00:19:42.819 "state": "online", 00:19:42.819 "raid_level": "raid1", 00:19:42.819 "superblock": true, 00:19:42.819 
"num_base_bdevs": 4, 00:19:42.819 "num_base_bdevs_discovered": 4, 00:19:42.819 "num_base_bdevs_operational": 4, 00:19:42.819 "base_bdevs_list": [ 00:19:42.819 { 00:19:42.819 "name": "pt1", 00:19:42.819 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:42.819 "is_configured": true, 00:19:42.819 "data_offset": 2048, 00:19:42.819 "data_size": 63488 00:19:42.819 }, 00:19:42.819 { 00:19:42.819 "name": "pt2", 00:19:42.819 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:42.819 "is_configured": true, 00:19:42.819 "data_offset": 2048, 00:19:42.819 "data_size": 63488 00:19:42.819 }, 00:19:42.819 { 00:19:42.819 "name": "pt3", 00:19:42.819 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:42.819 "is_configured": true, 00:19:42.819 "data_offset": 2048, 00:19:42.819 "data_size": 63488 00:19:42.819 }, 00:19:42.819 { 00:19:42.819 "name": "pt4", 00:19:42.819 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:42.819 "is_configured": true, 00:19:42.819 "data_offset": 2048, 00:19:42.819 "data_size": 63488 00:19:42.819 } 00:19:42.819 ] 00:19:42.819 } 00:19:42.819 } 00:19:42.819 }' 00:19:42.819 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:42.819 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:19:42.819 pt2 00:19:42.819 pt3 00:19:42.819 pt4' 00:19:42.819 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:42.819 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:42.819 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:43.080 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:43.080 "name": "pt1", 00:19:43.080 "aliases": [ 00:19:43.080 "00000000-0000-0000-0000-000000000001" 00:19:43.080 ], 00:19:43.080 "product_name": "passthru", 00:19:43.080 "block_size": 512, 00:19:43.080 "num_blocks": 65536, 00:19:43.080 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:43.080 "assigned_rate_limits": { 00:19:43.080 "rw_ios_per_sec": 0, 00:19:43.080 "rw_mbytes_per_sec": 0, 00:19:43.080 "r_mbytes_per_sec": 0, 00:19:43.080 "w_mbytes_per_sec": 0 00:19:43.080 }, 00:19:43.080 "claimed": true, 00:19:43.080 "claim_type": "exclusive_write", 00:19:43.080 "zoned": false, 00:19:43.080 "supported_io_types": { 00:19:43.080 "read": true, 00:19:43.080 "write": true, 00:19:43.080 "unmap": true, 00:19:43.080 "flush": true, 00:19:43.080 "reset": true, 00:19:43.080 "nvme_admin": false, 00:19:43.080 "nvme_io": false, 00:19:43.080 "nvme_io_md": false, 00:19:43.080 "write_zeroes": true, 00:19:43.080 "zcopy": true, 00:19:43.080 "get_zone_info": false, 00:19:43.080 "zone_management": false, 00:19:43.080 "zone_append": false, 00:19:43.080 "compare": false, 00:19:43.080 "compare_and_write": false, 00:19:43.080 "abort": true, 00:19:43.080 "seek_hole": false, 00:19:43.080 "seek_data": false, 00:19:43.080 "copy": true, 00:19:43.080 "nvme_iov_md": false 00:19:43.080 }, 00:19:43.080 "memory_domains": [ 00:19:43.080 { 00:19:43.080 "dma_device_id": "system", 00:19:43.080 "dma_device_type": 1 00:19:43.080 }, 00:19:43.080 { 00:19:43.080 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:43.080 "dma_device_type": 2 00:19:43.080 } 00:19:43.080 ], 00:19:43.080 "driver_specific": { 00:19:43.080 "passthru": { 00:19:43.080 
"name": "pt1", 00:19:43.080 "base_bdev_name": "malloc1" 00:19:43.080 } 00:19:43.080 } 00:19:43.080 }' 00:19:43.080 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.080 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.080 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:43.080 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.395 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.395 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:43.395 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.395 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.395 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:43.395 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:43.395 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:43.395 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:43.395 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:43.395 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:43.395 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:43.655 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:43.655 "name": "pt2", 00:19:43.655 "aliases": [ 00:19:43.655 "00000000-0000-0000-0000-000000000002" 00:19:43.655 ], 00:19:43.655 "product_name": "passthru", 00:19:43.655 "block_size": 512, 00:19:43.655 "num_blocks": 65536, 00:19:43.655 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:43.655 "assigned_rate_limits": { 00:19:43.655 "rw_ios_per_sec": 0, 00:19:43.655 "rw_mbytes_per_sec": 0, 00:19:43.655 "r_mbytes_per_sec": 0, 00:19:43.655 "w_mbytes_per_sec": 0 00:19:43.655 }, 00:19:43.655 "claimed": true, 00:19:43.655 "claim_type": "exclusive_write", 00:19:43.655 "zoned": false, 00:19:43.655 "supported_io_types": { 00:19:43.655 "read": true, 00:19:43.655 "write": true, 00:19:43.655 "unmap": true, 00:19:43.655 "flush": true, 00:19:43.655 "reset": true, 00:19:43.655 "nvme_admin": false, 00:19:43.655 "nvme_io": false, 00:19:43.655 "nvme_io_md": false, 00:19:43.655 "write_zeroes": true, 00:19:43.655 "zcopy": true, 00:19:43.655 "get_zone_info": false, 00:19:43.655 "zone_management": false, 00:19:43.655 "zone_append": false, 00:19:43.655 "compare": false, 00:19:43.655 "compare_and_write": false, 00:19:43.655 "abort": true, 00:19:43.655 "seek_hole": false, 00:19:43.655 "seek_data": false, 00:19:43.655 "copy": true, 00:19:43.655 "nvme_iov_md": false 00:19:43.655 }, 00:19:43.655 "memory_domains": [ 00:19:43.655 { 00:19:43.655 "dma_device_id": "system", 00:19:43.655 "dma_device_type": 1 00:19:43.655 }, 00:19:43.655 { 00:19:43.655 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:43.655 "dma_device_type": 2 00:19:43.655 } 00:19:43.655 ], 00:19:43.655 "driver_specific": { 00:19:43.655 "passthru": { 00:19:43.655 "name": "pt2", 00:19:43.655 "base_bdev_name": "malloc2" 00:19:43.655 } 00:19:43.655 } 00:19:43.655 }' 00:19:43.655 17:31:54 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.655 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.655 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:43.655 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.915 17:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.915 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:43.915 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.915 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.915 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:43.915 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:43.915 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:43.915 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:43.915 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:43.915 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:43.915 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:44.176 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:44.176 "name": "pt3", 00:19:44.176 "aliases": [ 00:19:44.176 "00000000-0000-0000-0000-000000000003" 00:19:44.176 ], 00:19:44.176 "product_name": "passthru", 00:19:44.176 "block_size": 512, 00:19:44.176 "num_blocks": 65536, 00:19:44.176 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:44.176 "assigned_rate_limits": { 00:19:44.176 "rw_ios_per_sec": 0, 00:19:44.176 "rw_mbytes_per_sec": 0, 00:19:44.176 "r_mbytes_per_sec": 0, 00:19:44.176 "w_mbytes_per_sec": 0 00:19:44.176 }, 00:19:44.176 "claimed": true, 00:19:44.176 "claim_type": "exclusive_write", 00:19:44.176 "zoned": false, 00:19:44.176 "supported_io_types": { 00:19:44.176 "read": true, 00:19:44.176 "write": true, 00:19:44.176 "unmap": true, 00:19:44.176 "flush": true, 00:19:44.176 "reset": true, 00:19:44.176 "nvme_admin": false, 00:19:44.176 "nvme_io": false, 00:19:44.176 "nvme_io_md": false, 00:19:44.176 "write_zeroes": true, 00:19:44.176 "zcopy": true, 00:19:44.176 "get_zone_info": false, 00:19:44.176 "zone_management": false, 00:19:44.176 "zone_append": false, 00:19:44.176 "compare": false, 00:19:44.176 "compare_and_write": false, 00:19:44.176 "abort": true, 00:19:44.176 "seek_hole": false, 00:19:44.176 "seek_data": false, 00:19:44.176 "copy": true, 00:19:44.176 "nvme_iov_md": false 00:19:44.176 }, 00:19:44.176 "memory_domains": [ 00:19:44.176 { 00:19:44.176 "dma_device_id": "system", 00:19:44.176 "dma_device_type": 1 00:19:44.176 }, 00:19:44.176 { 00:19:44.176 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:44.176 "dma_device_type": 2 00:19:44.176 } 00:19:44.176 ], 00:19:44.176 "driver_specific": { 00:19:44.176 "passthru": { 00:19:44.176 "name": "pt3", 00:19:44.176 "base_bdev_name": "malloc3" 00:19:44.176 } 00:19:44.176 } 00:19:44.176 }' 00:19:44.176 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:44.176 17:31:55 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:44.437 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:44.437 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:44.437 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:44.437 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:44.437 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:44.437 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:44.437 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:44.437 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:44.438 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:44.698 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:44.698 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:44.698 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:44.698 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:44.698 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:44.698 "name": "pt4", 00:19:44.698 "aliases": [ 00:19:44.698 "00000000-0000-0000-0000-000000000004" 00:19:44.698 ], 00:19:44.698 "product_name": "passthru", 00:19:44.698 "block_size": 512, 00:19:44.698 "num_blocks": 65536, 00:19:44.698 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:44.698 "assigned_rate_limits": { 00:19:44.698 "rw_ios_per_sec": 0, 00:19:44.698 "rw_mbytes_per_sec": 0, 00:19:44.698 "r_mbytes_per_sec": 0, 00:19:44.698 "w_mbytes_per_sec": 0 00:19:44.698 }, 00:19:44.698 "claimed": true, 00:19:44.698 "claim_type": "exclusive_write", 00:19:44.698 "zoned": false, 00:19:44.698 "supported_io_types": { 00:19:44.698 "read": true, 00:19:44.698 "write": true, 00:19:44.698 "unmap": true, 00:19:44.698 "flush": true, 00:19:44.698 "reset": true, 00:19:44.698 "nvme_admin": false, 00:19:44.698 "nvme_io": false, 00:19:44.698 "nvme_io_md": false, 00:19:44.698 "write_zeroes": true, 00:19:44.698 "zcopy": true, 00:19:44.698 "get_zone_info": false, 00:19:44.698 "zone_management": false, 00:19:44.698 "zone_append": false, 00:19:44.698 "compare": false, 00:19:44.698 "compare_and_write": false, 00:19:44.698 "abort": true, 00:19:44.698 "seek_hole": false, 00:19:44.698 "seek_data": false, 00:19:44.698 "copy": true, 00:19:44.698 "nvme_iov_md": false 00:19:44.698 }, 00:19:44.698 "memory_domains": [ 00:19:44.698 { 00:19:44.698 "dma_device_id": "system", 00:19:44.698 "dma_device_type": 1 00:19:44.698 }, 00:19:44.698 { 00:19:44.698 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:44.698 "dma_device_type": 2 00:19:44.698 } 00:19:44.698 ], 00:19:44.698 "driver_specific": { 00:19:44.698 "passthru": { 00:19:44.698 "name": "pt4", 00:19:44.698 "base_bdev_name": "malloc4" 00:19:44.698 } 00:19:44.698 } 00:19:44.698 }' 00:19:44.698 17:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:44.958 17:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:44.958 17:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 
512 ]] 00:19:44.958 17:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:44.958 17:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:44.958 17:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:44.958 17:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:44.958 17:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:44.958 17:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:44.958 17:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:45.219 17:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:45.219 17:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:45.219 17:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:45.219 17:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:19:45.219 [2024-07-15 17:31:56.483215] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:45.479 17:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=7329e026-e7bf-4455-ae90-6ef2a65a1979 00:19:45.479 17:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 7329e026-e7bf-4455-ae90-6ef2a65a1979 ']' 00:19:45.479 17:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:45.479 [2024-07-15 17:31:56.695499] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:45.479 [2024-07-15 17:31:56.695514] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:45.479 [2024-07-15 17:31:56.695552] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:45.479 [2024-07-15 17:31:56.695611] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:45.479 [2024-07-15 17:31:56.695617] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x103ce20 name raid_bdev1, state offline 00:19:45.479 17:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:45.479 17:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:19:45.739 17:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:19:45.739 17:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:19:45.739 17:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:45.739 17:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:19:45.998 17:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:45.998 17:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt2 00:19:46.257 17:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:46.257 17:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:46.257 17:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:46.257 17:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:19:46.516 17:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:19:46.516 17:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:19:46.775 17:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:19:46.775 17:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:46.775 17:31:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:19:46.775 17:31:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:46.775 17:31:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:46.775 17:31:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:46.775 17:31:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:46.775 17:31:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:46.775 17:31:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:46.775 17:31:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:46.775 17:31:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:46.775 17:31:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:19:46.775 17:31:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:46.775 [2024-07-15 17:31:58.070937] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:19:46.775 [2024-07-15 17:31:58.072008] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:19:46.776 [2024-07-15 17:31:58.072043] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:19:46.776 [2024-07-15 17:31:58.072068] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:19:46.776 [2024-07-15 17:31:58.072101] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:19:46.776 [2024-07-15 17:31:58.072130] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:19:46.776 [2024-07-15 17:31:58.072143] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:19:46.776 [2024-07-15 17:31:58.072157] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:19:46.776 [2024-07-15 17:31:58.072167] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:46.776 [2024-07-15 17:31:58.072172] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe90ee0 name raid_bdev1, state configuring 00:19:47.036 request: 00:19:47.036 { 00:19:47.036 "name": "raid_bdev1", 00:19:47.036 "raid_level": "raid1", 00:19:47.036 "base_bdevs": [ 00:19:47.036 "malloc1", 00:19:47.036 "malloc2", 00:19:47.036 "malloc3", 00:19:47.036 "malloc4" 00:19:47.036 ], 00:19:47.036 "superblock": false, 00:19:47.036 "method": "bdev_raid_create", 00:19:47.036 "req_id": 1 00:19:47.036 } 00:19:47.036 Got JSON-RPC error response 00:19:47.036 response: 00:19:47.036 { 00:19:47.036 "code": -17, 00:19:47.036 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:19:47.036 } 00:19:47.036 17:31:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:19:47.036 17:31:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:19:47.036 17:31:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:19:47.036 17:31:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:19:47.036 17:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:47.036 17:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:19:47.036 17:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:19:47.036 17:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:19:47.036 17:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:47.295 [2024-07-15 17:31:58.439824] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:47.295 [2024-07-15 17:31:58.439857] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:47.295 [2024-07-15 17:31:58.439868] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1040a20 00:19:47.295 [2024-07-15 17:31:58.439874] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:47.295 [2024-07-15 17:31:58.441155] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:47.295 [2024-07-15 17:31:58.441174] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:47.295 [2024-07-15 17:31:58.441219] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid 
superblock found on bdev pt1 00:19:47.295 [2024-07-15 17:31:58.441238] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:47.295 pt1 00:19:47.295 17:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:19:47.295 17:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:47.296 17:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:47.296 17:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:47.296 17:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:47.296 17:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:47.296 17:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:47.296 17:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:47.296 17:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:47.296 17:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:47.296 17:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:47.296 17:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:47.555 17:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:47.555 "name": "raid_bdev1", 00:19:47.555 "uuid": "7329e026-e7bf-4455-ae90-6ef2a65a1979", 00:19:47.555 "strip_size_kb": 0, 00:19:47.555 "state": "configuring", 00:19:47.555 "raid_level": "raid1", 00:19:47.555 "superblock": true, 00:19:47.555 "num_base_bdevs": 4, 00:19:47.555 "num_base_bdevs_discovered": 1, 00:19:47.555 "num_base_bdevs_operational": 4, 00:19:47.555 "base_bdevs_list": [ 00:19:47.555 { 00:19:47.555 "name": "pt1", 00:19:47.555 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:47.555 "is_configured": true, 00:19:47.555 "data_offset": 2048, 00:19:47.555 "data_size": 63488 00:19:47.555 }, 00:19:47.555 { 00:19:47.555 "name": null, 00:19:47.555 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:47.555 "is_configured": false, 00:19:47.555 "data_offset": 2048, 00:19:47.555 "data_size": 63488 00:19:47.555 }, 00:19:47.555 { 00:19:47.555 "name": null, 00:19:47.555 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:47.555 "is_configured": false, 00:19:47.555 "data_offset": 2048, 00:19:47.555 "data_size": 63488 00:19:47.555 }, 00:19:47.555 { 00:19:47.555 "name": null, 00:19:47.555 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:47.555 "is_configured": false, 00:19:47.555 "data_offset": 2048, 00:19:47.555 "data_size": 63488 00:19:47.555 } 00:19:47.555 ] 00:19:47.555 }' 00:19:47.555 17:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:47.555 17:31:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:48.130 17:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:19:48.130 17:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 
00000000-0000-0000-0000-000000000002 00:19:48.130 [2024-07-15 17:31:59.354154] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:48.130 [2024-07-15 17:31:59.354192] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:48.130 [2024-07-15 17:31:59.354206] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe91160 00:19:48.130 [2024-07-15 17:31:59.354214] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:48.130 [2024-07-15 17:31:59.354483] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:48.130 [2024-07-15 17:31:59.354498] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:48.130 [2024-07-15 17:31:59.354542] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:48.130 [2024-07-15 17:31:59.354556] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:48.130 pt2 00:19:48.130 17:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:48.389 [2024-07-15 17:31:59.542633] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:19:48.389 17:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:19:48.389 17:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:48.389 17:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:48.389 17:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:48.389 17:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:48.389 17:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:48.389 17:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:48.389 17:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:48.389 17:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:48.389 17:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:48.389 17:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:48.389 17:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:48.649 17:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:48.649 "name": "raid_bdev1", 00:19:48.649 "uuid": "7329e026-e7bf-4455-ae90-6ef2a65a1979", 00:19:48.649 "strip_size_kb": 0, 00:19:48.649 "state": "configuring", 00:19:48.649 "raid_level": "raid1", 00:19:48.649 "superblock": true, 00:19:48.649 "num_base_bdevs": 4, 00:19:48.649 "num_base_bdevs_discovered": 1, 00:19:48.649 "num_base_bdevs_operational": 4, 00:19:48.649 "base_bdevs_list": [ 00:19:48.649 { 00:19:48.649 "name": "pt1", 00:19:48.649 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:48.649 "is_configured": true, 00:19:48.649 "data_offset": 2048, 00:19:48.649 "data_size": 63488 00:19:48.649 }, 00:19:48.649 { 00:19:48.649 "name": null, 00:19:48.649 "uuid": 
"00000000-0000-0000-0000-000000000002", 00:19:48.649 "is_configured": false, 00:19:48.649 "data_offset": 2048, 00:19:48.649 "data_size": 63488 00:19:48.649 }, 00:19:48.649 { 00:19:48.649 "name": null, 00:19:48.649 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:48.649 "is_configured": false, 00:19:48.649 "data_offset": 2048, 00:19:48.649 "data_size": 63488 00:19:48.649 }, 00:19:48.649 { 00:19:48.649 "name": null, 00:19:48.649 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:48.649 "is_configured": false, 00:19:48.649 "data_offset": 2048, 00:19:48.649 "data_size": 63488 00:19:48.649 } 00:19:48.649 ] 00:19:48.649 }' 00:19:48.649 17:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:48.649 17:31:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:49.219 17:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:19:49.219 17:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:49.219 17:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:49.219 [2024-07-15 17:32:00.481011] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:49.219 [2024-07-15 17:32:00.481047] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:49.219 [2024-07-15 17:32:00.481059] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe91df0 00:19:49.219 [2024-07-15 17:32:00.481065] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:49.219 [2024-07-15 17:32:00.481328] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:49.219 [2024-07-15 17:32:00.481343] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:49.219 [2024-07-15 17:32:00.481386] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:49.219 [2024-07-15 17:32:00.481399] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:49.219 pt2 00:19:49.220 17:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:49.220 17:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:49.220 17:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:49.481 [2024-07-15 17:32:00.673495] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:49.481 [2024-07-15 17:32:00.673516] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:49.481 [2024-07-15 17:32:00.673526] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x103ff90 00:19:49.481 [2024-07-15 17:32:00.673532] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:49.481 [2024-07-15 17:32:00.673768] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:49.481 [2024-07-15 17:32:00.673778] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:49.481 [2024-07-15 17:32:00.673813] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on 
bdev pt3 00:19:49.481 [2024-07-15 17:32:00.673824] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:49.481 pt3 00:19:49.481 17:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:49.481 17:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:49.481 17:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:49.742 [2024-07-15 17:32:00.861974] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:49.742 [2024-07-15 17:32:00.861991] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:49.742 [2024-07-15 17:32:00.862001] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe90b30 00:19:49.742 [2024-07-15 17:32:00.862007] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:49.742 [2024-07-15 17:32:00.862218] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:49.742 [2024-07-15 17:32:00.862228] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:49.742 [2024-07-15 17:32:00.862261] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:19:49.742 [2024-07-15 17:32:00.862272] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:49.742 [2024-07-15 17:32:00.862365] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x103baa0 00:19:49.742 [2024-07-15 17:32:00.862370] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:49.742 [2024-07-15 17:32:00.862509] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x103d260 00:19:49.742 [2024-07-15 17:32:00.862615] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x103baa0 00:19:49.742 [2024-07-15 17:32:00.862620] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x103baa0 00:19:49.742 [2024-07-15 17:32:00.862693] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:49.742 pt4 00:19:49.742 17:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:49.742 17:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:49.742 17:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:19:49.742 17:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:49.742 17:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:49.742 17:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:49.742 17:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:49.742 17:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:49.742 17:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:49.742 17:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:49.742 17:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
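The verify_raid_bdev_state checks traced here reduce to a few rpc.py/jq probes; the lines below are a minimal sketch of that pattern, not the test script itself (it assumes an SPDK target already listening on /var/tmp/spdk-raid.sock with raid_bdev1 and pt1 present, and the rpc/sock/info helper names are illustrative only):
# Minimal sketch of the state/metadata probes seen in this trace -- not bdev_raid.sh itself.
# Assumes a running SPDK target on /var/tmp/spdk-raid.sock; helper variable names are illustrative.
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
# Fetch the raid bdev record that verify_raid_bdev_state inspects.
info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
# Compare the fields checked in this trace: state, raid level, and discovered base bdevs.
[[ $(echo "$info" | jq -r .state) == online ]]
[[ $(echo "$info" | jq -r .raid_level) == raid1 ]]
[[ $(echo "$info" | jq -r .num_base_bdevs_discovered) -eq 4 ]]
# Per-base-bdev metadata probes mirror the jq .block_size / .md_size / .md_interleave / .dif_type calls above.
"$rpc" -s "$sock" bdev_get_bdevs -b pt1 | jq '.[] | {block_size, md_size, md_interleave, dif_type}'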
00:19:49.742 17:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:49.742 17:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:49.742 17:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:50.003 17:32:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:50.003 "name": "raid_bdev1", 00:19:50.003 "uuid": "7329e026-e7bf-4455-ae90-6ef2a65a1979", 00:19:50.003 "strip_size_kb": 0, 00:19:50.003 "state": "online", 00:19:50.003 "raid_level": "raid1", 00:19:50.003 "superblock": true, 00:19:50.003 "num_base_bdevs": 4, 00:19:50.003 "num_base_bdevs_discovered": 4, 00:19:50.003 "num_base_bdevs_operational": 4, 00:19:50.003 "base_bdevs_list": [ 00:19:50.003 { 00:19:50.003 "name": "pt1", 00:19:50.003 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:50.003 "is_configured": true, 00:19:50.003 "data_offset": 2048, 00:19:50.003 "data_size": 63488 00:19:50.003 }, 00:19:50.003 { 00:19:50.003 "name": "pt2", 00:19:50.003 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:50.003 "is_configured": true, 00:19:50.003 "data_offset": 2048, 00:19:50.003 "data_size": 63488 00:19:50.003 }, 00:19:50.003 { 00:19:50.003 "name": "pt3", 00:19:50.003 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:50.003 "is_configured": true, 00:19:50.003 "data_offset": 2048, 00:19:50.003 "data_size": 63488 00:19:50.003 }, 00:19:50.003 { 00:19:50.003 "name": "pt4", 00:19:50.003 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:50.003 "is_configured": true, 00:19:50.003 "data_offset": 2048, 00:19:50.003 "data_size": 63488 00:19:50.003 } 00:19:50.003 ] 00:19:50.003 }' 00:19:50.003 17:32:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:50.003 17:32:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:50.572 17:32:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:19:50.572 17:32:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:19:50.572 17:32:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:50.572 17:32:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:50.572 17:32:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:50.572 17:32:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:50.572 17:32:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:50.572 17:32:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:50.572 [2024-07-15 17:32:01.788653] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:50.572 17:32:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:50.572 "name": "raid_bdev1", 00:19:50.572 "aliases": [ 00:19:50.572 "7329e026-e7bf-4455-ae90-6ef2a65a1979" 00:19:50.572 ], 00:19:50.572 "product_name": "Raid Volume", 00:19:50.572 "block_size": 512, 00:19:50.572 "num_blocks": 63488, 00:19:50.572 "uuid": "7329e026-e7bf-4455-ae90-6ef2a65a1979", 00:19:50.572 "assigned_rate_limits": { 00:19:50.572 "rw_ios_per_sec": 0, 
00:19:50.572 "rw_mbytes_per_sec": 0, 00:19:50.572 "r_mbytes_per_sec": 0, 00:19:50.572 "w_mbytes_per_sec": 0 00:19:50.572 }, 00:19:50.573 "claimed": false, 00:19:50.573 "zoned": false, 00:19:50.573 "supported_io_types": { 00:19:50.573 "read": true, 00:19:50.573 "write": true, 00:19:50.573 "unmap": false, 00:19:50.573 "flush": false, 00:19:50.573 "reset": true, 00:19:50.573 "nvme_admin": false, 00:19:50.573 "nvme_io": false, 00:19:50.573 "nvme_io_md": false, 00:19:50.573 "write_zeroes": true, 00:19:50.573 "zcopy": false, 00:19:50.573 "get_zone_info": false, 00:19:50.573 "zone_management": false, 00:19:50.573 "zone_append": false, 00:19:50.573 "compare": false, 00:19:50.573 "compare_and_write": false, 00:19:50.573 "abort": false, 00:19:50.573 "seek_hole": false, 00:19:50.573 "seek_data": false, 00:19:50.573 "copy": false, 00:19:50.573 "nvme_iov_md": false 00:19:50.573 }, 00:19:50.573 "memory_domains": [ 00:19:50.573 { 00:19:50.573 "dma_device_id": "system", 00:19:50.573 "dma_device_type": 1 00:19:50.573 }, 00:19:50.573 { 00:19:50.573 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:50.573 "dma_device_type": 2 00:19:50.573 }, 00:19:50.573 { 00:19:50.573 "dma_device_id": "system", 00:19:50.573 "dma_device_type": 1 00:19:50.573 }, 00:19:50.573 { 00:19:50.573 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:50.573 "dma_device_type": 2 00:19:50.573 }, 00:19:50.573 { 00:19:50.573 "dma_device_id": "system", 00:19:50.573 "dma_device_type": 1 00:19:50.573 }, 00:19:50.573 { 00:19:50.573 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:50.573 "dma_device_type": 2 00:19:50.573 }, 00:19:50.573 { 00:19:50.573 "dma_device_id": "system", 00:19:50.573 "dma_device_type": 1 00:19:50.573 }, 00:19:50.573 { 00:19:50.573 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:50.573 "dma_device_type": 2 00:19:50.573 } 00:19:50.573 ], 00:19:50.573 "driver_specific": { 00:19:50.573 "raid": { 00:19:50.573 "uuid": "7329e026-e7bf-4455-ae90-6ef2a65a1979", 00:19:50.573 "strip_size_kb": 0, 00:19:50.573 "state": "online", 00:19:50.573 "raid_level": "raid1", 00:19:50.573 "superblock": true, 00:19:50.573 "num_base_bdevs": 4, 00:19:50.573 "num_base_bdevs_discovered": 4, 00:19:50.573 "num_base_bdevs_operational": 4, 00:19:50.573 "base_bdevs_list": [ 00:19:50.573 { 00:19:50.573 "name": "pt1", 00:19:50.573 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:50.573 "is_configured": true, 00:19:50.573 "data_offset": 2048, 00:19:50.573 "data_size": 63488 00:19:50.573 }, 00:19:50.573 { 00:19:50.573 "name": "pt2", 00:19:50.573 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:50.573 "is_configured": true, 00:19:50.573 "data_offset": 2048, 00:19:50.573 "data_size": 63488 00:19:50.573 }, 00:19:50.573 { 00:19:50.573 "name": "pt3", 00:19:50.573 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:50.573 "is_configured": true, 00:19:50.573 "data_offset": 2048, 00:19:50.573 "data_size": 63488 00:19:50.573 }, 00:19:50.573 { 00:19:50.573 "name": "pt4", 00:19:50.573 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:50.573 "is_configured": true, 00:19:50.573 "data_offset": 2048, 00:19:50.573 "data_size": 63488 00:19:50.573 } 00:19:50.573 ] 00:19:50.573 } 00:19:50.573 } 00:19:50.573 }' 00:19:50.573 17:32:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:50.573 17:32:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:19:50.573 pt2 00:19:50.573 pt3 00:19:50.573 pt4' 00:19:50.573 17:32:01 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:50.573 17:32:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:50.573 17:32:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:50.833 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:50.833 "name": "pt1", 00:19:50.833 "aliases": [ 00:19:50.833 "00000000-0000-0000-0000-000000000001" 00:19:50.833 ], 00:19:50.833 "product_name": "passthru", 00:19:50.833 "block_size": 512, 00:19:50.833 "num_blocks": 65536, 00:19:50.833 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:50.833 "assigned_rate_limits": { 00:19:50.833 "rw_ios_per_sec": 0, 00:19:50.833 "rw_mbytes_per_sec": 0, 00:19:50.833 "r_mbytes_per_sec": 0, 00:19:50.833 "w_mbytes_per_sec": 0 00:19:50.833 }, 00:19:50.833 "claimed": true, 00:19:50.833 "claim_type": "exclusive_write", 00:19:50.833 "zoned": false, 00:19:50.833 "supported_io_types": { 00:19:50.833 "read": true, 00:19:50.833 "write": true, 00:19:50.833 "unmap": true, 00:19:50.833 "flush": true, 00:19:50.833 "reset": true, 00:19:50.833 "nvme_admin": false, 00:19:50.833 "nvme_io": false, 00:19:50.833 "nvme_io_md": false, 00:19:50.833 "write_zeroes": true, 00:19:50.833 "zcopy": true, 00:19:50.833 "get_zone_info": false, 00:19:50.833 "zone_management": false, 00:19:50.833 "zone_append": false, 00:19:50.833 "compare": false, 00:19:50.833 "compare_and_write": false, 00:19:50.833 "abort": true, 00:19:50.833 "seek_hole": false, 00:19:50.833 "seek_data": false, 00:19:50.833 "copy": true, 00:19:50.833 "nvme_iov_md": false 00:19:50.833 }, 00:19:50.833 "memory_domains": [ 00:19:50.833 { 00:19:50.833 "dma_device_id": "system", 00:19:50.833 "dma_device_type": 1 00:19:50.833 }, 00:19:50.833 { 00:19:50.833 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:50.833 "dma_device_type": 2 00:19:50.833 } 00:19:50.833 ], 00:19:50.833 "driver_specific": { 00:19:50.833 "passthru": { 00:19:50.833 "name": "pt1", 00:19:50.834 "base_bdev_name": "malloc1" 00:19:50.834 } 00:19:50.834 } 00:19:50.834 }' 00:19:50.834 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:50.834 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:50.834 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:50.834 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:51.094 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:51.094 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:51.094 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:51.094 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:51.094 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:51.094 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:51.094 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:51.094 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:51.094 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:51.094 17:32:02 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:51.094 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:51.354 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:51.354 "name": "pt2", 00:19:51.354 "aliases": [ 00:19:51.354 "00000000-0000-0000-0000-000000000002" 00:19:51.354 ], 00:19:51.355 "product_name": "passthru", 00:19:51.355 "block_size": 512, 00:19:51.355 "num_blocks": 65536, 00:19:51.355 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:51.355 "assigned_rate_limits": { 00:19:51.355 "rw_ios_per_sec": 0, 00:19:51.355 "rw_mbytes_per_sec": 0, 00:19:51.355 "r_mbytes_per_sec": 0, 00:19:51.355 "w_mbytes_per_sec": 0 00:19:51.355 }, 00:19:51.355 "claimed": true, 00:19:51.355 "claim_type": "exclusive_write", 00:19:51.355 "zoned": false, 00:19:51.355 "supported_io_types": { 00:19:51.355 "read": true, 00:19:51.355 "write": true, 00:19:51.355 "unmap": true, 00:19:51.355 "flush": true, 00:19:51.355 "reset": true, 00:19:51.355 "nvme_admin": false, 00:19:51.355 "nvme_io": false, 00:19:51.355 "nvme_io_md": false, 00:19:51.355 "write_zeroes": true, 00:19:51.355 "zcopy": true, 00:19:51.355 "get_zone_info": false, 00:19:51.355 "zone_management": false, 00:19:51.355 "zone_append": false, 00:19:51.355 "compare": false, 00:19:51.355 "compare_and_write": false, 00:19:51.355 "abort": true, 00:19:51.355 "seek_hole": false, 00:19:51.355 "seek_data": false, 00:19:51.355 "copy": true, 00:19:51.355 "nvme_iov_md": false 00:19:51.355 }, 00:19:51.355 "memory_domains": [ 00:19:51.355 { 00:19:51.355 "dma_device_id": "system", 00:19:51.355 "dma_device_type": 1 00:19:51.355 }, 00:19:51.355 { 00:19:51.355 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:51.355 "dma_device_type": 2 00:19:51.355 } 00:19:51.355 ], 00:19:51.355 "driver_specific": { 00:19:51.355 "passthru": { 00:19:51.355 "name": "pt2", 00:19:51.355 "base_bdev_name": "malloc2" 00:19:51.355 } 00:19:51.355 } 00:19:51.355 }' 00:19:51.355 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:51.355 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:51.355 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:51.355 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:51.616 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:51.616 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:51.616 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:51.616 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:51.616 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:51.616 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:51.616 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:51.616 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:51.616 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:51.616 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:51.616 17:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:51.876 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:51.876 "name": "pt3", 00:19:51.876 "aliases": [ 00:19:51.876 "00000000-0000-0000-0000-000000000003" 00:19:51.876 ], 00:19:51.876 "product_name": "passthru", 00:19:51.876 "block_size": 512, 00:19:51.876 "num_blocks": 65536, 00:19:51.876 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:51.876 "assigned_rate_limits": { 00:19:51.876 "rw_ios_per_sec": 0, 00:19:51.876 "rw_mbytes_per_sec": 0, 00:19:51.876 "r_mbytes_per_sec": 0, 00:19:51.876 "w_mbytes_per_sec": 0 00:19:51.876 }, 00:19:51.876 "claimed": true, 00:19:51.876 "claim_type": "exclusive_write", 00:19:51.876 "zoned": false, 00:19:51.876 "supported_io_types": { 00:19:51.876 "read": true, 00:19:51.876 "write": true, 00:19:51.876 "unmap": true, 00:19:51.876 "flush": true, 00:19:51.876 "reset": true, 00:19:51.876 "nvme_admin": false, 00:19:51.876 "nvme_io": false, 00:19:51.876 "nvme_io_md": false, 00:19:51.876 "write_zeroes": true, 00:19:51.876 "zcopy": true, 00:19:51.876 "get_zone_info": false, 00:19:51.876 "zone_management": false, 00:19:51.876 "zone_append": false, 00:19:51.876 "compare": false, 00:19:51.876 "compare_and_write": false, 00:19:51.876 "abort": true, 00:19:51.876 "seek_hole": false, 00:19:51.876 "seek_data": false, 00:19:51.876 "copy": true, 00:19:51.876 "nvme_iov_md": false 00:19:51.876 }, 00:19:51.876 "memory_domains": [ 00:19:51.876 { 00:19:51.876 "dma_device_id": "system", 00:19:51.876 "dma_device_type": 1 00:19:51.876 }, 00:19:51.876 { 00:19:51.876 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:51.876 "dma_device_type": 2 00:19:51.876 } 00:19:51.876 ], 00:19:51.876 "driver_specific": { 00:19:51.876 "passthru": { 00:19:51.876 "name": "pt3", 00:19:51.876 "base_bdev_name": "malloc3" 00:19:51.876 } 00:19:51.876 } 00:19:51.876 }' 00:19:51.876 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:51.876 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:51.876 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:51.876 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:51.876 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:51.876 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:51.876 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:52.136 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:52.136 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:52.136 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:52.136 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:52.136 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:52.136 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:52.136 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:52.136 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq 
'.[]' 00:19:52.395 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:52.395 "name": "pt4", 00:19:52.395 "aliases": [ 00:19:52.395 "00000000-0000-0000-0000-000000000004" 00:19:52.395 ], 00:19:52.395 "product_name": "passthru", 00:19:52.395 "block_size": 512, 00:19:52.395 "num_blocks": 65536, 00:19:52.395 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:52.395 "assigned_rate_limits": { 00:19:52.395 "rw_ios_per_sec": 0, 00:19:52.395 "rw_mbytes_per_sec": 0, 00:19:52.395 "r_mbytes_per_sec": 0, 00:19:52.395 "w_mbytes_per_sec": 0 00:19:52.395 }, 00:19:52.395 "claimed": true, 00:19:52.395 "claim_type": "exclusive_write", 00:19:52.395 "zoned": false, 00:19:52.395 "supported_io_types": { 00:19:52.395 "read": true, 00:19:52.395 "write": true, 00:19:52.395 "unmap": true, 00:19:52.395 "flush": true, 00:19:52.395 "reset": true, 00:19:52.395 "nvme_admin": false, 00:19:52.395 "nvme_io": false, 00:19:52.395 "nvme_io_md": false, 00:19:52.395 "write_zeroes": true, 00:19:52.395 "zcopy": true, 00:19:52.395 "get_zone_info": false, 00:19:52.395 "zone_management": false, 00:19:52.395 "zone_append": false, 00:19:52.395 "compare": false, 00:19:52.395 "compare_and_write": false, 00:19:52.395 "abort": true, 00:19:52.395 "seek_hole": false, 00:19:52.395 "seek_data": false, 00:19:52.395 "copy": true, 00:19:52.395 "nvme_iov_md": false 00:19:52.395 }, 00:19:52.395 "memory_domains": [ 00:19:52.395 { 00:19:52.395 "dma_device_id": "system", 00:19:52.395 "dma_device_type": 1 00:19:52.395 }, 00:19:52.395 { 00:19:52.395 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:52.395 "dma_device_type": 2 00:19:52.395 } 00:19:52.395 ], 00:19:52.395 "driver_specific": { 00:19:52.395 "passthru": { 00:19:52.395 "name": "pt4", 00:19:52.395 "base_bdev_name": "malloc4" 00:19:52.395 } 00:19:52.395 } 00:19:52.395 }' 00:19:52.395 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:52.395 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:52.395 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:52.395 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:52.395 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:52.395 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:52.395 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:52.655 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:52.655 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:52.655 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:52.655 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:52.655 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:52.655 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:52.655 17:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:19:52.916 [2024-07-15 17:32:04.018295] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:52.916 17:32:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 
7329e026-e7bf-4455-ae90-6ef2a65a1979 '!=' 7329e026-e7bf-4455-ae90-6ef2a65a1979 ']' 00:19:52.916 17:32:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:19:52.916 17:32:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:52.916 17:32:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:19:52.916 17:32:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:19:52.916 [2024-07-15 17:32:04.210551] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:19:53.176 17:32:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:53.176 17:32:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:53.176 17:32:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:53.176 17:32:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:53.176 17:32:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:53.176 17:32:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:53.176 17:32:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:53.176 17:32:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:53.176 17:32:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:53.176 17:32:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:53.176 17:32:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:53.176 17:32:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:53.176 17:32:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:53.176 "name": "raid_bdev1", 00:19:53.176 "uuid": "7329e026-e7bf-4455-ae90-6ef2a65a1979", 00:19:53.177 "strip_size_kb": 0, 00:19:53.177 "state": "online", 00:19:53.177 "raid_level": "raid1", 00:19:53.177 "superblock": true, 00:19:53.177 "num_base_bdevs": 4, 00:19:53.177 "num_base_bdevs_discovered": 3, 00:19:53.177 "num_base_bdevs_operational": 3, 00:19:53.177 "base_bdevs_list": [ 00:19:53.177 { 00:19:53.177 "name": null, 00:19:53.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:53.177 "is_configured": false, 00:19:53.177 "data_offset": 2048, 00:19:53.177 "data_size": 63488 00:19:53.177 }, 00:19:53.177 { 00:19:53.177 "name": "pt2", 00:19:53.177 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:53.177 "is_configured": true, 00:19:53.177 "data_offset": 2048, 00:19:53.177 "data_size": 63488 00:19:53.177 }, 00:19:53.177 { 00:19:53.177 "name": "pt3", 00:19:53.177 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:53.177 "is_configured": true, 00:19:53.177 "data_offset": 2048, 00:19:53.177 "data_size": 63488 00:19:53.177 }, 00:19:53.177 { 00:19:53.177 "name": "pt4", 00:19:53.177 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:53.177 "is_configured": true, 00:19:53.177 "data_offset": 2048, 00:19:53.177 "data_size": 63488 00:19:53.177 } 00:19:53.177 ] 00:19:53.177 }' 00:19:53.177 17:32:04 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:53.177 17:32:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:53.748 17:32:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:54.008 [2024-07-15 17:32:05.136869] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:54.008 [2024-07-15 17:32:05.136886] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:54.008 [2024-07-15 17:32:05.136921] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:54.008 [2024-07-15 17:32:05.136973] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:54.008 [2024-07-15 17:32:05.136980] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x103baa0 name raid_bdev1, state offline 00:19:54.008 17:32:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:54.008 17:32:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:19:54.269 17:32:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:19:54.269 17:32:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:19:54.269 17:32:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:19:54.269 17:32:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:19:54.269 17:32:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:54.269 17:32:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:19:54.269 17:32:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:19:54.269 17:32:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:54.530 17:32:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:19:54.530 17:32:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:19:54.530 17:32:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:19:54.791 17:32:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:19:54.791 17:32:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:19:54.791 17:32:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:19:54.791 17:32:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:19:54.791 17:32:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:55.052 [2024-07-15 17:32:06.095244] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:55.052 [2024-07-15 17:32:06.095272] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:55.052 [2024-07-15 17:32:06.095281] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1042e50 00:19:55.052 [2024-07-15 17:32:06.095287] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:55.052 [2024-07-15 17:32:06.096554] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:55.052 [2024-07-15 17:32:06.096574] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:55.052 [2024-07-15 17:32:06.096620] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:55.052 [2024-07-15 17:32:06.096639] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:55.052 pt2 00:19:55.052 17:32:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:19:55.052 17:32:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:55.052 17:32:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:55.052 17:32:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:55.052 17:32:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:55.052 17:32:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:55.052 17:32:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:55.052 17:32:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:55.052 17:32:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:55.052 17:32:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:55.052 17:32:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:55.052 17:32:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:55.052 17:32:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:55.052 "name": "raid_bdev1", 00:19:55.052 "uuid": "7329e026-e7bf-4455-ae90-6ef2a65a1979", 00:19:55.052 "strip_size_kb": 0, 00:19:55.052 "state": "configuring", 00:19:55.052 "raid_level": "raid1", 00:19:55.052 "superblock": true, 00:19:55.052 "num_base_bdevs": 4, 00:19:55.052 "num_base_bdevs_discovered": 1, 00:19:55.052 "num_base_bdevs_operational": 3, 00:19:55.052 "base_bdevs_list": [ 00:19:55.052 { 00:19:55.052 "name": null, 00:19:55.052 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:55.052 "is_configured": false, 00:19:55.052 "data_offset": 2048, 00:19:55.052 "data_size": 63488 00:19:55.052 }, 00:19:55.052 { 00:19:55.052 "name": "pt2", 00:19:55.052 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:55.052 "is_configured": true, 00:19:55.052 "data_offset": 2048, 00:19:55.052 "data_size": 63488 00:19:55.052 }, 00:19:55.052 { 00:19:55.052 "name": null, 00:19:55.052 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:55.052 "is_configured": false, 00:19:55.052 "data_offset": 2048, 00:19:55.052 "data_size": 63488 00:19:55.052 }, 00:19:55.052 { 00:19:55.052 "name": null, 00:19:55.052 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:55.052 "is_configured": 
false, 00:19:55.052 "data_offset": 2048, 00:19:55.052 "data_size": 63488 00:19:55.052 } 00:19:55.052 ] 00:19:55.052 }' 00:19:55.052 17:32:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:55.052 17:32:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:55.622 17:32:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:19:55.622 17:32:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:19:55.622 17:32:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:55.883 [2024-07-15 17:32:07.021592] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:55.883 [2024-07-15 17:32:07.021621] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:55.883 [2024-07-15 17:32:07.021631] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1042990 00:19:55.883 [2024-07-15 17:32:07.021638] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:55.883 [2024-07-15 17:32:07.021896] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:55.883 [2024-07-15 17:32:07.021906] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:55.883 [2024-07-15 17:32:07.021947] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:19:55.883 [2024-07-15 17:32:07.021959] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:55.883 pt3 00:19:55.883 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:19:55.883 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:55.883 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:55.883 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:55.883 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:55.883 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:55.883 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:55.883 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:55.883 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:55.883 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:55.883 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:55.883 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:56.143 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:56.143 "name": "raid_bdev1", 00:19:56.143 "uuid": "7329e026-e7bf-4455-ae90-6ef2a65a1979", 00:19:56.143 "strip_size_kb": 0, 00:19:56.143 "state": "configuring", 00:19:56.143 "raid_level": "raid1", 00:19:56.143 "superblock": true, 00:19:56.143 "num_base_bdevs": 
4, 00:19:56.143 "num_base_bdevs_discovered": 2, 00:19:56.144 "num_base_bdevs_operational": 3, 00:19:56.144 "base_bdevs_list": [ 00:19:56.144 { 00:19:56.144 "name": null, 00:19:56.144 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:56.144 "is_configured": false, 00:19:56.144 "data_offset": 2048, 00:19:56.144 "data_size": 63488 00:19:56.144 }, 00:19:56.144 { 00:19:56.144 "name": "pt2", 00:19:56.144 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:56.144 "is_configured": true, 00:19:56.144 "data_offset": 2048, 00:19:56.144 "data_size": 63488 00:19:56.144 }, 00:19:56.144 { 00:19:56.144 "name": "pt3", 00:19:56.144 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:56.144 "is_configured": true, 00:19:56.144 "data_offset": 2048, 00:19:56.144 "data_size": 63488 00:19:56.144 }, 00:19:56.144 { 00:19:56.144 "name": null, 00:19:56.144 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:56.144 "is_configured": false, 00:19:56.144 "data_offset": 2048, 00:19:56.144 "data_size": 63488 00:19:56.144 } 00:19:56.144 ] 00:19:56.144 }' 00:19:56.144 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:56.144 17:32:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:56.715 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:19:56.715 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:19:56.715 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=3 00:19:56.715 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:56.715 [2024-07-15 17:32:07.947934] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:56.715 [2024-07-15 17:32:07.947960] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:56.715 [2024-07-15 17:32:07.947970] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe906a0 00:19:56.715 [2024-07-15 17:32:07.947976] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:56.715 [2024-07-15 17:32:07.948224] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:56.715 [2024-07-15 17:32:07.948234] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:56.715 [2024-07-15 17:32:07.948272] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:19:56.715 [2024-07-15 17:32:07.948284] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:56.715 [2024-07-15 17:32:07.948370] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x103fa50 00:19:56.715 [2024-07-15 17:32:07.948376] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:56.715 [2024-07-15 17:32:07.948508] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x103fd30 00:19:56.715 [2024-07-15 17:32:07.948610] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x103fa50 00:19:56.715 [2024-07-15 17:32:07.948615] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x103fa50 00:19:56.715 [2024-07-15 17:32:07.948686] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:56.715 pt4 00:19:56.715 17:32:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:56.715 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:56.715 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:56.715 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:56.715 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:56.715 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:56.715 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:56.715 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:56.715 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:56.715 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:56.715 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.715 17:32:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:56.976 17:32:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:56.976 "name": "raid_bdev1", 00:19:56.976 "uuid": "7329e026-e7bf-4455-ae90-6ef2a65a1979", 00:19:56.976 "strip_size_kb": 0, 00:19:56.976 "state": "online", 00:19:56.976 "raid_level": "raid1", 00:19:56.976 "superblock": true, 00:19:56.976 "num_base_bdevs": 4, 00:19:56.976 "num_base_bdevs_discovered": 3, 00:19:56.976 "num_base_bdevs_operational": 3, 00:19:56.976 "base_bdevs_list": [ 00:19:56.976 { 00:19:56.976 "name": null, 00:19:56.976 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:56.976 "is_configured": false, 00:19:56.976 "data_offset": 2048, 00:19:56.976 "data_size": 63488 00:19:56.976 }, 00:19:56.976 { 00:19:56.976 "name": "pt2", 00:19:56.976 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:56.976 "is_configured": true, 00:19:56.976 "data_offset": 2048, 00:19:56.976 "data_size": 63488 00:19:56.976 }, 00:19:56.976 { 00:19:56.976 "name": "pt3", 00:19:56.976 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:56.976 "is_configured": true, 00:19:56.976 "data_offset": 2048, 00:19:56.976 "data_size": 63488 00:19:56.976 }, 00:19:56.976 { 00:19:56.976 "name": "pt4", 00:19:56.976 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:56.976 "is_configured": true, 00:19:56.976 "data_offset": 2048, 00:19:56.976 "data_size": 63488 00:19:56.976 } 00:19:56.976 ] 00:19:56.976 }' 00:19:56.976 17:32:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:56.976 17:32:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:57.630 17:32:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:57.630 [2024-07-15 17:32:08.866240] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:57.630 [2024-07-15 17:32:08.866254] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:57.630 [2024-07-15 17:32:08.866287] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: 
raid_bdev_destruct 00:19:57.630 [2024-07-15 17:32:08.866337] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:57.630 [2024-07-15 17:32:08.866344] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x103fa50 name raid_bdev1, state offline 00:19:57.630 17:32:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:57.630 17:32:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:19:57.889 17:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:19:57.889 17:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:19:57.889 17:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 4 -gt 2 ']' 00:19:57.889 17:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=3 00:19:57.889 17:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:19:58.148 17:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:58.409 [2024-07-15 17:32:09.503825] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:58.409 [2024-07-15 17:32:09.503851] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:58.409 [2024-07-15 17:32:09.503861] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1042730 00:19:58.409 [2024-07-15 17:32:09.503866] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:58.409 [2024-07-15 17:32:09.505116] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:58.409 [2024-07-15 17:32:09.505135] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:58.409 [2024-07-15 17:32:09.505177] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:19:58.409 [2024-07-15 17:32:09.505195] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:58.409 [2024-07-15 17:32:09.505266] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:19:58.409 [2024-07-15 17:32:09.505274] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:58.409 [2024-07-15 17:32:09.505283] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1040d40 name raid_bdev1, state configuring 00:19:58.409 [2024-07-15 17:32:09.505297] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:58.409 [2024-07-15 17:32:09.505353] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:58.409 pt1 00:19:58.409 17:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 4 -gt 2 ']' 00:19:58.409 17:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:19:58.409 17:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:58.409 17:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:19:58.409 17:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:58.409 17:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:58.409 17:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:58.409 17:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:58.409 17:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:58.409 17:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:58.409 17:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:58.409 17:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:58.409 17:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:58.409 17:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:58.409 "name": "raid_bdev1", 00:19:58.409 "uuid": "7329e026-e7bf-4455-ae90-6ef2a65a1979", 00:19:58.409 "strip_size_kb": 0, 00:19:58.409 "state": "configuring", 00:19:58.409 "raid_level": "raid1", 00:19:58.409 "superblock": true, 00:19:58.409 "num_base_bdevs": 4, 00:19:58.409 "num_base_bdevs_discovered": 2, 00:19:58.409 "num_base_bdevs_operational": 3, 00:19:58.409 "base_bdevs_list": [ 00:19:58.409 { 00:19:58.409 "name": null, 00:19:58.409 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:58.409 "is_configured": false, 00:19:58.409 "data_offset": 2048, 00:19:58.409 "data_size": 63488 00:19:58.409 }, 00:19:58.409 { 00:19:58.409 "name": "pt2", 00:19:58.409 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:58.409 "is_configured": true, 00:19:58.409 "data_offset": 2048, 00:19:58.409 "data_size": 63488 00:19:58.409 }, 00:19:58.409 { 00:19:58.409 "name": "pt3", 00:19:58.409 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:58.409 "is_configured": true, 00:19:58.409 "data_offset": 2048, 00:19:58.409 "data_size": 63488 00:19:58.409 }, 00:19:58.409 { 00:19:58.409 "name": null, 00:19:58.409 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:58.409 "is_configured": false, 00:19:58.409 "data_offset": 2048, 00:19:58.409 "data_size": 63488 00:19:58.409 } 00:19:58.409 ] 00:19:58.409 }' 00:19:58.409 17:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:58.409 17:32:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:58.980 17:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:19:58.980 17:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:19:59.240 17:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:19:59.240 17:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:59.501 [2024-07-15 17:32:10.542464] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:59.501 [2024-07-15 17:32:10.542496] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:59.501 [2024-07-15 17:32:10.542507] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10423c0 00:19:59.501 [2024-07-15 17:32:10.542513] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:59.501 [2024-07-15 17:32:10.542781] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:59.501 [2024-07-15 17:32:10.542791] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:59.501 [2024-07-15 17:32:10.542833] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:19:59.501 [2024-07-15 17:32:10.542846] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:59.501 [2024-07-15 17:32:10.542931] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe90c20 00:19:59.501 [2024-07-15 17:32:10.542936] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:59.501 [2024-07-15 17:32:10.543068] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1040530 00:19:59.501 [2024-07-15 17:32:10.543171] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe90c20 00:19:59.501 [2024-07-15 17:32:10.543176] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe90c20 00:19:59.501 [2024-07-15 17:32:10.543246] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:59.501 pt4 00:19:59.501 17:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:59.501 17:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:59.501 17:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:59.501 17:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:59.501 17:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:59.501 17:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:59.501 17:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:59.501 17:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:59.501 17:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:59.501 17:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:59.501 17:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.501 17:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:59.501 17:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:59.501 "name": "raid_bdev1", 00:19:59.501 "uuid": "7329e026-e7bf-4455-ae90-6ef2a65a1979", 00:19:59.501 "strip_size_kb": 0, 00:19:59.501 "state": "online", 00:19:59.501 "raid_level": "raid1", 00:19:59.501 "superblock": true, 00:19:59.501 "num_base_bdevs": 4, 00:19:59.501 "num_base_bdevs_discovered": 3, 00:19:59.501 "num_base_bdevs_operational": 3, 00:19:59.501 "base_bdevs_list": [ 00:19:59.501 { 00:19:59.501 "name": null, 00:19:59.501 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:19:59.501 "is_configured": false, 00:19:59.501 "data_offset": 2048, 00:19:59.501 "data_size": 63488 00:19:59.501 }, 00:19:59.501 { 00:19:59.501 "name": "pt2", 00:19:59.501 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:59.501 "is_configured": true, 00:19:59.501 "data_offset": 2048, 00:19:59.501 "data_size": 63488 00:19:59.501 }, 00:19:59.501 { 00:19:59.501 "name": "pt3", 00:19:59.501 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:59.501 "is_configured": true, 00:19:59.501 "data_offset": 2048, 00:19:59.501 "data_size": 63488 00:19:59.501 }, 00:19:59.501 { 00:19:59.501 "name": "pt4", 00:19:59.501 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:59.501 "is_configured": true, 00:19:59.501 "data_offset": 2048, 00:19:59.501 "data_size": 63488 00:19:59.501 } 00:19:59.501 ] 00:19:59.501 }' 00:19:59.501 17:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:59.501 17:32:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:00.072 17:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:20:00.072 17:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:20:00.333 17:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:20:00.333 17:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:00.333 17:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:20:00.594 [2024-07-15 17:32:11.645472] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:00.594 17:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 7329e026-e7bf-4455-ae90-6ef2a65a1979 '!=' 7329e026-e7bf-4455-ae90-6ef2a65a1979 ']' 00:20:00.594 17:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2848858 00:20:00.594 17:32:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2848858 ']' 00:20:00.594 17:32:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2848858 00:20:00.594 17:32:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:20:00.594 17:32:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:00.594 17:32:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2848858 00:20:00.594 17:32:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:00.594 17:32:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:00.594 17:32:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2848858' 00:20:00.594 killing process with pid 2848858 00:20:00.594 17:32:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2848858 00:20:00.594 [2024-07-15 17:32:11.734780] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:00.594 [2024-07-15 17:32:11.734817] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:00.594 [2024-07-15 17:32:11.734863] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:00.594 [2024-07-15 17:32:11.734870] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe90c20 name raid_bdev1, state offline 00:20:00.594 17:32:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2848858 00:20:00.594 [2024-07-15 17:32:11.755149] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:00.594 17:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:20:00.594 00:20:00.594 real 0m21.379s 00:20:00.594 user 0m39.882s 00:20:00.594 sys 0m3.217s 00:20:00.594 17:32:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:00.594 17:32:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:00.594 ************************************ 00:20:00.594 END TEST raid_superblock_test 00:20:00.594 ************************************ 00:20:00.855 17:32:11 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:00.855 17:32:11 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:20:00.855 17:32:11 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:00.855 17:32:11 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:00.855 17:32:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:00.855 ************************************ 00:20:00.855 START TEST raid_read_error_test 00:20:00.855 ************************************ 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 read 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 
'BaseBdev3' 'BaseBdev4') 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.SodfSJhYUH 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2852996 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2852996 /var/tmp/spdk-raid.sock 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2852996 ']' 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:00.855 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:00.855 17:32:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:00.855 [2024-07-15 17:32:12.019584] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
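Note: the trace above has just launched bdevperf against /var/tmp/spdk-raid.sock in wait-for-RPC mode (-z); the entries that follow build the RAID1 target it will exercise. Below is a minimal sketch of that construction, reusing only the rpc.py calls visible in this trace — the loop, the rpc shorthand and the exact ordering are illustrative, not the test script itself.

# Sketch only: mirrors the RPC sequence visible in the surrounding trace.
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

for i in 1 2 3 4; do
  # 32 MB malloc bdev with 512-byte blocks, wrapped in an error-injection
  # bdev (EE_*), then a passthru bdev that the raid module can claim.
  $rpc bdev_malloc_create 32 512 -b "BaseBdev${i}_malloc"
  $rpc bdev_error_create "BaseBdev${i}_malloc"
  $rpc bdev_passthru_create -b "EE_BaseBdev${i}_malloc" -p "BaseBdev${i}"
done

# Assemble the four passthru bdevs into a RAID1 bdev with an on-disk superblock (-s).
$rpc bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s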
00:20:00.855 [2024-07-15 17:32:12.019630] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2852996 ] 00:20:00.855 [2024-07-15 17:32:12.105937] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:01.115 [2024-07-15 17:32:12.168281] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:01.115 [2024-07-15 17:32:12.207701] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:01.115 [2024-07-15 17:32:12.207727] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:01.684 17:32:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:01.684 17:32:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:20:01.685 17:32:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:01.685 17:32:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:01.945 BaseBdev1_malloc 00:20:01.945 17:32:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:01.945 true 00:20:01.945 17:32:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:02.205 [2024-07-15 17:32:13.402154] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:02.205 [2024-07-15 17:32:13.402183] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:02.205 [2024-07-15 17:32:13.402194] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1017b50 00:20:02.205 [2024-07-15 17:32:13.402201] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:02.205 [2024-07-15 17:32:13.403536] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:02.205 [2024-07-15 17:32:13.403555] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:02.205 BaseBdev1 00:20:02.205 17:32:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:02.205 17:32:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:02.465 BaseBdev2_malloc 00:20:02.465 17:32:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:02.725 true 00:20:02.725 17:32:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:02.725 [2024-07-15 17:32:13.973390] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:02.725 [2024-07-15 17:32:13.973418] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:02.725 [2024-07-15 17:32:13.973429] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xffbea0 00:20:02.725 [2024-07-15 17:32:13.973435] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:02.725 [2024-07-15 17:32:13.974603] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:02.725 [2024-07-15 17:32:13.974621] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:02.725 BaseBdev2 00:20:02.725 17:32:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:02.725 17:32:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:02.986 BaseBdev3_malloc 00:20:02.986 17:32:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:03.247 true 00:20:03.247 17:32:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:03.247 [2024-07-15 17:32:14.540497] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:03.247 [2024-07-15 17:32:14.540523] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:03.247 [2024-07-15 17:32:14.540533] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xffffb0 00:20:03.247 [2024-07-15 17:32:14.540540] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:03.247 [2024-07-15 17:32:14.541701] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:03.247 [2024-07-15 17:32:14.541726] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:03.507 BaseBdev3 00:20:03.507 17:32:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:03.507 17:32:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:03.507 BaseBdev4_malloc 00:20:03.507 17:32:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:20:03.766 true 00:20:03.766 17:32:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:20:04.027 [2024-07-15 17:32:15.099847] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:20:04.027 [2024-07-15 17:32:15.099875] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:04.027 [2024-07-15 17:32:15.099886] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1001980 00:20:04.027 [2024-07-15 17:32:15.099893] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:04.027 [2024-07-15 17:32:15.101070] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:20:04.027 [2024-07-15 17:32:15.101088] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:04.027 BaseBdev4 00:20:04.027 17:32:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:20:04.027 [2024-07-15 17:32:15.292359] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:04.027 [2024-07-15 17:32:15.293366] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:04.027 [2024-07-15 17:32:15.293418] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:04.027 [2024-07-15 17:32:15.293463] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:04.027 [2024-07-15 17:32:15.293640] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x10014e0 00:20:04.027 [2024-07-15 17:32:15.293647] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:04.027 [2024-07-15 17:32:15.293798] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe63210 00:20:04.027 [2024-07-15 17:32:15.293918] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10014e0 00:20:04.027 [2024-07-15 17:32:15.293924] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x10014e0 00:20:04.027 [2024-07-15 17:32:15.293998] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:04.027 17:32:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:20:04.027 17:32:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:04.027 17:32:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:04.027 17:32:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:04.027 17:32:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:04.027 17:32:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:04.027 17:32:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:04.027 17:32:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:04.027 17:32:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:04.027 17:32:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:04.027 17:32:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:04.027 17:32:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:04.288 17:32:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:04.288 "name": "raid_bdev1", 00:20:04.288 "uuid": "a6c60afc-d4cd-4992-a2f2-aef02e42291e", 00:20:04.288 "strip_size_kb": 0, 00:20:04.288 "state": "online", 00:20:04.288 "raid_level": "raid1", 00:20:04.288 "superblock": true, 00:20:04.288 "num_base_bdevs": 4, 00:20:04.288 "num_base_bdevs_discovered": 4, 00:20:04.288 
"num_base_bdevs_operational": 4, 00:20:04.288 "base_bdevs_list": [ 00:20:04.288 { 00:20:04.288 "name": "BaseBdev1", 00:20:04.288 "uuid": "f90d87de-e857-5cec-8581-6bdfe50d3d31", 00:20:04.288 "is_configured": true, 00:20:04.288 "data_offset": 2048, 00:20:04.288 "data_size": 63488 00:20:04.288 }, 00:20:04.288 { 00:20:04.288 "name": "BaseBdev2", 00:20:04.288 "uuid": "535a7340-fbda-546e-976e-df676ba670dd", 00:20:04.288 "is_configured": true, 00:20:04.288 "data_offset": 2048, 00:20:04.288 "data_size": 63488 00:20:04.288 }, 00:20:04.288 { 00:20:04.288 "name": "BaseBdev3", 00:20:04.288 "uuid": "fad5e14a-092e-51c6-a560-fe36e1906dba", 00:20:04.288 "is_configured": true, 00:20:04.288 "data_offset": 2048, 00:20:04.288 "data_size": 63488 00:20:04.288 }, 00:20:04.288 { 00:20:04.288 "name": "BaseBdev4", 00:20:04.288 "uuid": "8d4bf809-cfb6-58da-90f4-fba595fc804b", 00:20:04.288 "is_configured": true, 00:20:04.288 "data_offset": 2048, 00:20:04.288 "data_size": 63488 00:20:04.288 } 00:20:04.288 ] 00:20:04.288 }' 00:20:04.288 17:32:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:04.288 17:32:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:04.859 17:32:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:20:04.859 17:32:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:04.859 [2024-07-15 17:32:16.126673] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1006a50 00:20:05.799 17:32:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:20:06.058 17:32:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:20:06.058 17:32:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:20:06.058 17:32:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:20:06.058 17:32:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:20:06.058 17:32:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:20:06.058 17:32:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:06.058 17:32:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:06.058 17:32:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:06.058 17:32:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:06.058 17:32:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:06.058 17:32:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:06.058 17:32:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:06.058 17:32:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:06.058 17:32:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:06.058 17:32:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:06.058 17:32:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:06.317 17:32:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:06.317 "name": "raid_bdev1", 00:20:06.317 "uuid": "a6c60afc-d4cd-4992-a2f2-aef02e42291e", 00:20:06.317 "strip_size_kb": 0, 00:20:06.317 "state": "online", 00:20:06.317 "raid_level": "raid1", 00:20:06.317 "superblock": true, 00:20:06.317 "num_base_bdevs": 4, 00:20:06.317 "num_base_bdevs_discovered": 4, 00:20:06.317 "num_base_bdevs_operational": 4, 00:20:06.317 "base_bdevs_list": [ 00:20:06.317 { 00:20:06.317 "name": "BaseBdev1", 00:20:06.317 "uuid": "f90d87de-e857-5cec-8581-6bdfe50d3d31", 00:20:06.317 "is_configured": true, 00:20:06.317 "data_offset": 2048, 00:20:06.317 "data_size": 63488 00:20:06.317 }, 00:20:06.317 { 00:20:06.317 "name": "BaseBdev2", 00:20:06.317 "uuid": "535a7340-fbda-546e-976e-df676ba670dd", 00:20:06.317 "is_configured": true, 00:20:06.317 "data_offset": 2048, 00:20:06.317 "data_size": 63488 00:20:06.317 }, 00:20:06.317 { 00:20:06.317 "name": "BaseBdev3", 00:20:06.317 "uuid": "fad5e14a-092e-51c6-a560-fe36e1906dba", 00:20:06.317 "is_configured": true, 00:20:06.317 "data_offset": 2048, 00:20:06.317 "data_size": 63488 00:20:06.317 }, 00:20:06.317 { 00:20:06.317 "name": "BaseBdev4", 00:20:06.317 "uuid": "8d4bf809-cfb6-58da-90f4-fba595fc804b", 00:20:06.317 "is_configured": true, 00:20:06.317 "data_offset": 2048, 00:20:06.317 "data_size": 63488 00:20:06.317 } 00:20:06.317 ] 00:20:06.317 }' 00:20:06.317 17:32:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:06.317 17:32:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:06.885 17:32:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:06.885 [2024-07-15 17:32:18.181259] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:06.885 [2024-07-15 17:32:18.181292] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:07.143 [2024-07-15 17:32:18.183942] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:07.143 [2024-07-15 17:32:18.183976] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:07.143 [2024-07-15 17:32:18.184067] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:07.143 [2024-07-15 17:32:18.184073] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10014e0 name raid_bdev1, state offline 00:20:07.143 0 00:20:07.143 17:32:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2852996 00:20:07.143 17:32:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2852996 ']' 00:20:07.143 17:32:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2852996 00:20:07.143 17:32:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:20:07.143 17:32:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:07.143 17:32:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2852996 00:20:07.143 17:32:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:20:07.143 17:32:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:07.143 17:32:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2852996' 00:20:07.143 killing process with pid 2852996 00:20:07.143 17:32:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2852996 00:20:07.143 [2024-07-15 17:32:18.269919] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:07.143 17:32:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2852996 00:20:07.143 [2024-07-15 17:32:18.287038] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:07.143 17:32:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.SodfSJhYUH 00:20:07.143 17:32:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:20:07.144 17:32:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:20:07.144 17:32:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:20:07.144 17:32:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:20:07.144 17:32:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:07.144 17:32:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:20:07.144 17:32:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:20:07.144 00:20:07.144 real 0m6.470s 00:20:07.144 user 0m10.447s 00:20:07.144 sys 0m0.892s 00:20:07.144 17:32:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:07.144 17:32:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:07.144 ************************************ 00:20:07.144 END TEST raid_read_error_test 00:20:07.144 ************************************ 00:20:07.402 17:32:18 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:07.402 17:32:18 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:20:07.402 17:32:18 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:07.402 17:32:18 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:07.402 17:32:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:07.402 ************************************ 00:20:07.402 START TEST raid_write_error_test 00:20:07.402 ************************************ 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 write 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 
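Note: just above, the read-error run is graded — the per-second failure figure for raid_bdev1 is pulled out of the bdevperf log, and because raid1 is a redundant level (has_redundancy returns 0) it has to be exactly 0.00. A compressed sketch of that criterion, reusing the grep/awk pipeline and the log path shown in the trace; the write-error run starting in the surrounding entries grades its own bdevperf log (/raidtest/tmp.VlrrPdvTqz) the same way.

# Sketch of the read-error pass criterion applied above.
bdevperf_log=/raidtest/tmp.SodfSJhYUH   # mktemp result from this run
fail_per_s=$(grep -v Job "$bdevperf_log" | grep raid_bdev1 | awk '{print $6}')

# raid1 is redundant, so the injected read error on EE_BaseBdev1_malloc has to
# be absorbed by the mirror: bdevperf must report 0.00 failed I/O per second.
[[ $fail_per_s == "0.00" ]]

Column 6 of the non-header raid_bdev1 line in the bdevperf output is what the test treats as the failure rate, which is why the awk '{print $6}' selection appears in the trace.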
00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.VlrrPdvTqz 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2854216 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2854216 /var/tmp/spdk-raid.sock 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2854216 ']' 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:07.402 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
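Note: as in the read run above, bdevperf was started with -z and idles on /var/tmp/spdk-raid.sock until the test has rebuilt the RAID1 stack over the error-injectable base bdevs; the workload is then kicked off over RPC and a failure is injected while I/O is in flight. A hedged sketch of that hand-off follows, using the bdevperf.py helper and the error-injection RPC seen earlier in the trace — the 'write failure' argument is an assumption based on error_io_type=write, mirroring the 'read failure' call shown for the read test.

# Sketch: drive the already-running bdevperf instance over its RPC socket.
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
SOCK=/var/tmp/spdk-raid.sock

# Start the queued randrw workload in the background so a failure can be
# injected while I/O is in flight, as the read test did above.
"$SPDK/examples/bdev/bdevperf/bdevperf.py" -s "$SOCK" perform_tests &
sleep 1

# Assumed write-side counterpart of the 'read failure' injection shown earlier.
"$SPDK/scripts/rpc.py" -s "$SOCK" bdev_error_inject_error EE_BaseBdev1_malloc write failure

wait   # let the workload drain before the bdevperf log is inspected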
00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:07.402 17:32:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:07.402 [2024-07-15 17:32:18.566585] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:20:07.402 [2024-07-15 17:32:18.566640] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2854216 ] 00:20:07.402 [2024-07-15 17:32:18.654579] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:07.661 [2024-07-15 17:32:18.723177] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:07.661 [2024-07-15 17:32:18.768802] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:07.661 [2024-07-15 17:32:18.768826] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:08.229 17:32:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:08.229 17:32:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:20:08.229 17:32:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:08.229 17:32:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:08.487 BaseBdev1_malloc 00:20:08.487 17:32:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:08.487 true 00:20:08.745 17:32:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:08.745 [2024-07-15 17:32:19.951861] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:08.745 [2024-07-15 17:32:19.951891] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:08.745 [2024-07-15 17:32:19.951902] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x117db50 00:20:08.745 [2024-07-15 17:32:19.951909] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:08.745 [2024-07-15 17:32:19.953218] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:08.746 [2024-07-15 17:32:19.953237] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:08.746 BaseBdev1 00:20:08.746 17:32:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:08.746 17:32:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:09.004 BaseBdev2_malloc 00:20:09.004 17:32:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:09.264 true 00:20:09.264 17:32:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:09.264 [2024-07-15 17:32:20.527167] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:09.264 [2024-07-15 17:32:20.527196] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:09.264 [2024-07-15 17:32:20.527207] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1161ea0 00:20:09.264 [2024-07-15 17:32:20.527214] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:09.264 [2024-07-15 17:32:20.528414] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:09.264 [2024-07-15 17:32:20.528433] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:09.264 BaseBdev2 00:20:09.264 17:32:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:09.264 17:32:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:09.522 BaseBdev3_malloc 00:20:09.522 17:32:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:09.780 true 00:20:09.780 17:32:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:10.039 [2024-07-15 17:32:21.106538] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:10.040 [2024-07-15 17:32:21.106564] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:10.040 [2024-07-15 17:32:21.106578] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1165fb0 00:20:10.040 [2024-07-15 17:32:21.106584] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:10.040 [2024-07-15 17:32:21.107786] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:10.040 [2024-07-15 17:32:21.107805] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:10.040 BaseBdev3 00:20:10.040 17:32:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:10.040 17:32:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:10.040 BaseBdev4_malloc 00:20:10.040 17:32:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:20:10.300 true 00:20:10.300 17:32:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:20:10.560 [2024-07-15 17:32:21.653778] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:20:10.560 [2024-07-15 17:32:21.653802] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:20:10.560 [2024-07-15 17:32:21.653813] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1167980 00:20:10.560 [2024-07-15 17:32:21.653819] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:10.560 [2024-07-15 17:32:21.654977] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:10.560 [2024-07-15 17:32:21.654995] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:10.560 BaseBdev4 00:20:10.560 17:32:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:20:10.560 [2024-07-15 17:32:21.834260] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:10.560 [2024-07-15 17:32:21.835282] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:10.560 [2024-07-15 17:32:21.835334] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:10.560 [2024-07-15 17:32:21.835379] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:10.560 [2024-07-15 17:32:21.835555] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11674e0 00:20:10.560 [2024-07-15 17:32:21.835562] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:10.560 [2024-07-15 17:32:21.835706] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfc9210 00:20:10.560 [2024-07-15 17:32:21.835835] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11674e0 00:20:10.560 [2024-07-15 17:32:21.835841] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11674e0 00:20:10.560 [2024-07-15 17:32:21.835915] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:10.560 17:32:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:20:10.560 17:32:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:10.560 17:32:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:10.560 17:32:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:10.560 17:32:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:10.560 17:32:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:10.560 17:32:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:10.560 17:32:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:10.560 17:32:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:10.560 17:32:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:10.560 17:32:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:10.560 17:32:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:10.820 17:32:22 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:10.820 "name": "raid_bdev1", 00:20:10.820 "uuid": "6d029521-33a6-4662-b5e7-a8f771af8a3a", 00:20:10.820 "strip_size_kb": 0, 00:20:10.820 "state": "online", 00:20:10.820 "raid_level": "raid1", 00:20:10.820 "superblock": true, 00:20:10.820 "num_base_bdevs": 4, 00:20:10.820 "num_base_bdevs_discovered": 4, 00:20:10.820 "num_base_bdevs_operational": 4, 00:20:10.820 "base_bdevs_list": [ 00:20:10.820 { 00:20:10.820 "name": "BaseBdev1", 00:20:10.820 "uuid": "e25ca742-572c-5704-9e02-726bce303c25", 00:20:10.820 "is_configured": true, 00:20:10.820 "data_offset": 2048, 00:20:10.820 "data_size": 63488 00:20:10.820 }, 00:20:10.820 { 00:20:10.820 "name": "BaseBdev2", 00:20:10.820 "uuid": "db26b1bc-f39a-549b-b3c4-2f50b09e9e43", 00:20:10.820 "is_configured": true, 00:20:10.820 "data_offset": 2048, 00:20:10.820 "data_size": 63488 00:20:10.820 }, 00:20:10.820 { 00:20:10.820 "name": "BaseBdev3", 00:20:10.820 "uuid": "8bb29546-f9cf-5df4-b512-7f3f07170685", 00:20:10.820 "is_configured": true, 00:20:10.820 "data_offset": 2048, 00:20:10.820 "data_size": 63488 00:20:10.820 }, 00:20:10.820 { 00:20:10.820 "name": "BaseBdev4", 00:20:10.820 "uuid": "57257270-89fe-59f4-a55f-1dac056a0df6", 00:20:10.820 "is_configured": true, 00:20:10.820 "data_offset": 2048, 00:20:10.820 "data_size": 63488 00:20:10.820 } 00:20:10.820 ] 00:20:10.820 }' 00:20:10.820 17:32:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:10.820 17:32:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:11.423 17:32:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:20:11.423 17:32:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:11.423 [2024-07-15 17:32:22.684627] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x116ca50 00:20:12.362 17:32:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:20:12.623 [2024-07-15 17:32:23.778870] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:20:12.623 [2024-07-15 17:32:23.778917] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:12.623 [2024-07-15 17:32:23.779108] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x116ca50 00:20:12.623 17:32:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:20:12.623 17:32:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:20:12.623 17:32:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:20:12.623 17:32:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=3 00:20:12.623 17:32:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:12.623 17:32:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:12.623 17:32:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:12.623 17:32:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:12.623 
17:32:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:12.623 17:32:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:12.623 17:32:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:12.623 17:32:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:12.623 17:32:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:12.623 17:32:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:12.623 17:32:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:12.623 17:32:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:12.884 17:32:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:12.884 "name": "raid_bdev1", 00:20:12.884 "uuid": "6d029521-33a6-4662-b5e7-a8f771af8a3a", 00:20:12.884 "strip_size_kb": 0, 00:20:12.884 "state": "online", 00:20:12.884 "raid_level": "raid1", 00:20:12.884 "superblock": true, 00:20:12.884 "num_base_bdevs": 4, 00:20:12.884 "num_base_bdevs_discovered": 3, 00:20:12.884 "num_base_bdevs_operational": 3, 00:20:12.884 "base_bdevs_list": [ 00:20:12.884 { 00:20:12.884 "name": null, 00:20:12.884 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:12.884 "is_configured": false, 00:20:12.884 "data_offset": 2048, 00:20:12.884 "data_size": 63488 00:20:12.884 }, 00:20:12.884 { 00:20:12.884 "name": "BaseBdev2", 00:20:12.884 "uuid": "db26b1bc-f39a-549b-b3c4-2f50b09e9e43", 00:20:12.884 "is_configured": true, 00:20:12.884 "data_offset": 2048, 00:20:12.884 "data_size": 63488 00:20:12.884 }, 00:20:12.884 { 00:20:12.884 "name": "BaseBdev3", 00:20:12.884 "uuid": "8bb29546-f9cf-5df4-b512-7f3f07170685", 00:20:12.884 "is_configured": true, 00:20:12.884 "data_offset": 2048, 00:20:12.884 "data_size": 63488 00:20:12.884 }, 00:20:12.884 { 00:20:12.884 "name": "BaseBdev4", 00:20:12.884 "uuid": "57257270-89fe-59f4-a55f-1dac056a0df6", 00:20:12.884 "is_configured": true, 00:20:12.884 "data_offset": 2048, 00:20:12.884 "data_size": 63488 00:20:12.884 } 00:20:12.884 ] 00:20:12.884 }' 00:20:12.884 17:32:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:12.884 17:32:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:13.455 17:32:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:13.455 [2024-07-15 17:32:24.700278] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:13.455 [2024-07-15 17:32:24.700309] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:13.455 [2024-07-15 17:32:24.702869] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:13.455 [2024-07-15 17:32:24.702896] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:13.455 [2024-07-15 17:32:24.702977] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:13.455 [2024-07-15 17:32:24.702984] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11674e0 name raid_bdev1, state 
offline 00:20:13.455 0 00:20:13.455 17:32:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2854216 00:20:13.455 17:32:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2854216 ']' 00:20:13.455 17:32:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2854216 00:20:13.455 17:32:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:20:13.455 17:32:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:13.455 17:32:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2854216 00:20:13.715 17:32:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:13.715 17:32:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:13.715 17:32:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2854216' 00:20:13.715 killing process with pid 2854216 00:20:13.715 17:32:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2854216 00:20:13.715 [2024-07-15 17:32:24.786031] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:13.715 17:32:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2854216 00:20:13.715 [2024-07-15 17:32:24.802962] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:13.715 17:32:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.VlrrPdvTqz 00:20:13.715 17:32:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:20:13.715 17:32:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:20:13.715 17:32:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:20:13.715 17:32:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:20:13.715 17:32:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:13.715 17:32:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:20:13.715 17:32:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:20:13.715 00:20:13.715 real 0m6.439s 00:20:13.715 user 0m10.344s 00:20:13.715 sys 0m0.914s 00:20:13.715 17:32:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:13.715 17:32:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:13.715 ************************************ 00:20:13.715 END TEST raid_write_error_test 00:20:13.715 ************************************ 00:20:13.715 17:32:24 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:13.715 17:32:24 bdev_raid -- bdev/bdev_raid.sh@875 -- # '[' true = true ']' 00:20:13.715 17:32:24 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:20:13.715 17:32:24 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:20:13.715 17:32:24 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:20:13.715 17:32:24 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:13.716 17:32:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:13.976 ************************************ 00:20:13.977 START TEST raid_rebuild_test 00:20:13.977 ************************************ 00:20:13.977 
17:32:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false false true 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=2855333 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 2855333 /var/tmp/spdk-raid.sock 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 2855333 ']' 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:13.977 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:13.977 17:32:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:13.977 [2024-07-15 17:32:25.076341] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:20:13.977 [2024-07-15 17:32:25.076386] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2855333 ] 00:20:13.977 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:13.977 Zero copy mechanism will not be used. 00:20:13.977 [2024-07-15 17:32:25.157228] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:13.977 [2024-07-15 17:32:25.221742] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:13.977 [2024-07-15 17:32:25.262681] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:13.977 [2024-07-15 17:32:25.262704] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:14.919 17:32:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:14.919 17:32:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:20:14.919 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:14.919 17:32:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:14.919 BaseBdev1_malloc 00:20:14.919 17:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:15.180 [2024-07-15 17:32:26.313240] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:15.180 [2024-07-15 17:32:26.313277] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:15.180 [2024-07-15 17:32:26.313290] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19e4d30 00:20:15.180 [2024-07-15 17:32:26.313296] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:15.180 [2024-07-15 17:32:26.314590] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:15.180 [2024-07-15 17:32:26.314611] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:15.180 BaseBdev1 00:20:15.180 17:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:15.180 17:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:15.441 BaseBdev2_malloc 00:20:15.441 17:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:20:15.441 [2024-07-15 17:32:26.684025] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:20:15.441 [2024-07-15 17:32:26.684053] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:15.441 
[2024-07-15 17:32:26.684063] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b97c60 00:20:15.441 [2024-07-15 17:32:26.684070] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:15.441 [2024-07-15 17:32:26.685231] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:15.441 [2024-07-15 17:32:26.685249] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:15.441 BaseBdev2 00:20:15.441 17:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:20:15.701 spare_malloc 00:20:15.701 17:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:20:15.962 spare_delay 00:20:15.962 17:32:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:15.962 [2024-07-15 17:32:27.239183] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:15.962 [2024-07-15 17:32:27.239209] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:15.962 [2024-07-15 17:32:27.239219] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b87ec0 00:20:15.962 [2024-07-15 17:32:27.239225] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:15.962 [2024-07-15 17:32:27.240370] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:15.962 [2024-07-15 17:32:27.240389] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:15.962 spare 00:20:15.962 17:32:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:20:16.223 [2024-07-15 17:32:27.423663] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:16.223 [2024-07-15 17:32:27.424644] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:16.223 [2024-07-15 17:32:27.424705] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b7f390 00:20:16.223 [2024-07-15 17:32:27.424719] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:20:16.223 [2024-07-15 17:32:27.424868] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19df7f0 00:20:16.223 [2024-07-15 17:32:27.424975] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b7f390 00:20:16.223 [2024-07-15 17:32:27.424985] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b7f390 00:20:16.223 [2024-07-15 17:32:27.425064] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:16.223 17:32:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:16.223 17:32:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:16.223 17:32:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:20:16.223 17:32:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:16.223 17:32:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:16.223 17:32:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:16.223 17:32:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:16.223 17:32:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:16.223 17:32:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:16.223 17:32:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:16.223 17:32:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:16.223 17:32:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:16.484 17:32:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:16.484 "name": "raid_bdev1", 00:20:16.484 "uuid": "c83f81af-425c-4254-8259-f2157020073d", 00:20:16.484 "strip_size_kb": 0, 00:20:16.484 "state": "online", 00:20:16.484 "raid_level": "raid1", 00:20:16.484 "superblock": false, 00:20:16.484 "num_base_bdevs": 2, 00:20:16.484 "num_base_bdevs_discovered": 2, 00:20:16.484 "num_base_bdevs_operational": 2, 00:20:16.484 "base_bdevs_list": [ 00:20:16.484 { 00:20:16.484 "name": "BaseBdev1", 00:20:16.484 "uuid": "6c3d6739-f2ff-58b6-8bc6-02dd062b0189", 00:20:16.484 "is_configured": true, 00:20:16.484 "data_offset": 0, 00:20:16.484 "data_size": 65536 00:20:16.484 }, 00:20:16.484 { 00:20:16.484 "name": "BaseBdev2", 00:20:16.484 "uuid": "f4df8e8d-8df7-568a-a957-62adf0e80cee", 00:20:16.484 "is_configured": true, 00:20:16.484 "data_offset": 0, 00:20:16.484 "data_size": 65536 00:20:16.484 } 00:20:16.484 ] 00:20:16.484 }' 00:20:16.484 17:32:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:16.484 17:32:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:17.055 17:32:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:17.055 17:32:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:20:17.314 [2024-07-15 17:32:28.382279] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:17.314 17:32:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:20:17.314 17:32:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:17.314 17:32:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks 
/var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:20:17.574 [2024-07-15 17:32:28.791127] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b7e3c0 00:20:17.574 /dev/nbd0 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:17.574 1+0 records in 00:20:17.574 1+0 records out 00:20:17.574 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000265924 s, 15.4 MB/s 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 
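Consolidated from the per-call trace above, a minimal sketch of the device stack the rebuild test has built at this point and of the NBD fill that follows; names, paths and arguments are the ones shown in the trace, error handling is omitted, and the loop is only a compact rendering of the individual RPC calls recorded above.

# Hypothetical consolidation of the traced RPC sequence (illustrative, not the test script).
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
RPC="$SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Two pass-through base bdevs on top of 32 MiB malloc bdevs with 512-byte blocks.
for i in 1 2; do
    $RPC bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
    $RPC bdev_passthru_create -b BaseBdev${i}_malloc -p BaseBdev${i}
done

# The future rebuild target: malloc -> delay (write latency injected via -w/-n) -> passthru "spare".
$RPC bdev_malloc_create 32 512 -b spare_malloc
$RPC bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
$RPC bdev_passthru_create -b spare_delay -p spare

# raid1 over the two base bdevs, without a superblock (no -s), as in the trace.
$RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1
$RPC bdev_get_bdevs -b raid_bdev1 | jq -r '.[].num_blocks'   # 65536 in the trace

# Export the array over NBD and fill it so the later rebuild has data to copy.
$RPC nbd_start_disk raid_bdev1 /dev/nbd0
dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct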
00:20:17.574 17:32:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:20:22.856 65536+0 records in 00:20:22.856 65536+0 records out 00:20:22.856 33554432 bytes (34 MB, 32 MiB) copied, 4.30833 s, 7.8 MB/s 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:22.856 [2024-07-15 17:32:33.351378] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:20:22.856 [2024-07-15 17:32:33.515819] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:22.856 17:32:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:22.856 "name": "raid_bdev1", 00:20:22.856 "uuid": "c83f81af-425c-4254-8259-f2157020073d", 00:20:22.856 "strip_size_kb": 0, 00:20:22.856 "state": "online", 00:20:22.856 "raid_level": "raid1", 00:20:22.856 "superblock": false, 00:20:22.856 "num_base_bdevs": 2, 00:20:22.856 "num_base_bdevs_discovered": 1, 00:20:22.856 "num_base_bdevs_operational": 1, 00:20:22.856 "base_bdevs_list": [ 00:20:22.856 { 00:20:22.857 "name": null, 00:20:22.857 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:22.857 "is_configured": false, 00:20:22.857 "data_offset": 0, 00:20:22.857 "data_size": 65536 00:20:22.857 }, 00:20:22.857 { 00:20:22.857 "name": "BaseBdev2", 00:20:22.857 "uuid": "f4df8e8d-8df7-568a-a957-62adf0e80cee", 00:20:22.857 "is_configured": true, 00:20:22.857 "data_offset": 0, 00:20:22.857 "data_size": 65536 00:20:22.857 } 00:20:22.857 ] 00:20:22.857 }' 00:20:22.857 17:32:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:22.857 17:32:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:23.117 17:32:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:23.377 [2024-07-15 17:32:34.430132] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:23.377 [2024-07-15 17:32:34.433572] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19df720 00:20:23.377 [2024-07-15 17:32:34.435164] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:23.377 17:32:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:20:24.318 17:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:24.318 17:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:24.318 17:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:24.318 17:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:24.318 17:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:24.318 17:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:24.318 17:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:24.579 17:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:24.579 "name": "raid_bdev1", 00:20:24.579 "uuid": "c83f81af-425c-4254-8259-f2157020073d", 00:20:24.579 "strip_size_kb": 0, 00:20:24.579 "state": "online", 00:20:24.579 "raid_level": "raid1", 00:20:24.579 "superblock": false, 00:20:24.579 "num_base_bdevs": 2, 00:20:24.579 "num_base_bdevs_discovered": 2, 00:20:24.579 "num_base_bdevs_operational": 2, 00:20:24.579 "process": { 00:20:24.579 "type": "rebuild", 00:20:24.579 "target": "spare", 00:20:24.579 "progress": { 00:20:24.579 "blocks": 22528, 00:20:24.579 "percent": 34 00:20:24.579 } 00:20:24.579 }, 00:20:24.579 
"base_bdevs_list": [ 00:20:24.579 { 00:20:24.579 "name": "spare", 00:20:24.579 "uuid": "ac3fe9a7-04dd-5e37-a187-7ec6275eb4b7", 00:20:24.579 "is_configured": true, 00:20:24.579 "data_offset": 0, 00:20:24.579 "data_size": 65536 00:20:24.579 }, 00:20:24.579 { 00:20:24.579 "name": "BaseBdev2", 00:20:24.579 "uuid": "f4df8e8d-8df7-568a-a957-62adf0e80cee", 00:20:24.579 "is_configured": true, 00:20:24.579 "data_offset": 0, 00:20:24.579 "data_size": 65536 00:20:24.579 } 00:20:24.579 ] 00:20:24.579 }' 00:20:24.579 17:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:24.579 17:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:24.579 17:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:24.579 17:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:24.579 17:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:24.839 [2024-07-15 17:32:35.915420] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:24.840 [2024-07-15 17:32:35.944096] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:24.840 [2024-07-15 17:32:35.944126] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:24.840 [2024-07-15 17:32:35.944136] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:24.840 [2024-07-15 17:32:35.944140] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:24.840 17:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:24.840 17:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:24.840 17:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:24.840 17:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:24.840 17:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:24.840 17:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:24.840 17:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:24.840 17:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:24.840 17:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:24.840 17:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:24.840 17:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:24.840 17:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:25.100 17:32:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:25.100 "name": "raid_bdev1", 00:20:25.100 "uuid": "c83f81af-425c-4254-8259-f2157020073d", 00:20:25.100 "strip_size_kb": 0, 00:20:25.100 "state": "online", 00:20:25.100 "raid_level": "raid1", 00:20:25.100 "superblock": false, 00:20:25.100 "num_base_bdevs": 2, 00:20:25.100 
"num_base_bdevs_discovered": 1, 00:20:25.100 "num_base_bdevs_operational": 1, 00:20:25.100 "base_bdevs_list": [ 00:20:25.100 { 00:20:25.100 "name": null, 00:20:25.100 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:25.100 "is_configured": false, 00:20:25.100 "data_offset": 0, 00:20:25.100 "data_size": 65536 00:20:25.100 }, 00:20:25.100 { 00:20:25.100 "name": "BaseBdev2", 00:20:25.100 "uuid": "f4df8e8d-8df7-568a-a957-62adf0e80cee", 00:20:25.100 "is_configured": true, 00:20:25.100 "data_offset": 0, 00:20:25.100 "data_size": 65536 00:20:25.100 } 00:20:25.100 ] 00:20:25.100 }' 00:20:25.100 17:32:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:25.100 17:32:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:25.671 17:32:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:25.671 17:32:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:25.671 17:32:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:25.671 17:32:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:25.671 17:32:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:25.671 17:32:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:25.671 17:32:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:25.671 17:32:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:25.671 "name": "raid_bdev1", 00:20:25.671 "uuid": "c83f81af-425c-4254-8259-f2157020073d", 00:20:25.671 "strip_size_kb": 0, 00:20:25.671 "state": "online", 00:20:25.671 "raid_level": "raid1", 00:20:25.671 "superblock": false, 00:20:25.671 "num_base_bdevs": 2, 00:20:25.671 "num_base_bdevs_discovered": 1, 00:20:25.671 "num_base_bdevs_operational": 1, 00:20:25.671 "base_bdevs_list": [ 00:20:25.671 { 00:20:25.671 "name": null, 00:20:25.671 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:25.671 "is_configured": false, 00:20:25.671 "data_offset": 0, 00:20:25.671 "data_size": 65536 00:20:25.671 }, 00:20:25.671 { 00:20:25.671 "name": "BaseBdev2", 00:20:25.671 "uuid": "f4df8e8d-8df7-568a-a957-62adf0e80cee", 00:20:25.671 "is_configured": true, 00:20:25.671 "data_offset": 0, 00:20:25.671 "data_size": 65536 00:20:25.671 } 00:20:25.671 ] 00:20:25.671 }' 00:20:25.671 17:32:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:25.932 17:32:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:25.932 17:32:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:25.932 17:32:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:25.932 17:32:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:25.932 [2024-07-15 17:32:37.211222] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:25.932 [2024-07-15 17:32:37.214567] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19dcc70 00:20:25.932 [2024-07-15 17:32:37.215723] 
bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:25.932 17:32:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:20:27.314 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:27.314 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:27.314 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:27.314 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:27.314 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:27.315 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.315 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:27.315 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:27.315 "name": "raid_bdev1", 00:20:27.315 "uuid": "c83f81af-425c-4254-8259-f2157020073d", 00:20:27.315 "strip_size_kb": 0, 00:20:27.315 "state": "online", 00:20:27.315 "raid_level": "raid1", 00:20:27.315 "superblock": false, 00:20:27.315 "num_base_bdevs": 2, 00:20:27.315 "num_base_bdevs_discovered": 2, 00:20:27.315 "num_base_bdevs_operational": 2, 00:20:27.315 "process": { 00:20:27.315 "type": "rebuild", 00:20:27.315 "target": "spare", 00:20:27.315 "progress": { 00:20:27.315 "blocks": 22528, 00:20:27.315 "percent": 34 00:20:27.315 } 00:20:27.315 }, 00:20:27.315 "base_bdevs_list": [ 00:20:27.315 { 00:20:27.315 "name": "spare", 00:20:27.315 "uuid": "ac3fe9a7-04dd-5e37-a187-7ec6275eb4b7", 00:20:27.315 "is_configured": true, 00:20:27.315 "data_offset": 0, 00:20:27.315 "data_size": 65536 00:20:27.315 }, 00:20:27.315 { 00:20:27.315 "name": "BaseBdev2", 00:20:27.315 "uuid": "f4df8e8d-8df7-568a-a957-62adf0e80cee", 00:20:27.315 "is_configured": true, 00:20:27.315 "data_offset": 0, 00:20:27.315 "data_size": 65536 00:20:27.315 } 00:20:27.315 ] 00:20:27.315 }' 00:20:27.315 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:27.315 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:27.315 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:27.315 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:27.315 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:20:27.315 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:20:27.315 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:20:27.315 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:20:27.315 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=652 00:20:27.315 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:27.315 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:27.315 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:27.315 17:32:38 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:27.315 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:27.315 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:27.315 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.315 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:27.575 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:27.575 "name": "raid_bdev1", 00:20:27.575 "uuid": "c83f81af-425c-4254-8259-f2157020073d", 00:20:27.575 "strip_size_kb": 0, 00:20:27.575 "state": "online", 00:20:27.575 "raid_level": "raid1", 00:20:27.575 "superblock": false, 00:20:27.575 "num_base_bdevs": 2, 00:20:27.575 "num_base_bdevs_discovered": 2, 00:20:27.575 "num_base_bdevs_operational": 2, 00:20:27.575 "process": { 00:20:27.575 "type": "rebuild", 00:20:27.575 "target": "spare", 00:20:27.575 "progress": { 00:20:27.575 "blocks": 28672, 00:20:27.575 "percent": 43 00:20:27.575 } 00:20:27.575 }, 00:20:27.575 "base_bdevs_list": [ 00:20:27.575 { 00:20:27.575 "name": "spare", 00:20:27.575 "uuid": "ac3fe9a7-04dd-5e37-a187-7ec6275eb4b7", 00:20:27.575 "is_configured": true, 00:20:27.575 "data_offset": 0, 00:20:27.575 "data_size": 65536 00:20:27.575 }, 00:20:27.575 { 00:20:27.575 "name": "BaseBdev2", 00:20:27.575 "uuid": "f4df8e8d-8df7-568a-a957-62adf0e80cee", 00:20:27.575 "is_configured": true, 00:20:27.575 "data_offset": 0, 00:20:27.575 "data_size": 65536 00:20:27.575 } 00:20:27.575 ] 00:20:27.575 }' 00:20:27.575 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:27.575 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:27.575 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:27.575 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:27.575 17:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:28.516 17:32:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:28.516 17:32:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:28.516 17:32:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:28.516 17:32:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:28.516 17:32:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:28.516 17:32:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:28.516 17:32:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:28.517 17:32:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:28.776 17:32:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:28.776 "name": "raid_bdev1", 00:20:28.776 "uuid": "c83f81af-425c-4254-8259-f2157020073d", 00:20:28.776 "strip_size_kb": 0, 00:20:28.776 "state": "online", 00:20:28.776 
"raid_level": "raid1", 00:20:28.776 "superblock": false, 00:20:28.776 "num_base_bdevs": 2, 00:20:28.776 "num_base_bdevs_discovered": 2, 00:20:28.776 "num_base_bdevs_operational": 2, 00:20:28.776 "process": { 00:20:28.776 "type": "rebuild", 00:20:28.776 "target": "spare", 00:20:28.776 "progress": { 00:20:28.776 "blocks": 55296, 00:20:28.776 "percent": 84 00:20:28.776 } 00:20:28.776 }, 00:20:28.776 "base_bdevs_list": [ 00:20:28.776 { 00:20:28.776 "name": "spare", 00:20:28.776 "uuid": "ac3fe9a7-04dd-5e37-a187-7ec6275eb4b7", 00:20:28.776 "is_configured": true, 00:20:28.776 "data_offset": 0, 00:20:28.776 "data_size": 65536 00:20:28.776 }, 00:20:28.776 { 00:20:28.776 "name": "BaseBdev2", 00:20:28.776 "uuid": "f4df8e8d-8df7-568a-a957-62adf0e80cee", 00:20:28.776 "is_configured": true, 00:20:28.776 "data_offset": 0, 00:20:28.776 "data_size": 65536 00:20:28.776 } 00:20:28.776 ] 00:20:28.776 }' 00:20:28.776 17:32:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:28.776 17:32:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:28.776 17:32:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:29.035 17:32:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:29.035 17:32:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:29.305 [2024-07-15 17:32:40.434523] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:20:29.305 [2024-07-15 17:32:40.434566] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:20:29.305 [2024-07-15 17:32:40.434594] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:29.929 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:29.929 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:29.929 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:29.929 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:29.929 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:29.929 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:29.929 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:29.929 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:30.187 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:30.187 "name": "raid_bdev1", 00:20:30.187 "uuid": "c83f81af-425c-4254-8259-f2157020073d", 00:20:30.187 "strip_size_kb": 0, 00:20:30.187 "state": "online", 00:20:30.187 "raid_level": "raid1", 00:20:30.187 "superblock": false, 00:20:30.187 "num_base_bdevs": 2, 00:20:30.187 "num_base_bdevs_discovered": 2, 00:20:30.187 "num_base_bdevs_operational": 2, 00:20:30.187 "base_bdevs_list": [ 00:20:30.187 { 00:20:30.187 "name": "spare", 00:20:30.187 "uuid": "ac3fe9a7-04dd-5e37-a187-7ec6275eb4b7", 00:20:30.187 "is_configured": true, 00:20:30.187 "data_offset": 0, 00:20:30.187 "data_size": 65536 00:20:30.187 }, 00:20:30.187 { 00:20:30.187 "name": "BaseBdev2", 00:20:30.187 
"uuid": "f4df8e8d-8df7-568a-a957-62adf0e80cee", 00:20:30.187 "is_configured": true, 00:20:30.187 "data_offset": 0, 00:20:30.188 "data_size": 65536 00:20:30.188 } 00:20:30.188 ] 00:20:30.188 }' 00:20:30.188 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:30.188 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:20:30.188 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:30.188 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:20:30.188 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:20:30.188 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:30.188 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:30.188 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:30.188 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:30.188 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:30.188 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:30.188 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:30.446 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:30.446 "name": "raid_bdev1", 00:20:30.446 "uuid": "c83f81af-425c-4254-8259-f2157020073d", 00:20:30.446 "strip_size_kb": 0, 00:20:30.446 "state": "online", 00:20:30.446 "raid_level": "raid1", 00:20:30.446 "superblock": false, 00:20:30.446 "num_base_bdevs": 2, 00:20:30.446 "num_base_bdevs_discovered": 2, 00:20:30.446 "num_base_bdevs_operational": 2, 00:20:30.446 "base_bdevs_list": [ 00:20:30.446 { 00:20:30.446 "name": "spare", 00:20:30.446 "uuid": "ac3fe9a7-04dd-5e37-a187-7ec6275eb4b7", 00:20:30.446 "is_configured": true, 00:20:30.446 "data_offset": 0, 00:20:30.446 "data_size": 65536 00:20:30.446 }, 00:20:30.446 { 00:20:30.446 "name": "BaseBdev2", 00:20:30.446 "uuid": "f4df8e8d-8df7-568a-a957-62adf0e80cee", 00:20:30.446 "is_configured": true, 00:20:30.446 "data_offset": 0, 00:20:30.446 "data_size": 65536 00:20:30.446 } 00:20:30.446 ] 00:20:30.446 }' 00:20:30.446 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:30.446 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:30.446 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:30.446 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:30.446 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:30.446 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:30.446 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:30.446 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:30.446 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:30.446 17:32:41 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:30.446 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:30.446 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:30.446 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:30.446 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:30.446 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:30.446 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:30.706 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:30.706 "name": "raid_bdev1", 00:20:30.706 "uuid": "c83f81af-425c-4254-8259-f2157020073d", 00:20:30.706 "strip_size_kb": 0, 00:20:30.706 "state": "online", 00:20:30.706 "raid_level": "raid1", 00:20:30.706 "superblock": false, 00:20:30.706 "num_base_bdevs": 2, 00:20:30.706 "num_base_bdevs_discovered": 2, 00:20:30.706 "num_base_bdevs_operational": 2, 00:20:30.706 "base_bdevs_list": [ 00:20:30.706 { 00:20:30.706 "name": "spare", 00:20:30.706 "uuid": "ac3fe9a7-04dd-5e37-a187-7ec6275eb4b7", 00:20:30.706 "is_configured": true, 00:20:30.706 "data_offset": 0, 00:20:30.706 "data_size": 65536 00:20:30.706 }, 00:20:30.706 { 00:20:30.706 "name": "BaseBdev2", 00:20:30.706 "uuid": "f4df8e8d-8df7-568a-a957-62adf0e80cee", 00:20:30.706 "is_configured": true, 00:20:30.706 "data_offset": 0, 00:20:30.706 "data_size": 65536 00:20:30.706 } 00:20:30.706 ] 00:20:30.706 }' 00:20:30.706 17:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:30.706 17:32:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:31.274 17:32:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:31.274 [2024-07-15 17:32:42.567020] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:31.274 [2024-07-15 17:32:42.567039] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:31.274 [2024-07-15 17:32:42.567081] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:31.274 [2024-07-15 17:32:42.567122] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:31.274 [2024-07-15 17:32:42.567133] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b7f390 name raid_bdev1, state offline 00:20:31.533 17:32:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:31.533 17:32:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:20:31.533 17:32:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:20:31.533 17:32:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:20:31.533 17:32:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:20:31.533 17:32:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' 
'/dev/nbd0 /dev/nbd1' 00:20:31.533 17:32:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:31.533 17:32:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:20:31.533 17:32:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:31.533 17:32:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:31.533 17:32:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:31.533 17:32:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:20:31.533 17:32:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:31.533 17:32:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:31.533 17:32:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:20:31.792 /dev/nbd0 00:20:31.792 17:32:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:31.792 17:32:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:31.792 17:32:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:20:31.792 17:32:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:20:31.792 17:32:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:31.792 17:32:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:31.792 17:32:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:20:31.792 17:32:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:20:31.792 17:32:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:31.792 17:32:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:31.792 17:32:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:31.792 1+0 records in 00:20:31.792 1+0 records out 00:20:31.792 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000280479 s, 14.6 MB/s 00:20:31.792 17:32:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:31.792 17:32:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:20:31.792 17:32:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:31.792 17:32:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:31.792 17:32:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:20:31.792 17:32:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:31.792 17:32:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:31.792 17:32:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:20:32.052 /dev/nbd1 00:20:32.052 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 
00:20:32.052 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:32.052 17:32:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:20:32.052 17:32:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:20:32.052 17:32:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:32.052 17:32:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:32.052 17:32:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:20:32.052 17:32:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:20:32.052 17:32:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:32.052 17:32:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:32.052 17:32:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:32.052 1+0 records in 00:20:32.052 1+0 records out 00:20:32.052 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000292004 s, 14.0 MB/s 00:20:32.052 17:32:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:32.052 17:32:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:20:32.052 17:32:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:32.052 17:32:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:32.052 17:32:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:20:32.052 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:32.052 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:32.052 17:32:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:20:32.052 17:32:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:20:32.052 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:32.052 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:32.052 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:32.052 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:20:32.052 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:32.052 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:32.311 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:32.311 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:32.311 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:32.311 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:32.311 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:32.311 17:32:43 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:32.311 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:20:32.311 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:20:32.311 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:32.312 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:32.571 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:32.571 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:32.571 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:32.571 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:32.571 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:32.571 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:32.571 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:20:32.571 17:32:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:20:32.571 17:32:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:20:32.571 17:32:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 2855333 00:20:32.571 17:32:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 2855333 ']' 00:20:32.571 17:32:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 2855333 00:20:32.571 17:32:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:20:32.571 17:32:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:32.571 17:32:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2855333 00:20:32.571 17:32:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:32.571 17:32:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:32.571 17:32:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2855333' 00:20:32.571 killing process with pid 2855333 00:20:32.571 17:32:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 2855333 00:20:32.571 Received shutdown signal, test time was about 60.000000 seconds 00:20:32.571 00:20:32.571 Latency(us) 00:20:32.571 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:32.571 =================================================================================================================== 00:20:32.571 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:20:32.571 [2024-07-15 17:32:43.728233] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:32.571 17:32:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 2855333 00:20:32.571 [2024-07-15 17:32:43.742673] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:32.571 17:32:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:20:32.571 00:20:32.571 real 0m18.846s 00:20:32.571 user 0m26.366s 00:20:32.571 sys 0m2.953s 00:20:32.571 17:32:43 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:20:32.571 17:32:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:32.571 ************************************ 00:20:32.571 END TEST raid_rebuild_test 00:20:32.571 ************************************ 00:20:32.831 17:32:43 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:32.831 17:32:43 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:20:32.831 17:32:43 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:20:32.831 17:32:43 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:32.831 17:32:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:32.831 ************************************ 00:20:32.831 START TEST raid_rebuild_test_sb 00:20:32.831 ************************************ 00:20:32.831 17:32:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:20:32.831 17:32:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:20:32.831 17:32:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:20:32.831 17:32:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:20:32.831 17:32:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:20:32.831 17:32:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:20:32.831 17:32:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:20:32.831 17:32:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:32.831 17:32:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:20:32.831 17:32:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:32.831 17:32:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:32.831 17:32:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:20:32.831 17:32:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:32.831 17:32:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:32.831 17:32:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:20:32.831 17:32:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:20:32.832 17:32:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:20:32.832 17:32:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:20:32.832 17:32:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:20:32.832 17:32:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:20:32.832 17:32:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:20:32.832 17:32:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:20:32.832 17:32:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:20:32.832 17:32:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:20:32.832 17:32:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:20:32.832 17:32:43 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=2858659 00:20:32.832 17:32:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 2858659 /var/tmp/spdk-raid.sock 00:20:32.832 17:32:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2858659 ']' 00:20:32.832 17:32:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:20:32.832 17:32:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:32.832 17:32:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:32.832 17:32:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:32.832 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:32.832 17:32:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:32.832 17:32:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:32.832 [2024-07-15 17:32:43.997641] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:20:32.832 [2024-07-15 17:32:43.997695] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2858659 ] 00:20:32.832 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:32.832 Zero copy mechanism will not be used. 
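[Editor's note] The repeated rpc.py/jq calls traced above and below implement a simple poll of the raid bdev's rebuild status. The following is a minimal standalone sketch of that polling pattern, not the test's actual verify_raid_bdev_process helper; it assumes an SPDK application (here bdevperf) is already listening on /var/tmp/spdk-raid.sock, that it is run from an SPDK checkout so ./scripts/rpc.py resolves, that jq is installed, and that a 60 s bound (matching the bdevperf -t 60 run above) is acceptable.

#!/usr/bin/env bash
# Hedged sketch: poll bdev_raid_get_bdevs until raid_bdev1 no longer reports a
# rebuild process, mirroring the jq filters seen in the trace.
RPC_SOCK=/var/tmp/spdk-raid.sock   # socket used by bdevperf -r in this log
RPC=./scripts/rpc.py               # assumption: invoked from an SPDK checkout

timeout=60
SECONDS=0
while (( SECONDS < timeout )); do
    info=$("$RPC" -s "$RPC_SOCK" bdev_raid_get_bdevs all \
           | jq -r '.[] | select(.name == "raid_bdev1")')
    ptype=$(jq -r '.process.type // "none"'   <<< "$info")
    ptarget=$(jq -r '.process.target // "none"' <<< "$info")
    echo "rebuild status: type=$ptype target=$ptarget"
    # "none"/"none" means no process is reported, i.e. the rebuild has finished
    if [[ $ptype == none && $ptarget == none ]]; then
        exit 0
    fi
    sleep 1
done
echo "rebuild did not finish within ${timeout}s" >&2
exit 1

While a rebuild is in progress the sketch prints "type=rebuild target=spare", matching the [[ rebuild == rebuild ]] / [[ spare == spare ]] checks in the trace; once the process field disappears the jq default ("none") takes over, which is the condition the test uses to break out of its own loop.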
00:20:32.832 [2024-07-15 17:32:44.089220] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:33.092 [2024-07-15 17:32:44.156919] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:33.092 [2024-07-15 17:32:44.198861] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:33.092 [2024-07-15 17:32:44.198886] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:33.663 17:32:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:33.663 17:32:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:20:33.663 17:32:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:33.663 17:32:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:33.923 BaseBdev1_malloc 00:20:33.924 17:32:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:33.924 [2024-07-15 17:32:45.192873] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:33.924 [2024-07-15 17:32:45.192905] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:33.924 [2024-07-15 17:32:45.192921] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14fad30 00:20:33.924 [2024-07-15 17:32:45.192928] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:33.924 [2024-07-15 17:32:45.194202] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:33.924 [2024-07-15 17:32:45.194221] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:33.924 BaseBdev1 00:20:33.924 17:32:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:33.924 17:32:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:34.184 BaseBdev2_malloc 00:20:34.184 17:32:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:20:34.444 [2024-07-15 17:32:45.563795] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:20:34.444 [2024-07-15 17:32:45.563825] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:34.444 [2024-07-15 17:32:45.563836] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16adc60 00:20:34.444 [2024-07-15 17:32:45.563841] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:34.444 [2024-07-15 17:32:45.565042] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:34.444 [2024-07-15 17:32:45.565061] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:34.444 BaseBdev2 00:20:34.444 17:32:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 
00:20:34.705 spare_malloc 00:20:34.705 17:32:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:20:34.705 spare_delay 00:20:34.705 17:32:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:34.966 [2024-07-15 17:32:46.119097] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:34.966 [2024-07-15 17:32:46.119125] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:34.966 [2024-07-15 17:32:46.119135] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x169dec0 00:20:34.966 [2024-07-15 17:32:46.119141] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:34.966 [2024-07-15 17:32:46.120340] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:34.966 [2024-07-15 17:32:46.120358] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:34.966 spare 00:20:34.966 17:32:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:20:35.227 [2024-07-15 17:32:46.311597] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:35.227 [2024-07-15 17:32:46.312584] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:35.227 [2024-07-15 17:32:46.312699] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1695390 00:20:35.227 [2024-07-15 17:32:46.312706] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:35.227 [2024-07-15 17:32:46.312857] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14f17c0 00:20:35.227 [2024-07-15 17:32:46.312967] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1695390 00:20:35.227 [2024-07-15 17:32:46.312972] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1695390 00:20:35.227 [2024-07-15 17:32:46.313043] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:35.227 17:32:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:35.227 17:32:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:35.227 17:32:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:35.227 17:32:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:35.227 17:32:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:35.227 17:32:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:35.227 17:32:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:35.227 17:32:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:35.227 17:32:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:35.227 17:32:46 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:35.227 17:32:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:35.227 17:32:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:35.227 17:32:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:35.227 "name": "raid_bdev1", 00:20:35.227 "uuid": "c8825b7b-650b-4b9d-bcbc-9f6131357184", 00:20:35.227 "strip_size_kb": 0, 00:20:35.227 "state": "online", 00:20:35.227 "raid_level": "raid1", 00:20:35.227 "superblock": true, 00:20:35.227 "num_base_bdevs": 2, 00:20:35.227 "num_base_bdevs_discovered": 2, 00:20:35.227 "num_base_bdevs_operational": 2, 00:20:35.227 "base_bdevs_list": [ 00:20:35.227 { 00:20:35.227 "name": "BaseBdev1", 00:20:35.227 "uuid": "aa5185d0-d975-5984-a6e6-e1ed0ba06fd6", 00:20:35.227 "is_configured": true, 00:20:35.227 "data_offset": 2048, 00:20:35.227 "data_size": 63488 00:20:35.227 }, 00:20:35.227 { 00:20:35.227 "name": "BaseBdev2", 00:20:35.227 "uuid": "498edf1a-5b01-50ee-86b0-3c1227f4b7fe", 00:20:35.227 "is_configured": true, 00:20:35.227 "data_offset": 2048, 00:20:35.227 "data_size": 63488 00:20:35.227 } 00:20:35.227 ] 00:20:35.227 }' 00:20:35.227 17:32:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:35.227 17:32:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:35.796 17:32:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:35.796 17:32:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:20:36.056 [2024-07-15 17:32:47.230105] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:36.056 17:32:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:20:36.056 17:32:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:36.056 17:32:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:20:36.316 17:32:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:20:36.316 17:32:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:20:36.316 17:32:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:20:36.316 17:32:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:20:36.316 17:32:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:20:36.316 17:32:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:36.316 17:32:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:20:36.316 17:32:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:36.316 17:32:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:20:36.316 17:32:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:36.316 17:32:47 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:20:36.316 17:32:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:36.316 17:32:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:36.316 17:32:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:20:36.316 [2024-07-15 17:32:47.602869] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1693b00 00:20:36.316 /dev/nbd0 00:20:36.576 17:32:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:36.576 17:32:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:36.576 17:32:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:20:36.576 17:32:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:20:36.576 17:32:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:36.576 17:32:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:36.576 17:32:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:20:36.576 17:32:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:20:36.576 17:32:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:36.576 17:32:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:36.576 17:32:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:36.576 1+0 records in 00:20:36.577 1+0 records out 00:20:36.577 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000147084 s, 27.8 MB/s 00:20:36.577 17:32:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:36.577 17:32:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:20:36.577 17:32:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:36.577 17:32:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:36.577 17:32:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:20:36.577 17:32:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:36.577 17:32:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:36.577 17:32:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:20:36.577 17:32:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:20:36.577 17:32:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:20:41.878 63488+0 records in 00:20:41.878 63488+0 records out 00:20:41.878 32505856 bytes (33 MB, 31 MiB) copied, 5.29737 s, 6.1 MB/s 00:20:41.878 17:32:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:20:41.878 17:32:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:20:41.878 17:32:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:20:41.878 17:32:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:41.878 17:32:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:20:41.878 17:32:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:41.878 17:32:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:41.878 17:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:41.878 17:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:41.879 17:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:41.879 17:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:41.879 17:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:41.879 17:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:41.879 [2024-07-15 17:32:53.146703] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:41.879 17:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:20:41.879 17:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:20:41.879 17:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:20:42.138 [2024-07-15 17:32:53.324613] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:42.138 17:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:42.138 17:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:42.138 17:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:42.138 17:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:42.138 17:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:42.138 17:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:42.138 17:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:42.138 17:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:42.138 17:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:42.138 17:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:42.138 17:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:42.138 17:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:42.398 17:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:42.398 "name": "raid_bdev1", 00:20:42.398 "uuid": "c8825b7b-650b-4b9d-bcbc-9f6131357184", 00:20:42.398 "strip_size_kb": 0, 00:20:42.398 "state": "online", 
00:20:42.398 "raid_level": "raid1", 00:20:42.398 "superblock": true, 00:20:42.398 "num_base_bdevs": 2, 00:20:42.398 "num_base_bdevs_discovered": 1, 00:20:42.398 "num_base_bdevs_operational": 1, 00:20:42.398 "base_bdevs_list": [ 00:20:42.398 { 00:20:42.398 "name": null, 00:20:42.398 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:42.398 "is_configured": false, 00:20:42.398 "data_offset": 2048, 00:20:42.398 "data_size": 63488 00:20:42.398 }, 00:20:42.398 { 00:20:42.398 "name": "BaseBdev2", 00:20:42.398 "uuid": "498edf1a-5b01-50ee-86b0-3c1227f4b7fe", 00:20:42.398 "is_configured": true, 00:20:42.398 "data_offset": 2048, 00:20:42.398 "data_size": 63488 00:20:42.398 } 00:20:42.398 ] 00:20:42.398 }' 00:20:42.398 17:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:42.398 17:32:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:42.967 17:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:43.227 [2024-07-15 17:32:54.283046] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:43.227 [2024-07-15 17:32:54.286361] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1693aa0 00:20:43.227 [2024-07-15 17:32:54.287931] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:43.227 17:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:20:44.166 17:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:44.166 17:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:44.166 17:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:44.166 17:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:44.166 17:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:44.166 17:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:44.166 17:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:44.426 17:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:44.426 "name": "raid_bdev1", 00:20:44.426 "uuid": "c8825b7b-650b-4b9d-bcbc-9f6131357184", 00:20:44.426 "strip_size_kb": 0, 00:20:44.426 "state": "online", 00:20:44.426 "raid_level": "raid1", 00:20:44.426 "superblock": true, 00:20:44.426 "num_base_bdevs": 2, 00:20:44.426 "num_base_bdevs_discovered": 2, 00:20:44.426 "num_base_bdevs_operational": 2, 00:20:44.426 "process": { 00:20:44.426 "type": "rebuild", 00:20:44.426 "target": "spare", 00:20:44.426 "progress": { 00:20:44.426 "blocks": 22528, 00:20:44.426 "percent": 35 00:20:44.426 } 00:20:44.426 }, 00:20:44.426 "base_bdevs_list": [ 00:20:44.426 { 00:20:44.426 "name": "spare", 00:20:44.426 "uuid": "3c31acb8-e2bc-5795-94ca-5348f7280bd4", 00:20:44.426 "is_configured": true, 00:20:44.426 "data_offset": 2048, 00:20:44.426 "data_size": 63488 00:20:44.426 }, 00:20:44.426 { 00:20:44.426 "name": "BaseBdev2", 00:20:44.426 "uuid": "498edf1a-5b01-50ee-86b0-3c1227f4b7fe", 00:20:44.426 "is_configured": true, 00:20:44.426 
"data_offset": 2048, 00:20:44.426 "data_size": 63488 00:20:44.426 } 00:20:44.426 ] 00:20:44.426 }' 00:20:44.426 17:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:44.426 17:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:44.426 17:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:44.426 17:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:44.426 17:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:44.687 [2024-07-15 17:32:55.740361] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:44.687 [2024-07-15 17:32:55.796794] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:44.687 [2024-07-15 17:32:55.796828] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:44.687 [2024-07-15 17:32:55.796838] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:44.687 [2024-07-15 17:32:55.796842] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:44.687 17:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:44.687 17:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:44.687 17:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:44.687 17:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:44.687 17:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:44.687 17:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:44.688 17:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:44.688 17:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:44.688 17:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:44.688 17:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:44.688 17:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:44.688 17:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:44.948 17:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:44.948 "name": "raid_bdev1", 00:20:44.948 "uuid": "c8825b7b-650b-4b9d-bcbc-9f6131357184", 00:20:44.948 "strip_size_kb": 0, 00:20:44.948 "state": "online", 00:20:44.948 "raid_level": "raid1", 00:20:44.948 "superblock": true, 00:20:44.948 "num_base_bdevs": 2, 00:20:44.948 "num_base_bdevs_discovered": 1, 00:20:44.948 "num_base_bdevs_operational": 1, 00:20:44.948 "base_bdevs_list": [ 00:20:44.948 { 00:20:44.948 "name": null, 00:20:44.948 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:44.948 "is_configured": false, 00:20:44.948 "data_offset": 2048, 00:20:44.948 "data_size": 63488 00:20:44.948 }, 00:20:44.948 { 
00:20:44.948 "name": "BaseBdev2", 00:20:44.948 "uuid": "498edf1a-5b01-50ee-86b0-3c1227f4b7fe", 00:20:44.948 "is_configured": true, 00:20:44.948 "data_offset": 2048, 00:20:44.948 "data_size": 63488 00:20:44.948 } 00:20:44.948 ] 00:20:44.948 }' 00:20:44.948 17:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:44.948 17:32:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:45.520 17:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:45.520 17:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:45.520 17:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:45.520 17:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:45.520 17:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:45.520 17:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:45.520 17:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:45.520 17:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:45.520 "name": "raid_bdev1", 00:20:45.520 "uuid": "c8825b7b-650b-4b9d-bcbc-9f6131357184", 00:20:45.520 "strip_size_kb": 0, 00:20:45.520 "state": "online", 00:20:45.520 "raid_level": "raid1", 00:20:45.520 "superblock": true, 00:20:45.520 "num_base_bdevs": 2, 00:20:45.520 "num_base_bdevs_discovered": 1, 00:20:45.520 "num_base_bdevs_operational": 1, 00:20:45.520 "base_bdevs_list": [ 00:20:45.520 { 00:20:45.520 "name": null, 00:20:45.520 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:45.520 "is_configured": false, 00:20:45.520 "data_offset": 2048, 00:20:45.520 "data_size": 63488 00:20:45.520 }, 00:20:45.520 { 00:20:45.520 "name": "BaseBdev2", 00:20:45.520 "uuid": "498edf1a-5b01-50ee-86b0-3c1227f4b7fe", 00:20:45.520 "is_configured": true, 00:20:45.520 "data_offset": 2048, 00:20:45.520 "data_size": 63488 00:20:45.520 } 00:20:45.520 ] 00:20:45.520 }' 00:20:45.520 17:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:45.520 17:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:45.520 17:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:45.780 17:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:45.780 17:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:45.780 [2024-07-15 17:32:57.015682] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:45.780 [2024-07-15 17:32:57.019123] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16939e0 00:20:45.780 [2024-07-15 17:32:57.020260] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:45.780 17:32:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:20:47.213 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:20:47.213 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:47.213 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:47.213 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:47.213 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:47.213 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:47.213 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:47.213 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:47.213 "name": "raid_bdev1", 00:20:47.213 "uuid": "c8825b7b-650b-4b9d-bcbc-9f6131357184", 00:20:47.213 "strip_size_kb": 0, 00:20:47.213 "state": "online", 00:20:47.213 "raid_level": "raid1", 00:20:47.213 "superblock": true, 00:20:47.213 "num_base_bdevs": 2, 00:20:47.213 "num_base_bdevs_discovered": 2, 00:20:47.213 "num_base_bdevs_operational": 2, 00:20:47.213 "process": { 00:20:47.213 "type": "rebuild", 00:20:47.213 "target": "spare", 00:20:47.213 "progress": { 00:20:47.213 "blocks": 22528, 00:20:47.213 "percent": 35 00:20:47.213 } 00:20:47.213 }, 00:20:47.213 "base_bdevs_list": [ 00:20:47.213 { 00:20:47.213 "name": "spare", 00:20:47.213 "uuid": "3c31acb8-e2bc-5795-94ca-5348f7280bd4", 00:20:47.213 "is_configured": true, 00:20:47.213 "data_offset": 2048, 00:20:47.213 "data_size": 63488 00:20:47.213 }, 00:20:47.213 { 00:20:47.213 "name": "BaseBdev2", 00:20:47.213 "uuid": "498edf1a-5b01-50ee-86b0-3c1227f4b7fe", 00:20:47.213 "is_configured": true, 00:20:47.213 "data_offset": 2048, 00:20:47.213 "data_size": 63488 00:20:47.213 } 00:20:47.213 ] 00:20:47.213 }' 00:20:47.213 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:47.213 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:47.213 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:47.213 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:47.213 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:20:47.213 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:20:47.213 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:20:47.213 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:20:47.213 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:20:47.213 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:20:47.213 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=672 00:20:47.213 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:47.213 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:47.213 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:47.213 17:32:58 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:47.213 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:47.213 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:47.213 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:47.213 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:47.474 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:47.474 "name": "raid_bdev1", 00:20:47.474 "uuid": "c8825b7b-650b-4b9d-bcbc-9f6131357184", 00:20:47.474 "strip_size_kb": 0, 00:20:47.474 "state": "online", 00:20:47.474 "raid_level": "raid1", 00:20:47.474 "superblock": true, 00:20:47.474 "num_base_bdevs": 2, 00:20:47.474 "num_base_bdevs_discovered": 2, 00:20:47.474 "num_base_bdevs_operational": 2, 00:20:47.474 "process": { 00:20:47.474 "type": "rebuild", 00:20:47.474 "target": "spare", 00:20:47.474 "progress": { 00:20:47.474 "blocks": 28672, 00:20:47.474 "percent": 45 00:20:47.474 } 00:20:47.474 }, 00:20:47.474 "base_bdevs_list": [ 00:20:47.474 { 00:20:47.474 "name": "spare", 00:20:47.474 "uuid": "3c31acb8-e2bc-5795-94ca-5348f7280bd4", 00:20:47.474 "is_configured": true, 00:20:47.474 "data_offset": 2048, 00:20:47.474 "data_size": 63488 00:20:47.474 }, 00:20:47.474 { 00:20:47.474 "name": "BaseBdev2", 00:20:47.474 "uuid": "498edf1a-5b01-50ee-86b0-3c1227f4b7fe", 00:20:47.474 "is_configured": true, 00:20:47.474 "data_offset": 2048, 00:20:47.474 "data_size": 63488 00:20:47.474 } 00:20:47.474 ] 00:20:47.474 }' 00:20:47.474 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:47.474 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:47.474 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:47.474 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:47.474 17:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:48.415 17:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:48.415 17:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:48.415 17:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:48.415 17:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:48.415 17:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:48.415 17:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:48.415 17:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:48.415 17:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:48.675 17:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:48.675 "name": "raid_bdev1", 00:20:48.675 "uuid": "c8825b7b-650b-4b9d-bcbc-9f6131357184", 00:20:48.675 "strip_size_kb": 0, 00:20:48.675 "state": 
"online", 00:20:48.675 "raid_level": "raid1", 00:20:48.675 "superblock": true, 00:20:48.675 "num_base_bdevs": 2, 00:20:48.675 "num_base_bdevs_discovered": 2, 00:20:48.675 "num_base_bdevs_operational": 2, 00:20:48.675 "process": { 00:20:48.675 "type": "rebuild", 00:20:48.675 "target": "spare", 00:20:48.675 "progress": { 00:20:48.675 "blocks": 55296, 00:20:48.675 "percent": 87 00:20:48.675 } 00:20:48.675 }, 00:20:48.675 "base_bdevs_list": [ 00:20:48.675 { 00:20:48.675 "name": "spare", 00:20:48.675 "uuid": "3c31acb8-e2bc-5795-94ca-5348f7280bd4", 00:20:48.675 "is_configured": true, 00:20:48.675 "data_offset": 2048, 00:20:48.675 "data_size": 63488 00:20:48.675 }, 00:20:48.675 { 00:20:48.675 "name": "BaseBdev2", 00:20:48.675 "uuid": "498edf1a-5b01-50ee-86b0-3c1227f4b7fe", 00:20:48.675 "is_configured": true, 00:20:48.675 "data_offset": 2048, 00:20:48.675 "data_size": 63488 00:20:48.675 } 00:20:48.675 ] 00:20:48.675 }' 00:20:48.675 17:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:48.675 17:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:48.675 17:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:48.675 17:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:48.675 17:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:48.935 [2024-07-15 17:33:00.138401] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:20:48.935 [2024-07-15 17:33:00.138454] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:20:48.935 [2024-07-15 17:33:00.138519] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:49.874 17:33:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:49.874 17:33:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:49.874 17:33:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:49.874 17:33:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:49.874 17:33:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:49.874 17:33:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:49.874 17:33:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:49.874 17:33:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:49.874 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:49.874 "name": "raid_bdev1", 00:20:49.874 "uuid": "c8825b7b-650b-4b9d-bcbc-9f6131357184", 00:20:49.874 "strip_size_kb": 0, 00:20:49.874 "state": "online", 00:20:49.874 "raid_level": "raid1", 00:20:49.874 "superblock": true, 00:20:49.874 "num_base_bdevs": 2, 00:20:49.874 "num_base_bdevs_discovered": 2, 00:20:49.874 "num_base_bdevs_operational": 2, 00:20:49.874 "base_bdevs_list": [ 00:20:49.874 { 00:20:49.874 "name": "spare", 00:20:49.874 "uuid": "3c31acb8-e2bc-5795-94ca-5348f7280bd4", 00:20:49.874 "is_configured": true, 00:20:49.874 "data_offset": 2048, 00:20:49.874 "data_size": 63488 
00:20:49.874 }, 00:20:49.874 { 00:20:49.874 "name": "BaseBdev2", 00:20:49.874 "uuid": "498edf1a-5b01-50ee-86b0-3c1227f4b7fe", 00:20:49.874 "is_configured": true, 00:20:49.874 "data_offset": 2048, 00:20:49.874 "data_size": 63488 00:20:49.874 } 00:20:49.874 ] 00:20:49.874 }' 00:20:49.874 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:49.874 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:20:49.874 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:50.133 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:20:50.133 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:20:50.133 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:50.133 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:50.133 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:50.133 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:50.133 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:50.133 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:50.133 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:50.133 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:50.133 "name": "raid_bdev1", 00:20:50.133 "uuid": "c8825b7b-650b-4b9d-bcbc-9f6131357184", 00:20:50.133 "strip_size_kb": 0, 00:20:50.133 "state": "online", 00:20:50.133 "raid_level": "raid1", 00:20:50.133 "superblock": true, 00:20:50.133 "num_base_bdevs": 2, 00:20:50.133 "num_base_bdevs_discovered": 2, 00:20:50.133 "num_base_bdevs_operational": 2, 00:20:50.133 "base_bdevs_list": [ 00:20:50.133 { 00:20:50.133 "name": "spare", 00:20:50.133 "uuid": "3c31acb8-e2bc-5795-94ca-5348f7280bd4", 00:20:50.133 "is_configured": true, 00:20:50.133 "data_offset": 2048, 00:20:50.133 "data_size": 63488 00:20:50.133 }, 00:20:50.133 { 00:20:50.133 "name": "BaseBdev2", 00:20:50.133 "uuid": "498edf1a-5b01-50ee-86b0-3c1227f4b7fe", 00:20:50.133 "is_configured": true, 00:20:50.134 "data_offset": 2048, 00:20:50.134 "data_size": 63488 00:20:50.134 } 00:20:50.134 ] 00:20:50.134 }' 00:20:50.134 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:50.134 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:50.393 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:50.393 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:50.393 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:50.393 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:50.393 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:50.393 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # 
local raid_level=raid1 00:20:50.393 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:50.393 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:50.393 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:50.393 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:50.393 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:50.393 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:50.393 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:50.393 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:50.393 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:50.393 "name": "raid_bdev1", 00:20:50.393 "uuid": "c8825b7b-650b-4b9d-bcbc-9f6131357184", 00:20:50.393 "strip_size_kb": 0, 00:20:50.393 "state": "online", 00:20:50.393 "raid_level": "raid1", 00:20:50.393 "superblock": true, 00:20:50.393 "num_base_bdevs": 2, 00:20:50.393 "num_base_bdevs_discovered": 2, 00:20:50.393 "num_base_bdevs_operational": 2, 00:20:50.393 "base_bdevs_list": [ 00:20:50.393 { 00:20:50.393 "name": "spare", 00:20:50.393 "uuid": "3c31acb8-e2bc-5795-94ca-5348f7280bd4", 00:20:50.393 "is_configured": true, 00:20:50.393 "data_offset": 2048, 00:20:50.393 "data_size": 63488 00:20:50.393 }, 00:20:50.393 { 00:20:50.393 "name": "BaseBdev2", 00:20:50.393 "uuid": "498edf1a-5b01-50ee-86b0-3c1227f4b7fe", 00:20:50.393 "is_configured": true, 00:20:50.393 "data_offset": 2048, 00:20:50.393 "data_size": 63488 00:20:50.393 } 00:20:50.393 ] 00:20:50.393 }' 00:20:50.393 17:33:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:50.393 17:33:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:50.961 17:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:51.220 [2024-07-15 17:33:02.364225] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:51.220 [2024-07-15 17:33:02.364243] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:51.220 [2024-07-15 17:33:02.364284] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:51.220 [2024-07-15 17:33:02.364322] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:51.220 [2024-07-15 17:33:02.364328] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1695390 name raid_bdev1, state offline 00:20:51.220 17:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:51.220 17:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:20:51.479 17:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:20:51.479 17:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:20:51.479 17:33:02 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:20:51.479 17:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:20:51.479 17:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:51.479 17:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:20:51.479 17:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:51.479 17:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:51.479 17:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:51.479 17:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:20:51.479 17:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:51.479 17:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:51.479 17:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:20:51.479 /dev/nbd0 00:20:51.739 17:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:51.739 17:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:51.739 17:33:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:20:51.740 17:33:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:20:51.740 17:33:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:51.740 17:33:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:51.740 17:33:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:20:51.740 17:33:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:20:51.740 17:33:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:51.740 17:33:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:51.740 17:33:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:51.740 1+0 records in 00:20:51.740 1+0 records out 00:20:51.740 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000246667 s, 16.6 MB/s 00:20:51.740 17:33:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:51.740 17:33:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:20:51.740 17:33:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:51.740 17:33:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:51.740 17:33:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:20:51.740 17:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:51.740 17:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:51.740 17:33:02 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:20:51.740 /dev/nbd1 00:20:51.740 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:51.740 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:51.740 17:33:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:20:51.740 17:33:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:20:51.740 17:33:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:51.740 17:33:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:51.740 17:33:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:20:51.740 17:33:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:20:51.740 17:33:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:51.740 17:33:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:51.740 17:33:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:51.740 1+0 records in 00:20:51.740 1+0 records out 00:20:51.740 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000284741 s, 14.4 MB/s 00:20:52.000 17:33:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:52.000 17:33:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:20:52.000 17:33:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:52.000 17:33:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:52.000 17:33:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:20:52.000 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:52.000 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:52.000 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:20:52.000 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:20:52.000 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:52.000 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:52.000 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:52.000 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:20:52.000 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:52.000 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:52.262 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:52.262 17:33:03 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:52.262 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:52.262 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:52.262 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:52.262 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:52.262 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:20:52.262 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:20:52.262 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:52.262 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:52.262 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:52.262 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:52.262 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:52.262 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:52.262 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:52.262 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:52.262 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:20:52.262 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:20:52.262 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:20:52.262 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:52.521 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:52.782 [2024-07-15 17:33:03.885905] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:52.782 [2024-07-15 17:33:03.885936] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:52.782 [2024-07-15 17:33:03.885949] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1693d60 00:20:52.782 [2024-07-15 17:33:03.885955] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:52.782 [2024-07-15 17:33:03.887322] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:52.782 [2024-07-15 17:33:03.887344] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:52.782 [2024-07-15 17:33:03.887400] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:20:52.782 [2024-07-15 17:33:03.887419] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:52.782 [2024-07-15 17:33:03.887497] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:52.782 spare 00:20:52.782 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 
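The trace above exports BaseBdev1 and the rebuilt spare over NBD, compares them with cmp while skipping the first 1 MiB (the region holding the RAID superblock/metadata), and then tears the NBD devices back down. A minimal standalone sketch of that comparison step, assuming the rpc.py path and the /var/tmp/spdk-raid.sock socket used in this run; the helper name compare_base_and_spare is hypothetical and not part of the test scripts:

    # Sketch only: mirrors the nbd_start_disk / cmp -i 1048576 / nbd_stop_disk sequence in the trace.
    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    compare_base_and_spare() {
        $rpc nbd_start_disk BaseBdev1 /dev/nbd0   # export the base bdev as a block device
        $rpc nbd_start_disk spare /dev/nbd1       # export the rebuilt spare
        cmp -i 1048576 /dev/nbd0 /dev/nbd1        # skip the first 1 MiB on both devices
        local rc=$?
        $rpc nbd_stop_disk /dev/nbd0
        $rpc nbd_stop_disk /dev/nbd1
        return $rc
    }

If the rebuild copied the data correctly, cmp exits 0 and the surrounding test continues; any byte difference past the skipped region makes the function return non-zero.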
00:20:52.782 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:52.782 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:52.782 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:52.782 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:52.782 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:52.782 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:52.782 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:52.782 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:52.782 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:52.782 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:52.782 17:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:52.782 [2024-07-15 17:33:03.987787] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14f1920 00:20:52.782 [2024-07-15 17:33:03.987798] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:52.782 [2024-07-15 17:33:03.987954] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x169e150 00:20:52.782 [2024-07-15 17:33:03.988069] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14f1920 00:20:52.782 [2024-07-15 17:33:03.988075] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14f1920 00:20:52.782 [2024-07-15 17:33:03.988151] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:53.043 17:33:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:53.043 "name": "raid_bdev1", 00:20:53.043 "uuid": "c8825b7b-650b-4b9d-bcbc-9f6131357184", 00:20:53.043 "strip_size_kb": 0, 00:20:53.043 "state": "online", 00:20:53.043 "raid_level": "raid1", 00:20:53.043 "superblock": true, 00:20:53.043 "num_base_bdevs": 2, 00:20:53.043 "num_base_bdevs_discovered": 2, 00:20:53.043 "num_base_bdevs_operational": 2, 00:20:53.043 "base_bdevs_list": [ 00:20:53.043 { 00:20:53.043 "name": "spare", 00:20:53.043 "uuid": "3c31acb8-e2bc-5795-94ca-5348f7280bd4", 00:20:53.043 "is_configured": true, 00:20:53.043 "data_offset": 2048, 00:20:53.043 "data_size": 63488 00:20:53.043 }, 00:20:53.043 { 00:20:53.043 "name": "BaseBdev2", 00:20:53.043 "uuid": "498edf1a-5b01-50ee-86b0-3c1227f4b7fe", 00:20:53.043 "is_configured": true, 00:20:53.043 "data_offset": 2048, 00:20:53.043 "data_size": 63488 00:20:53.043 } 00:20:53.043 ] 00:20:53.043 }' 00:20:53.043 17:33:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:53.043 17:33:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:53.616 17:33:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:53.616 17:33:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:53.616 17:33:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local 
process_type=none 00:20:53.616 17:33:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:53.616 17:33:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:53.616 17:33:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:53.616 17:33:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:53.616 17:33:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:53.616 "name": "raid_bdev1", 00:20:53.616 "uuid": "c8825b7b-650b-4b9d-bcbc-9f6131357184", 00:20:53.616 "strip_size_kb": 0, 00:20:53.616 "state": "online", 00:20:53.616 "raid_level": "raid1", 00:20:53.616 "superblock": true, 00:20:53.616 "num_base_bdevs": 2, 00:20:53.616 "num_base_bdevs_discovered": 2, 00:20:53.616 "num_base_bdevs_operational": 2, 00:20:53.616 "base_bdevs_list": [ 00:20:53.616 { 00:20:53.616 "name": "spare", 00:20:53.616 "uuid": "3c31acb8-e2bc-5795-94ca-5348f7280bd4", 00:20:53.616 "is_configured": true, 00:20:53.616 "data_offset": 2048, 00:20:53.616 "data_size": 63488 00:20:53.616 }, 00:20:53.616 { 00:20:53.616 "name": "BaseBdev2", 00:20:53.616 "uuid": "498edf1a-5b01-50ee-86b0-3c1227f4b7fe", 00:20:53.616 "is_configured": true, 00:20:53.616 "data_offset": 2048, 00:20:53.616 "data_size": 63488 00:20:53.616 } 00:20:53.616 ] 00:20:53.616 }' 00:20:53.616 17:33:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:53.616 17:33:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:53.616 17:33:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:53.880 17:33:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:53.880 17:33:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:53.880 17:33:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:20:53.880 17:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:20:53.880 17:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:54.162 [2024-07-15 17:33:05.269483] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:54.162 17:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:54.162 17:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:54.162 17:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:54.162 17:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:54.162 17:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:54.162 17:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:54.162 17:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:54.162 17:33:05 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:54.162 17:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:54.162 17:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:54.162 17:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:54.162 17:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:54.424 17:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:54.424 "name": "raid_bdev1", 00:20:54.424 "uuid": "c8825b7b-650b-4b9d-bcbc-9f6131357184", 00:20:54.424 "strip_size_kb": 0, 00:20:54.424 "state": "online", 00:20:54.424 "raid_level": "raid1", 00:20:54.424 "superblock": true, 00:20:54.424 "num_base_bdevs": 2, 00:20:54.424 "num_base_bdevs_discovered": 1, 00:20:54.424 "num_base_bdevs_operational": 1, 00:20:54.424 "base_bdevs_list": [ 00:20:54.424 { 00:20:54.424 "name": null, 00:20:54.424 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:54.424 "is_configured": false, 00:20:54.424 "data_offset": 2048, 00:20:54.424 "data_size": 63488 00:20:54.424 }, 00:20:54.424 { 00:20:54.424 "name": "BaseBdev2", 00:20:54.424 "uuid": "498edf1a-5b01-50ee-86b0-3c1227f4b7fe", 00:20:54.424 "is_configured": true, 00:20:54.424 "data_offset": 2048, 00:20:54.424 "data_size": 63488 00:20:54.424 } 00:20:54.424 ] 00:20:54.424 }' 00:20:54.424 17:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:54.424 17:33:05 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:54.996 17:33:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:54.996 [2024-07-15 17:33:06.195837] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:54.996 [2024-07-15 17:33:06.195933] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:20:54.996 [2024-07-15 17:33:06.195942] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
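Once the spare is re-added, the module logs "Started rebuild on raid bdev raid_bdev1" and the trace that follows loops on `(( SECONDS < timeout ))`, sleeping one second between polls of bdev_raid_get_bdevs. A rough sketch of that polling pattern, built from the same rpc.py invocation and jq filters seen in the trace; the function name wait_for_rebuild_done and the 60-second timeout are assumptions, not values taken from the script:

    # Sketch only: poll the raid bdev until no rebuild process is reported.
    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    wait_for_rebuild_done() {
        local timeout=60
        while (( SECONDS < timeout )); do   # SECONDS is bash's built-in elapsed-time counter
            info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
            ptype=$(jq -r '.process.type // "none"' <<< "$info")
            [[ $ptype == none ]] && return 0   # no in-progress process: rebuild has finished
            sleep 1
        done
        return 1   # timed out while the rebuild was still running
    }

While the rebuild is active the same JSON also carries a progress object (blocks/percent), which is what the 45% and 87% snapshots earlier in the trace come from.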
00:20:54.996 [2024-07-15 17:33:06.195961] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:54.996 [2024-07-15 17:33:06.199103] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14f2900 00:20:54.996 [2024-07-15 17:33:06.200658] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:54.996 17:33:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:20:55.939 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:55.939 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:55.939 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:55.939 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:55.939 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:55.939 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:55.939 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:56.200 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:56.200 "name": "raid_bdev1", 00:20:56.200 "uuid": "c8825b7b-650b-4b9d-bcbc-9f6131357184", 00:20:56.200 "strip_size_kb": 0, 00:20:56.200 "state": "online", 00:20:56.200 "raid_level": "raid1", 00:20:56.200 "superblock": true, 00:20:56.200 "num_base_bdevs": 2, 00:20:56.200 "num_base_bdevs_discovered": 2, 00:20:56.200 "num_base_bdevs_operational": 2, 00:20:56.200 "process": { 00:20:56.200 "type": "rebuild", 00:20:56.200 "target": "spare", 00:20:56.200 "progress": { 00:20:56.200 "blocks": 22528, 00:20:56.200 "percent": 35 00:20:56.200 } 00:20:56.200 }, 00:20:56.200 "base_bdevs_list": [ 00:20:56.200 { 00:20:56.200 "name": "spare", 00:20:56.200 "uuid": "3c31acb8-e2bc-5795-94ca-5348f7280bd4", 00:20:56.200 "is_configured": true, 00:20:56.200 "data_offset": 2048, 00:20:56.200 "data_size": 63488 00:20:56.200 }, 00:20:56.200 { 00:20:56.200 "name": "BaseBdev2", 00:20:56.200 "uuid": "498edf1a-5b01-50ee-86b0-3c1227f4b7fe", 00:20:56.200 "is_configured": true, 00:20:56.200 "data_offset": 2048, 00:20:56.200 "data_size": 63488 00:20:56.200 } 00:20:56.200 ] 00:20:56.200 }' 00:20:56.200 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:56.200 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:56.200 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:56.461 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:56.461 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:56.461 [2024-07-15 17:33:07.681442] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:56.461 [2024-07-15 17:33:07.709480] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:56.461 [2024-07-15 17:33:07.709511] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:20:56.461 [2024-07-15 17:33:07.709520] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:56.461 [2024-07-15 17:33:07.709525] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:56.461 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:56.461 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:56.461 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:56.461 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:56.461 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:56.461 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:56.461 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:56.461 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:56.461 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:56.461 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:56.461 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:56.461 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:56.722 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:56.722 "name": "raid_bdev1", 00:20:56.722 "uuid": "c8825b7b-650b-4b9d-bcbc-9f6131357184", 00:20:56.722 "strip_size_kb": 0, 00:20:56.722 "state": "online", 00:20:56.722 "raid_level": "raid1", 00:20:56.722 "superblock": true, 00:20:56.722 "num_base_bdevs": 2, 00:20:56.722 "num_base_bdevs_discovered": 1, 00:20:56.722 "num_base_bdevs_operational": 1, 00:20:56.722 "base_bdevs_list": [ 00:20:56.722 { 00:20:56.722 "name": null, 00:20:56.722 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:56.722 "is_configured": false, 00:20:56.722 "data_offset": 2048, 00:20:56.722 "data_size": 63488 00:20:56.722 }, 00:20:56.722 { 00:20:56.722 "name": "BaseBdev2", 00:20:56.722 "uuid": "498edf1a-5b01-50ee-86b0-3c1227f4b7fe", 00:20:56.722 "is_configured": true, 00:20:56.722 "data_offset": 2048, 00:20:56.722 "data_size": 63488 00:20:56.722 } 00:20:56.722 ] 00:20:56.722 }' 00:20:56.722 17:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:56.722 17:33:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:57.294 17:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:57.554 [2024-07-15 17:33:08.659882] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:57.554 [2024-07-15 17:33:08.659917] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:57.554 [2024-07-15 17:33:08.659931] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14f5410 00:20:57.554 [2024-07-15 17:33:08.659939] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: 
bdev claimed 00:20:57.554 [2024-07-15 17:33:08.660243] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:57.554 [2024-07-15 17:33:08.660255] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:57.554 [2024-07-15 17:33:08.660309] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:20:57.554 [2024-07-15 17:33:08.660317] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:20:57.554 [2024-07-15 17:33:08.660322] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:20:57.554 [2024-07-15 17:33:08.660333] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:57.554 [2024-07-15 17:33:08.663685] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14f18f0 00:20:57.554 [2024-07-15 17:33:08.664823] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:57.554 spare 00:20:57.554 17:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:20:58.495 17:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:58.495 17:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:58.495 17:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:58.495 17:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:58.495 17:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:58.495 17:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.495 17:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:58.756 17:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:58.756 "name": "raid_bdev1", 00:20:58.756 "uuid": "c8825b7b-650b-4b9d-bcbc-9f6131357184", 00:20:58.756 "strip_size_kb": 0, 00:20:58.756 "state": "online", 00:20:58.756 "raid_level": "raid1", 00:20:58.756 "superblock": true, 00:20:58.756 "num_base_bdevs": 2, 00:20:58.756 "num_base_bdevs_discovered": 2, 00:20:58.756 "num_base_bdevs_operational": 2, 00:20:58.756 "process": { 00:20:58.756 "type": "rebuild", 00:20:58.756 "target": "spare", 00:20:58.756 "progress": { 00:20:58.756 "blocks": 24576, 00:20:58.757 "percent": 38 00:20:58.757 } 00:20:58.757 }, 00:20:58.757 "base_bdevs_list": [ 00:20:58.757 { 00:20:58.757 "name": "spare", 00:20:58.757 "uuid": "3c31acb8-e2bc-5795-94ca-5348f7280bd4", 00:20:58.757 "is_configured": true, 00:20:58.757 "data_offset": 2048, 00:20:58.757 "data_size": 63488 00:20:58.757 }, 00:20:58.757 { 00:20:58.757 "name": "BaseBdev2", 00:20:58.757 "uuid": "498edf1a-5b01-50ee-86b0-3c1227f4b7fe", 00:20:58.757 "is_configured": true, 00:20:58.757 "data_offset": 2048, 00:20:58.757 "data_size": 63488 00:20:58.757 } 00:20:58.757 ] 00:20:58.757 }' 00:20:58.757 17:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:58.757 17:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:58.757 17:33:09 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:58.757 17:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:58.757 17:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:59.018 [2024-07-15 17:33:10.165411] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:59.018 [2024-07-15 17:33:10.173700] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:59.018 [2024-07-15 17:33:10.173735] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:59.018 [2024-07-15 17:33:10.173750] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:59.018 [2024-07-15 17:33:10.173755] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:59.018 17:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:59.018 17:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:59.018 17:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:59.018 17:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:59.018 17:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:59.018 17:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:59.018 17:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:59.018 17:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:59.018 17:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:59.019 17:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:59.019 17:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:59.019 17:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:59.278 17:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:59.278 "name": "raid_bdev1", 00:20:59.278 "uuid": "c8825b7b-650b-4b9d-bcbc-9f6131357184", 00:20:59.278 "strip_size_kb": 0, 00:20:59.278 "state": "online", 00:20:59.278 "raid_level": "raid1", 00:20:59.278 "superblock": true, 00:20:59.278 "num_base_bdevs": 2, 00:20:59.278 "num_base_bdevs_discovered": 1, 00:20:59.278 "num_base_bdevs_operational": 1, 00:20:59.278 "base_bdevs_list": [ 00:20:59.278 { 00:20:59.278 "name": null, 00:20:59.278 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:59.278 "is_configured": false, 00:20:59.278 "data_offset": 2048, 00:20:59.278 "data_size": 63488 00:20:59.278 }, 00:20:59.278 { 00:20:59.278 "name": "BaseBdev2", 00:20:59.278 "uuid": "498edf1a-5b01-50ee-86b0-3c1227f4b7fe", 00:20:59.278 "is_configured": true, 00:20:59.278 "data_offset": 2048, 00:20:59.278 "data_size": 63488 00:20:59.278 } 00:20:59.278 ] 00:20:59.278 }' 00:20:59.278 17:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:59.278 17:33:10 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:20:59.849 17:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:59.849 17:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:59.849 17:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:59.849 17:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:59.849 17:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:59.849 17:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:59.849 17:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:59.849 17:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:59.849 "name": "raid_bdev1", 00:20:59.849 "uuid": "c8825b7b-650b-4b9d-bcbc-9f6131357184", 00:20:59.849 "strip_size_kb": 0, 00:20:59.849 "state": "online", 00:20:59.849 "raid_level": "raid1", 00:20:59.849 "superblock": true, 00:20:59.849 "num_base_bdevs": 2, 00:20:59.849 "num_base_bdevs_discovered": 1, 00:20:59.849 "num_base_bdevs_operational": 1, 00:20:59.849 "base_bdevs_list": [ 00:20:59.849 { 00:20:59.849 "name": null, 00:20:59.849 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:59.849 "is_configured": false, 00:20:59.849 "data_offset": 2048, 00:20:59.849 "data_size": 63488 00:20:59.849 }, 00:20:59.849 { 00:20:59.849 "name": "BaseBdev2", 00:20:59.849 "uuid": "498edf1a-5b01-50ee-86b0-3c1227f4b7fe", 00:20:59.849 "is_configured": true, 00:20:59.849 "data_offset": 2048, 00:20:59.849 "data_size": 63488 00:20:59.849 } 00:20:59.849 ] 00:20:59.849 }' 00:20:59.849 17:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:00.109 17:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:00.109 17:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:00.109 17:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:00.109 17:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:21:00.110 17:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:00.371 [2024-07-15 17:33:11.573143] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:00.371 [2024-07-15 17:33:11.573175] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:00.371 [2024-07-15 17:33:11.573187] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14f9cf0 00:21:00.371 [2024-07-15 17:33:11.573193] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:00.371 [2024-07-15 17:33:11.573464] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:00.371 [2024-07-15 17:33:11.573475] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:00.371 [2024-07-15 17:33:11.573518] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:21:00.371 [2024-07-15 17:33:11.573525] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:21:00.371 [2024-07-15 17:33:11.573530] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:21:00.371 BaseBdev1 00:21:00.371 17:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:21:01.311 17:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:01.311 17:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:01.311 17:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:01.311 17:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:01.311 17:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:01.311 17:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:01.311 17:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:01.311 17:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:01.311 17:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:01.311 17:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:01.311 17:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:01.311 17:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:01.570 17:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:01.570 "name": "raid_bdev1", 00:21:01.570 "uuid": "c8825b7b-650b-4b9d-bcbc-9f6131357184", 00:21:01.570 "strip_size_kb": 0, 00:21:01.570 "state": "online", 00:21:01.570 "raid_level": "raid1", 00:21:01.570 "superblock": true, 00:21:01.570 "num_base_bdevs": 2, 00:21:01.570 "num_base_bdevs_discovered": 1, 00:21:01.570 "num_base_bdevs_operational": 1, 00:21:01.570 "base_bdevs_list": [ 00:21:01.570 { 00:21:01.570 "name": null, 00:21:01.570 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:01.570 "is_configured": false, 00:21:01.570 "data_offset": 2048, 00:21:01.570 "data_size": 63488 00:21:01.570 }, 00:21:01.570 { 00:21:01.570 "name": "BaseBdev2", 00:21:01.570 "uuid": "498edf1a-5b01-50ee-86b0-3c1227f4b7fe", 00:21:01.570 "is_configured": true, 00:21:01.570 "data_offset": 2048, 00:21:01.570 "data_size": 63488 00:21:01.570 } 00:21:01.570 ] 00:21:01.570 }' 00:21:01.570 17:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:01.570 17:33:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:02.165 17:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:02.165 17:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:02.165 17:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:02.165 17:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # 
local target=none 00:21:02.165 17:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:02.165 17:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:02.165 17:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:02.428 17:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:02.428 "name": "raid_bdev1", 00:21:02.428 "uuid": "c8825b7b-650b-4b9d-bcbc-9f6131357184", 00:21:02.428 "strip_size_kb": 0, 00:21:02.428 "state": "online", 00:21:02.428 "raid_level": "raid1", 00:21:02.428 "superblock": true, 00:21:02.428 "num_base_bdevs": 2, 00:21:02.428 "num_base_bdevs_discovered": 1, 00:21:02.428 "num_base_bdevs_operational": 1, 00:21:02.428 "base_bdevs_list": [ 00:21:02.428 { 00:21:02.428 "name": null, 00:21:02.428 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:02.428 "is_configured": false, 00:21:02.428 "data_offset": 2048, 00:21:02.428 "data_size": 63488 00:21:02.428 }, 00:21:02.428 { 00:21:02.428 "name": "BaseBdev2", 00:21:02.428 "uuid": "498edf1a-5b01-50ee-86b0-3c1227f4b7fe", 00:21:02.428 "is_configured": true, 00:21:02.428 "data_offset": 2048, 00:21:02.428 "data_size": 63488 00:21:02.428 } 00:21:02.428 ] 00:21:02.428 }' 00:21:02.428 17:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:02.428 17:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:02.428 17:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:02.428 17:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:02.428 17:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:02.428 17:33:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:21:02.428 17:33:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:02.428 17:33:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:02.428 17:33:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:02.428 17:33:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:02.428 17:33:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:02.428 17:33:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:02.428 17:33:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:02.428 17:33:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:02.428 17:33:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:02.428 17:33:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:02.689 [2024-07-15 17:33:13.794776] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:02.689 [2024-07-15 17:33:13.794864] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:21:02.689 [2024-07-15 17:33:13.794872] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:21:02.689 request: 00:21:02.689 { 00:21:02.689 "base_bdev": "BaseBdev1", 00:21:02.689 "raid_bdev": "raid_bdev1", 00:21:02.689 "method": "bdev_raid_add_base_bdev", 00:21:02.689 "req_id": 1 00:21:02.689 } 00:21:02.689 Got JSON-RPC error response 00:21:02.689 response: 00:21:02.689 { 00:21:02.689 "code": -22, 00:21:02.689 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:21:02.689 } 00:21:02.689 17:33:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:21:02.689 17:33:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:21:02.689 17:33:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:21:02.689 17:33:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:21:02.689 17:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:21:03.632 17:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:03.632 17:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:03.632 17:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:03.632 17:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:03.632 17:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:03.632 17:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:03.632 17:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:03.632 17:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:03.632 17:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:03.632 17:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:03.632 17:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.632 17:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:03.893 17:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:03.893 "name": "raid_bdev1", 00:21:03.893 "uuid": "c8825b7b-650b-4b9d-bcbc-9f6131357184", 00:21:03.893 "strip_size_kb": 0, 00:21:03.893 "state": "online", 00:21:03.893 "raid_level": "raid1", 00:21:03.893 "superblock": true, 00:21:03.893 "num_base_bdevs": 2, 00:21:03.893 "num_base_bdevs_discovered": 1, 00:21:03.893 "num_base_bdevs_operational": 1, 00:21:03.893 
"base_bdevs_list": [ 00:21:03.893 { 00:21:03.893 "name": null, 00:21:03.893 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:03.893 "is_configured": false, 00:21:03.893 "data_offset": 2048, 00:21:03.893 "data_size": 63488 00:21:03.893 }, 00:21:03.893 { 00:21:03.893 "name": "BaseBdev2", 00:21:03.893 "uuid": "498edf1a-5b01-50ee-86b0-3c1227f4b7fe", 00:21:03.893 "is_configured": true, 00:21:03.893 "data_offset": 2048, 00:21:03.893 "data_size": 63488 00:21:03.893 } 00:21:03.893 ] 00:21:03.893 }' 00:21:03.893 17:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:03.893 17:33:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:04.464 17:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:04.464 17:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:04.464 17:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:04.464 17:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:04.464 17:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:04.464 17:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:04.464 17:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:04.724 17:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:04.724 "name": "raid_bdev1", 00:21:04.724 "uuid": "c8825b7b-650b-4b9d-bcbc-9f6131357184", 00:21:04.724 "strip_size_kb": 0, 00:21:04.724 "state": "online", 00:21:04.724 "raid_level": "raid1", 00:21:04.724 "superblock": true, 00:21:04.724 "num_base_bdevs": 2, 00:21:04.724 "num_base_bdevs_discovered": 1, 00:21:04.724 "num_base_bdevs_operational": 1, 00:21:04.724 "base_bdevs_list": [ 00:21:04.724 { 00:21:04.724 "name": null, 00:21:04.724 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:04.724 "is_configured": false, 00:21:04.724 "data_offset": 2048, 00:21:04.724 "data_size": 63488 00:21:04.724 }, 00:21:04.724 { 00:21:04.724 "name": "BaseBdev2", 00:21:04.724 "uuid": "498edf1a-5b01-50ee-86b0-3c1227f4b7fe", 00:21:04.724 "is_configured": true, 00:21:04.724 "data_offset": 2048, 00:21:04.724 "data_size": 63488 00:21:04.724 } 00:21:04.724 ] 00:21:04.724 }' 00:21:04.724 17:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:04.724 17:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:04.724 17:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:04.724 17:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:04.724 17:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 2858659 00:21:04.724 17:33:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2858659 ']' 00:21:04.724 17:33:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 2858659 00:21:04.724 17:33:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:21:04.724 17:33:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:04.724 17:33:15 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2858659 00:21:04.725 17:33:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:04.725 17:33:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:04.725 17:33:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2858659' 00:21:04.725 killing process with pid 2858659 00:21:04.725 17:33:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 2858659 00:21:04.725 Received shutdown signal, test time was about 60.000000 seconds 00:21:04.725 00:21:04.725 Latency(us) 00:21:04.725 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:04.725 =================================================================================================================== 00:21:04.725 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:21:04.725 [2024-07-15 17:33:15.914181] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:04.725 [2024-07-15 17:33:15.914246] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:04.725 [2024-07-15 17:33:15.914276] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:04.725 [2024-07-15 17:33:15.914282] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14f1920 name raid_bdev1, state offline 00:21:04.725 17:33:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 2858659 00:21:04.725 [2024-07-15 17:33:15.929299] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:21:04.986 00:21:04.986 real 0m32.112s 00:21:04.986 user 0m46.770s 00:21:04.986 sys 0m4.682s 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:04.986 ************************************ 00:21:04.986 END TEST raid_rebuild_test_sb 00:21:04.986 ************************************ 00:21:04.986 17:33:16 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:04.986 17:33:16 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:21:04.986 17:33:16 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:21:04.986 17:33:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:04.986 17:33:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:04.986 ************************************ 00:21:04.986 START TEST raid_rebuild_test_io 00:21:04.986 ************************************ 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false true true 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 
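For orientation while reading the raid_rebuild_test_io trace that follows: the harness first assembles the raid1 volume out of two malloc-backed passthru bdevs over the dedicated RPC socket. The condensed sketch below is not the test script itself, only a hedged summary of the rpc.py calls that appear verbatim later in this section (same names, sizes and socket):

    # Hedged sketch: how the base bdevs and raid_bdev1 in this run are assembled.
    # Every command below also appears later in the trace; this is only a summary.
    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # 32 MB malloc bdevs with 512-byte blocks, each wrapped in a passthru bdev
    $RPC bdev_malloc_create 32 512 -b BaseBdev1_malloc
    $RPC bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
    $RPC bdev_malloc_create 32 512 -b BaseBdev2_malloc
    $RPC bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2

    # raid1 across the two passthru bdevs; no superblock in this variant (superblock=false)
    $RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1

Rebuild is then exercised by removing a base bdev (bdev_raid_remove_base_bdev) and later adding a spare (bdev_raid_add_base_bdev raid_bdev1 spare), as the RPC calls further down in this section show.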
00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2865158 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2865158 /var/tmp/spdk-raid.sock 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 2865158 ']' 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:04.986 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:21:04.986 17:33:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:21:04.986 [2024-07-15 17:33:16.188541] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
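The bdevperf invocation above provides the background I/O for this rebuild test: a 60-second random read/write mix at 50% reads, 3 MiB requests at queue depth 2 (hence the note below that the 3145728-byte I/O size exceeds the zero-copy threshold). Because bdevperf is started with -z, it sits idle until the test tells it to run; as a hedged sketch of that flow, grounded only in the commands visible in this log:

    # Hedged sketch of the background-I/O flow in this run (paths as in the log).
    BDEVPERF=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf
    PERF_PY=/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py

    # Launch bdevperf on its own RPC socket; with -z it waits instead of starting I/O,
    # since raid_bdev1 does not exist yet at this point.
    $BDEVPERF -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &

    # ... base bdevs and raid_bdev1 are created over the same socket with rpc.py ...

    # Only then does the test trigger the actual 60-second run:
    $PERF_PY -s /var/tmp/spdk-raid.sock perform_tests &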
00:21:04.986 [2024-07-15 17:33:16.188592] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2865158 ] 00:21:04.986 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:04.986 Zero copy mechanism will not be used. 00:21:04.986 [2024-07-15 17:33:16.277506] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:05.246 [2024-07-15 17:33:16.345162] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:05.246 [2024-07-15 17:33:16.385421] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:05.246 [2024-07-15 17:33:16.385444] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:06.187 17:33:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:06.187 17:33:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:21:06.187 17:33:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:06.187 17:33:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:06.758 BaseBdev1_malloc 00:21:06.758 17:33:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:07.329 [2024-07-15 17:33:18.426077] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:07.329 [2024-07-15 17:33:18.426115] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:07.329 [2024-07-15 17:33:18.426129] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cfad30 00:21:07.329 [2024-07-15 17:33:18.426136] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:07.329 [2024-07-15 17:33:18.427448] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:07.329 [2024-07-15 17:33:18.427468] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:07.329 BaseBdev1 00:21:07.329 17:33:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:07.329 17:33:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:07.900 BaseBdev2_malloc 00:21:07.900 17:33:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:21:08.471 [2024-07-15 17:33:19.506792] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:21:08.471 [2024-07-15 17:33:19.506827] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:08.471 [2024-07-15 17:33:19.506839] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1eadc60 00:21:08.471 [2024-07-15 17:33:19.506846] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:08.472 [2024-07-15 17:33:19.508081] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:08.472 [2024-07-15 17:33:19.508100] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:08.472 BaseBdev2 00:21:08.472 17:33:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:21:09.043 spare_malloc 00:21:09.043 17:33:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:21:09.043 spare_delay 00:21:09.043 17:33:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:09.614 [2024-07-15 17:33:20.783947] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:09.614 [2024-07-15 17:33:20.783978] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:09.614 [2024-07-15 17:33:20.783989] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e9dec0 00:21:09.614 [2024-07-15 17:33:20.783996] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:09.614 [2024-07-15 17:33:20.785212] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:09.614 [2024-07-15 17:33:20.785231] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:09.614 spare 00:21:09.614 17:33:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:21:10.184 [2024-07-15 17:33:21.325311] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:10.184 [2024-07-15 17:33:21.326327] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:10.184 [2024-07-15 17:33:21.326383] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e95390 00:21:10.184 [2024-07-15 17:33:21.326389] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:21:10.184 [2024-07-15 17:33:21.326545] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cf57f0 00:21:10.184 [2024-07-15 17:33:21.326657] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e95390 00:21:10.184 [2024-07-15 17:33:21.326663] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e95390 00:21:10.184 [2024-07-15 17:33:21.326750] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:10.184 17:33:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:10.184 17:33:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:10.184 17:33:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:10.184 17:33:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:10.184 17:33:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:10.184 17:33:21 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:10.184 17:33:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:10.184 17:33:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:10.184 17:33:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:10.184 17:33:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:10.184 17:33:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:10.184 17:33:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:10.753 17:33:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:10.753 "name": "raid_bdev1", 00:21:10.753 "uuid": "b46aed88-c3a8-4f07-8cde-088fd736352d", 00:21:10.753 "strip_size_kb": 0, 00:21:10.753 "state": "online", 00:21:10.753 "raid_level": "raid1", 00:21:10.753 "superblock": false, 00:21:10.753 "num_base_bdevs": 2, 00:21:10.753 "num_base_bdevs_discovered": 2, 00:21:10.753 "num_base_bdevs_operational": 2, 00:21:10.753 "base_bdevs_list": [ 00:21:10.753 { 00:21:10.753 "name": "BaseBdev1", 00:21:10.753 "uuid": "d68cae16-ec9a-5a24-bc1f-1cc415488beb", 00:21:10.753 "is_configured": true, 00:21:10.753 "data_offset": 0, 00:21:10.753 "data_size": 65536 00:21:10.753 }, 00:21:10.753 { 00:21:10.753 "name": "BaseBdev2", 00:21:10.753 "uuid": "11254a92-1553-5880-9bc5-6f1b8119bbcc", 00:21:10.753 "is_configured": true, 00:21:10.753 "data_offset": 0, 00:21:10.753 "data_size": 65536 00:21:10.753 } 00:21:10.753 ] 00:21:10.753 }' 00:21:10.753 17:33:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:10.753 17:33:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:21:11.323 17:33:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:11.323 17:33:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:21:11.582 [2024-07-15 17:33:22.688954] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:11.582 17:33:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:21:11.582 17:33:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:11.582 17:33:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:21:11.851 17:33:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:21:11.851 17:33:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:21:11.851 17:33:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:21:11.851 17:33:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:11.851 [2024-07-15 17:33:22.990985] bdev_raid.c: 251:raid_bdev_create_cb: 
*DEBUG*: raid_bdev_create_cb, 0x1cf30d0 00:21:11.851 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:11.851 Zero copy mechanism will not be used. 00:21:11.851 Running I/O for 60 seconds... 00:21:11.852 [2024-07-15 17:33:23.073532] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:11.852 [2024-07-15 17:33:23.079930] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1cf30d0 00:21:11.852 17:33:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:11.852 17:33:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:11.852 17:33:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:11.852 17:33:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:11.852 17:33:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:11.852 17:33:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:11.852 17:33:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:11.852 17:33:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:11.852 17:33:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:11.852 17:33:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:11.852 17:33:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:11.852 17:33:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:12.437 17:33:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:12.437 "name": "raid_bdev1", 00:21:12.437 "uuid": "b46aed88-c3a8-4f07-8cde-088fd736352d", 00:21:12.437 "strip_size_kb": 0, 00:21:12.437 "state": "online", 00:21:12.437 "raid_level": "raid1", 00:21:12.437 "superblock": false, 00:21:12.437 "num_base_bdevs": 2, 00:21:12.437 "num_base_bdevs_discovered": 1, 00:21:12.437 "num_base_bdevs_operational": 1, 00:21:12.437 "base_bdevs_list": [ 00:21:12.437 { 00:21:12.437 "name": null, 00:21:12.437 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:12.437 "is_configured": false, 00:21:12.437 "data_offset": 0, 00:21:12.437 "data_size": 65536 00:21:12.437 }, 00:21:12.437 { 00:21:12.437 "name": "BaseBdev2", 00:21:12.437 "uuid": "11254a92-1553-5880-9bc5-6f1b8119bbcc", 00:21:12.437 "is_configured": true, 00:21:12.437 "data_offset": 0, 00:21:12.437 "data_size": 65536 00:21:12.437 } 00:21:12.437 ] 00:21:12.437 }' 00:21:12.437 17:33:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:12.437 17:33:23 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:21:13.008 17:33:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:13.269 [2024-07-15 17:33:24.419900] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:13.269 [2024-07-15 17:33:24.458470] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a00b30 00:21:13.269 [2024-07-15 17:33:24.460071] 
bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:13.269 17:33:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:21:13.528 [2024-07-15 17:33:24.580870] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:13.528 [2024-07-15 17:33:24.581095] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:13.528 [2024-07-15 17:33:24.788894] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:13.528 [2024-07-15 17:33:24.789002] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:14.098 [2024-07-15 17:33:25.239096] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:21:14.098 [2024-07-15 17:33:25.239202] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:21:14.358 17:33:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:14.358 17:33:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:14.358 17:33:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:14.358 17:33:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:14.358 17:33:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:14.358 17:33:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:14.358 17:33:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:14.358 [2024-07-15 17:33:25.561911] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:21:14.358 [2024-07-15 17:33:25.562150] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:21:14.619 17:33:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:14.619 "name": "raid_bdev1", 00:21:14.619 "uuid": "b46aed88-c3a8-4f07-8cde-088fd736352d", 00:21:14.619 "strip_size_kb": 0, 00:21:14.619 "state": "online", 00:21:14.619 "raid_level": "raid1", 00:21:14.619 "superblock": false, 00:21:14.619 "num_base_bdevs": 2, 00:21:14.619 "num_base_bdevs_discovered": 2, 00:21:14.619 "num_base_bdevs_operational": 2, 00:21:14.619 "process": { 00:21:14.619 "type": "rebuild", 00:21:14.619 "target": "spare", 00:21:14.619 "progress": { 00:21:14.619 "blocks": 14336, 00:21:14.619 "percent": 21 00:21:14.619 } 00:21:14.619 }, 00:21:14.619 "base_bdevs_list": [ 00:21:14.619 { 00:21:14.619 "name": "spare", 00:21:14.619 "uuid": "b323bc7c-fff5-5aac-bbd9-093fcab3ea73", 00:21:14.619 "is_configured": true, 00:21:14.619 "data_offset": 0, 00:21:14.619 "data_size": 65536 00:21:14.619 }, 00:21:14.619 { 00:21:14.619 "name": "BaseBdev2", 00:21:14.619 "uuid": "11254a92-1553-5880-9bc5-6f1b8119bbcc", 00:21:14.619 "is_configured": true, 00:21:14.619 "data_offset": 0, 00:21:14.619 "data_size": 65536 00:21:14.619 } 00:21:14.619 ] 00:21:14.619 }' 00:21:14.619 
17:33:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:14.619 17:33:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:14.619 17:33:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:14.619 17:33:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:14.619 17:33:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:14.619 [2024-07-15 17:33:25.763893] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:14.619 [2024-07-15 17:33:25.764010] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:14.879 [2024-07-15 17:33:25.936826] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:14.879 [2024-07-15 17:33:26.074423] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:14.879 [2024-07-15 17:33:26.082090] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:14.879 [2024-07-15 17:33:26.082118] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:14.879 [2024-07-15 17:33:26.082124] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:14.879 [2024-07-15 17:33:26.104966] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1cf30d0 00:21:14.879 17:33:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:14.879 17:33:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:14.879 17:33:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:14.879 17:33:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:14.879 17:33:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:14.879 17:33:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:14.879 17:33:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:14.879 17:33:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:14.879 17:33:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:14.879 17:33:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:14.879 17:33:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:14.879 17:33:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:15.139 17:33:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:15.139 "name": "raid_bdev1", 00:21:15.139 "uuid": "b46aed88-c3a8-4f07-8cde-088fd736352d", 00:21:15.139 "strip_size_kb": 0, 00:21:15.139 "state": "online", 00:21:15.139 "raid_level": "raid1", 00:21:15.139 "superblock": false, 00:21:15.139 "num_base_bdevs": 2, 00:21:15.139 
"num_base_bdevs_discovered": 1, 00:21:15.139 "num_base_bdevs_operational": 1, 00:21:15.139 "base_bdevs_list": [ 00:21:15.139 { 00:21:15.139 "name": null, 00:21:15.139 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:15.139 "is_configured": false, 00:21:15.139 "data_offset": 0, 00:21:15.139 "data_size": 65536 00:21:15.139 }, 00:21:15.139 { 00:21:15.139 "name": "BaseBdev2", 00:21:15.139 "uuid": "11254a92-1553-5880-9bc5-6f1b8119bbcc", 00:21:15.139 "is_configured": true, 00:21:15.139 "data_offset": 0, 00:21:15.139 "data_size": 65536 00:21:15.139 } 00:21:15.139 ] 00:21:15.139 }' 00:21:15.139 17:33:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:15.139 17:33:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:21:15.707 17:33:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:15.707 17:33:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:15.707 17:33:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:15.707 17:33:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:15.707 17:33:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:15.707 17:33:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:15.707 17:33:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:15.966 17:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:15.966 "name": "raid_bdev1", 00:21:15.966 "uuid": "b46aed88-c3a8-4f07-8cde-088fd736352d", 00:21:15.966 "strip_size_kb": 0, 00:21:15.966 "state": "online", 00:21:15.966 "raid_level": "raid1", 00:21:15.966 "superblock": false, 00:21:15.966 "num_base_bdevs": 2, 00:21:15.966 "num_base_bdevs_discovered": 1, 00:21:15.966 "num_base_bdevs_operational": 1, 00:21:15.966 "base_bdevs_list": [ 00:21:15.966 { 00:21:15.966 "name": null, 00:21:15.966 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:15.966 "is_configured": false, 00:21:15.966 "data_offset": 0, 00:21:15.966 "data_size": 65536 00:21:15.966 }, 00:21:15.966 { 00:21:15.966 "name": "BaseBdev2", 00:21:15.966 "uuid": "11254a92-1553-5880-9bc5-6f1b8119bbcc", 00:21:15.966 "is_configured": true, 00:21:15.966 "data_offset": 0, 00:21:15.966 "data_size": 65536 00:21:15.966 } 00:21:15.966 ] 00:21:15.966 }' 00:21:15.966 17:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:15.966 17:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:15.966 17:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:15.966 17:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:15.967 17:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:16.226 [2024-07-15 17:33:27.328677] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:16.226 [2024-07-15 17:33:27.386208] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a00b30 00:21:16.226 [2024-07-15 
17:33:27.387342] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:16.226 17:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:21:16.226 [2024-07-15 17:33:27.508023] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:16.226 [2024-07-15 17:33:27.508278] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:16.486 [2024-07-15 17:33:27.723496] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:16.486 [2024-07-15 17:33:27.723602] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:17.058 [2024-07-15 17:33:28.060455] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:21:17.058 [2024-07-15 17:33:28.195040] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:21:17.058 [2024-07-15 17:33:28.195144] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:21:17.355 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:17.355 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:17.355 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:17.355 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:17.355 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:17.355 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:17.355 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:17.355 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:17.355 "name": "raid_bdev1", 00:21:17.355 "uuid": "b46aed88-c3a8-4f07-8cde-088fd736352d", 00:21:17.355 "strip_size_kb": 0, 00:21:17.355 "state": "online", 00:21:17.355 "raid_level": "raid1", 00:21:17.355 "superblock": false, 00:21:17.356 "num_base_bdevs": 2, 00:21:17.356 "num_base_bdevs_discovered": 2, 00:21:17.356 "num_base_bdevs_operational": 2, 00:21:17.356 "process": { 00:21:17.356 "type": "rebuild", 00:21:17.356 "target": "spare", 00:21:17.356 "progress": { 00:21:17.356 "blocks": 14336, 00:21:17.356 "percent": 21 00:21:17.356 } 00:21:17.356 }, 00:21:17.356 "base_bdevs_list": [ 00:21:17.356 { 00:21:17.356 "name": "spare", 00:21:17.356 "uuid": "b323bc7c-fff5-5aac-bbd9-093fcab3ea73", 00:21:17.356 "is_configured": true, 00:21:17.356 "data_offset": 0, 00:21:17.356 "data_size": 65536 00:21:17.356 }, 00:21:17.356 { 00:21:17.356 "name": "BaseBdev2", 00:21:17.356 "uuid": "11254a92-1553-5880-9bc5-6f1b8119bbcc", 00:21:17.356 "is_configured": true, 00:21:17.356 "data_offset": 0, 00:21:17.356 "data_size": 65536 00:21:17.356 } 00:21:17.356 ] 00:21:17.356 }' 00:21:17.356 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:17.356 17:33:28 bdev_raid.raid_rebuild_test_io 
-- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:17.356 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:17.615 [2024-07-15 17:33:28.673668] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:17.615 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:17.615 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:21:17.615 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:21:17.615 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:21:17.615 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:21:17.615 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=702 00:21:17.615 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:17.615 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:17.615 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:17.615 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:17.615 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:17.615 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:17.615 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:17.615 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:17.615 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:17.615 "name": "raid_bdev1", 00:21:17.615 "uuid": "b46aed88-c3a8-4f07-8cde-088fd736352d", 00:21:17.616 "strip_size_kb": 0, 00:21:17.616 "state": "online", 00:21:17.616 "raid_level": "raid1", 00:21:17.616 "superblock": false, 00:21:17.616 "num_base_bdevs": 2, 00:21:17.616 "num_base_bdevs_discovered": 2, 00:21:17.616 "num_base_bdevs_operational": 2, 00:21:17.616 "process": { 00:21:17.616 "type": "rebuild", 00:21:17.616 "target": "spare", 00:21:17.616 "progress": { 00:21:17.616 "blocks": 16384, 00:21:17.616 "percent": 25 00:21:17.616 } 00:21:17.616 }, 00:21:17.616 "base_bdevs_list": [ 00:21:17.616 { 00:21:17.616 "name": "spare", 00:21:17.616 "uuid": "b323bc7c-fff5-5aac-bbd9-093fcab3ea73", 00:21:17.616 "is_configured": true, 00:21:17.616 "data_offset": 0, 00:21:17.616 "data_size": 65536 00:21:17.616 }, 00:21:17.616 { 00:21:17.616 "name": "BaseBdev2", 00:21:17.616 "uuid": "11254a92-1553-5880-9bc5-6f1b8119bbcc", 00:21:17.616 "is_configured": true, 00:21:17.616 "data_offset": 0, 00:21:17.616 "data_size": 65536 00:21:17.616 } 00:21:17.616 ] 00:21:17.616 }' 00:21:17.616 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:17.875 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:17.875 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:17.875 17:33:28 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:17.875 17:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:17.875 [2024-07-15 17:33:29.004267] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:21:18.135 [2024-07-15 17:33:29.205685] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:21:18.733 17:33:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:18.733 17:33:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:18.733 17:33:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:18.733 17:33:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:18.733 17:33:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:18.733 17:33:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:18.733 17:33:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:18.733 17:33:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:18.992 17:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:18.992 "name": "raid_bdev1", 00:21:18.992 "uuid": "b46aed88-c3a8-4f07-8cde-088fd736352d", 00:21:18.992 "strip_size_kb": 0, 00:21:18.992 "state": "online", 00:21:18.992 "raid_level": "raid1", 00:21:18.992 "superblock": false, 00:21:18.992 "num_base_bdevs": 2, 00:21:18.992 "num_base_bdevs_discovered": 2, 00:21:18.992 "num_base_bdevs_operational": 2, 00:21:18.992 "process": { 00:21:18.992 "type": "rebuild", 00:21:18.992 "target": "spare", 00:21:18.992 "progress": { 00:21:18.992 "blocks": 38912, 00:21:18.992 "percent": 59 00:21:18.992 } 00:21:18.992 }, 00:21:18.992 "base_bdevs_list": [ 00:21:18.992 { 00:21:18.992 "name": "spare", 00:21:18.992 "uuid": "b323bc7c-fff5-5aac-bbd9-093fcab3ea73", 00:21:18.992 "is_configured": true, 00:21:18.992 "data_offset": 0, 00:21:18.992 "data_size": 65536 00:21:18.992 }, 00:21:18.992 { 00:21:18.992 "name": "BaseBdev2", 00:21:18.992 "uuid": "11254a92-1553-5880-9bc5-6f1b8119bbcc", 00:21:18.992 "is_configured": true, 00:21:18.992 "data_offset": 0, 00:21:18.992 "data_size": 65536 00:21:18.992 } 00:21:18.992 ] 00:21:18.992 }' 00:21:18.992 17:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:18.992 [2024-07-15 17:33:30.211060] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:21:18.992 [2024-07-15 17:33:30.211213] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:21:18.992 17:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:18.992 17:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:18.992 17:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:18.992 17:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:20.375 17:33:31 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:20.375 17:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:20.375 17:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:20.375 17:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:20.375 17:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:20.375 17:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:20.375 17:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:20.375 17:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:20.375 17:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:20.375 "name": "raid_bdev1", 00:21:20.375 "uuid": "b46aed88-c3a8-4f07-8cde-088fd736352d", 00:21:20.375 "strip_size_kb": 0, 00:21:20.375 "state": "online", 00:21:20.375 "raid_level": "raid1", 00:21:20.375 "superblock": false, 00:21:20.375 "num_base_bdevs": 2, 00:21:20.375 "num_base_bdevs_discovered": 2, 00:21:20.375 "num_base_bdevs_operational": 2, 00:21:20.375 "process": { 00:21:20.375 "type": "rebuild", 00:21:20.375 "target": "spare", 00:21:20.375 "progress": { 00:21:20.375 "blocks": 59392, 00:21:20.375 "percent": 90 00:21:20.375 } 00:21:20.375 }, 00:21:20.375 "base_bdevs_list": [ 00:21:20.375 { 00:21:20.375 "name": "spare", 00:21:20.375 "uuid": "b323bc7c-fff5-5aac-bbd9-093fcab3ea73", 00:21:20.375 "is_configured": true, 00:21:20.375 "data_offset": 0, 00:21:20.375 "data_size": 65536 00:21:20.375 }, 00:21:20.375 { 00:21:20.375 "name": "BaseBdev2", 00:21:20.375 "uuid": "11254a92-1553-5880-9bc5-6f1b8119bbcc", 00:21:20.375 "is_configured": true, 00:21:20.375 "data_offset": 0, 00:21:20.375 "data_size": 65536 00:21:20.375 } 00:21:20.375 ] 00:21:20.375 }' 00:21:20.375 17:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:20.375 17:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:20.375 17:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:20.375 17:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:20.375 17:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:20.375 [2024-07-15 17:33:31.656508] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:21:20.634 [2024-07-15 17:33:31.753556] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:21:20.634 [2024-07-15 17:33:31.754742] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:21.573 17:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:21.573 17:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:21.573 17:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:21.573 17:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:21.573 
17:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:21.573 17:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:21.573 17:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:21.573 17:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:21.573 17:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:21.573 "name": "raid_bdev1", 00:21:21.573 "uuid": "b46aed88-c3a8-4f07-8cde-088fd736352d", 00:21:21.573 "strip_size_kb": 0, 00:21:21.573 "state": "online", 00:21:21.573 "raid_level": "raid1", 00:21:21.573 "superblock": false, 00:21:21.573 "num_base_bdevs": 2, 00:21:21.573 "num_base_bdevs_discovered": 2, 00:21:21.573 "num_base_bdevs_operational": 2, 00:21:21.573 "base_bdevs_list": [ 00:21:21.573 { 00:21:21.573 "name": "spare", 00:21:21.574 "uuid": "b323bc7c-fff5-5aac-bbd9-093fcab3ea73", 00:21:21.574 "is_configured": true, 00:21:21.574 "data_offset": 0, 00:21:21.574 "data_size": 65536 00:21:21.574 }, 00:21:21.574 { 00:21:21.574 "name": "BaseBdev2", 00:21:21.574 "uuid": "11254a92-1553-5880-9bc5-6f1b8119bbcc", 00:21:21.574 "is_configured": true, 00:21:21.574 "data_offset": 0, 00:21:21.574 "data_size": 65536 00:21:21.574 } 00:21:21.574 ] 00:21:21.574 }' 00:21:21.574 17:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:21.574 17:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:21:21.574 17:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:21.574 17:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:21:21.574 17:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:21:21.574 17:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:21.574 17:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:21.574 17:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:21.574 17:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:21.574 17:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:21.574 17:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:21.574 17:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:21.833 17:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:21.833 "name": "raid_bdev1", 00:21:21.834 "uuid": "b46aed88-c3a8-4f07-8cde-088fd736352d", 00:21:21.834 "strip_size_kb": 0, 00:21:21.834 "state": "online", 00:21:21.834 "raid_level": "raid1", 00:21:21.834 "superblock": false, 00:21:21.834 "num_base_bdevs": 2, 00:21:21.834 "num_base_bdevs_discovered": 2, 00:21:21.834 "num_base_bdevs_operational": 2, 00:21:21.834 "base_bdevs_list": [ 00:21:21.834 { 00:21:21.834 "name": "spare", 00:21:21.834 "uuid": "b323bc7c-fff5-5aac-bbd9-093fcab3ea73", 00:21:21.834 "is_configured": true, 00:21:21.834 
"data_offset": 0, 00:21:21.834 "data_size": 65536 00:21:21.834 }, 00:21:21.834 { 00:21:21.834 "name": "BaseBdev2", 00:21:21.834 "uuid": "11254a92-1553-5880-9bc5-6f1b8119bbcc", 00:21:21.834 "is_configured": true, 00:21:21.834 "data_offset": 0, 00:21:21.834 "data_size": 65536 00:21:21.834 } 00:21:21.834 ] 00:21:21.834 }' 00:21:21.834 17:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:21.834 17:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:21.834 17:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:22.094 17:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:22.094 17:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:22.094 17:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:22.094 17:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:22.094 17:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:22.094 17:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:22.094 17:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:22.094 17:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:22.094 17:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:22.094 17:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:22.094 17:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:22.094 17:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.094 17:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:22.094 17:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:22.094 "name": "raid_bdev1", 00:21:22.094 "uuid": "b46aed88-c3a8-4f07-8cde-088fd736352d", 00:21:22.094 "strip_size_kb": 0, 00:21:22.094 "state": "online", 00:21:22.094 "raid_level": "raid1", 00:21:22.094 "superblock": false, 00:21:22.094 "num_base_bdevs": 2, 00:21:22.094 "num_base_bdevs_discovered": 2, 00:21:22.094 "num_base_bdevs_operational": 2, 00:21:22.094 "base_bdevs_list": [ 00:21:22.094 { 00:21:22.094 "name": "spare", 00:21:22.094 "uuid": "b323bc7c-fff5-5aac-bbd9-093fcab3ea73", 00:21:22.094 "is_configured": true, 00:21:22.094 "data_offset": 0, 00:21:22.094 "data_size": 65536 00:21:22.094 }, 00:21:22.094 { 00:21:22.094 "name": "BaseBdev2", 00:21:22.094 "uuid": "11254a92-1553-5880-9bc5-6f1b8119bbcc", 00:21:22.094 "is_configured": true, 00:21:22.094 "data_offset": 0, 00:21:22.094 "data_size": 65536 00:21:22.094 } 00:21:22.094 ] 00:21:22.094 }' 00:21:22.094 17:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:22.094 17:33:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:21:22.663 17:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 
00:21:22.922 [2024-07-15 17:33:34.063790] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:22.922 [2024-07-15 17:33:34.063811] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:22.922 00:21:22.922 Latency(us) 00:21:22.922 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:22.922 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:21:22.922 raid_bdev1 : 11.11 115.49 346.47 0.00 0.00 11640.04 244.18 114536.76 00:21:22.922 =================================================================================================================== 00:21:22.922 Total : 115.49 346.47 0.00 0.00 11640.04 244.18 114536.76 00:21:22.922 [2024-07-15 17:33:34.131229] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:22.922 [2024-07-15 17:33:34.131252] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:22.922 [2024-07-15 17:33:34.131307] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:22.922 [2024-07-15 17:33:34.131313] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e95390 name raid_bdev1, state offline 00:21:22.922 0 00:21:22.922 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.922 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:21:23.183 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:21:23.183 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:21:23.183 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:21:23.183 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:21:23.183 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:23.183 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:21:23.183 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:23.183 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:23.183 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:23.183 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:21:23.183 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:23.183 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:23.183 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:21:23.443 /dev/nbd0 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:23.443 1+0 records in 00:21:23.443 1+0 records out 00:21:23.443 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000298797 s, 13.7 MB/s 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:23.443 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:21:23.704 /dev/nbd1 00:21:23.704 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:23.704 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:23.704 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:21:23.704 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 
-- # local i 00:21:23.704 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:23.704 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:23.704 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:21:23.704 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:21:23.704 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:23.704 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:23.704 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:23.704 1+0 records in 00:21:23.704 1+0 records out 00:21:23.704 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000276838 s, 14.8 MB/s 00:21:23.704 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:23.704 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:21:23.704 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:23.704 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:23.704 17:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:21:23.704 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:23.704 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:23.704 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:21:23.704 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:21:23.704 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:23.704 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:21:23.704 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:23.704 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:21:23.704 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:23.704 17:33:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:23.964 17:33:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:23.964 17:33:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:23.964 17:33:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:23.964 17:33:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:23.964 17:33:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:23.964 17:33:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:23.964 17:33:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:21:23.964 17:33:35 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@45 -- # return 0 00:21:23.964 17:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:23.964 17:33:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:23.964 17:33:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:23.964 17:33:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:23.964 17:33:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:21:23.964 17:33:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:23.964 17:33:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:24.224 17:33:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:24.224 17:33:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:24.224 17:33:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:24.224 17:33:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:24.224 17:33:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:24.224 17:33:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:24.224 17:33:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:21:24.224 17:33:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:21:24.224 17:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:21:24.224 17:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 2865158 00:21:24.224 17:33:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 2865158 ']' 00:21:24.224 17:33:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 2865158 00:21:24.224 17:33:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:21:24.224 17:33:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:24.224 17:33:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2865158 00:21:24.224 17:33:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:24.224 17:33:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:24.224 17:33:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2865158' 00:21:24.224 killing process with pid 2865158 00:21:24.224 17:33:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 2865158 00:21:24.224 Received shutdown signal, test time was about 12.310106 seconds 00:21:24.224 00:21:24.224 Latency(us) 00:21:24.224 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:24.224 =================================================================================================================== 00:21:24.224 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:21:24.224 [2024-07-15 17:33:35.331687] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:24.224 17:33:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 
2865158 00:21:24.224 [2024-07-15 17:33:35.343325] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:24.224 17:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:21:24.224 00:21:24.224 real 0m19.337s 00:21:24.224 user 0m30.867s 00:21:24.224 sys 0m2.220s 00:21:24.224 17:33:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:24.224 17:33:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:21:24.224 ************************************ 00:21:24.224 END TEST raid_rebuild_test_io 00:21:24.224 ************************************ 00:21:24.224 17:33:35 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:24.224 17:33:35 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:21:24.224 17:33:35 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:21:24.224 17:33:35 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:24.224 17:33:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:24.485 ************************************ 00:21:24.485 START TEST raid_rebuild_test_sb_io 00:21:24.485 ************************************ 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true true true 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:21:24.485 17:33:35 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2868461 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2868461 /var/tmp/spdk-raid.sock 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 2868461 ']' 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:24.485 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:24.485 17:33:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:24.485 [2024-07-15 17:33:35.605800] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:21:24.485 [2024-07-15 17:33:35.605846] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2868461 ] 00:21:24.485 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:24.485 Zero copy mechanism will not be used. 
00:21:24.485 [2024-07-15 17:33:35.694164] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:24.485 [2024-07-15 17:33:35.758357] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:24.746 [2024-07-15 17:33:35.797655] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:24.746 [2024-07-15 17:33:35.797679] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:25.315 17:33:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:25.315 17:33:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:21:25.315 17:33:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:25.315 17:33:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:25.575 BaseBdev1_malloc 00:21:25.575 17:33:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:25.575 [2024-07-15 17:33:36.811738] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:25.575 [2024-07-15 17:33:36.811774] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:25.575 [2024-07-15 17:33:36.811787] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25acd30 00:21:25.575 [2024-07-15 17:33:36.811793] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:25.575 [2024-07-15 17:33:36.813047] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:25.575 [2024-07-15 17:33:36.813066] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:25.575 BaseBdev1 00:21:25.575 17:33:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:25.575 17:33:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:25.835 BaseBdev2_malloc 00:21:25.835 17:33:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:21:26.095 [2024-07-15 17:33:37.194615] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:21:26.095 [2024-07-15 17:33:37.194640] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:26.095 [2024-07-15 17:33:37.194650] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x275fc60 00:21:26.095 [2024-07-15 17:33:37.194656] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:26.095 [2024-07-15 17:33:37.195807] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:26.095 [2024-07-15 17:33:37.195825] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:26.095 BaseBdev2 00:21:26.095 17:33:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b spare_malloc 00:21:26.095 spare_malloc 00:21:26.355 17:33:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:21:26.355 spare_delay 00:21:26.355 17:33:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:26.615 [2024-07-15 17:33:37.745605] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:26.615 [2024-07-15 17:33:37.745628] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:26.615 [2024-07-15 17:33:37.745638] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x274fec0 00:21:26.615 [2024-07-15 17:33:37.745644] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:26.615 [2024-07-15 17:33:37.746792] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:26.615 [2024-07-15 17:33:37.746810] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:26.615 spare 00:21:26.615 17:33:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:21:26.876 [2024-07-15 17:33:37.930091] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:26.876 [2024-07-15 17:33:37.931052] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:26.876 [2024-07-15 17:33:37.931167] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2747390 00:21:26.876 [2024-07-15 17:33:37.931175] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:26.876 [2024-07-15 17:33:37.931311] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25a37c0 00:21:26.876 [2024-07-15 17:33:37.931418] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2747390 00:21:26.876 [2024-07-15 17:33:37.931423] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2747390 00:21:26.876 [2024-07-15 17:33:37.931489] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:26.876 17:33:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:26.876 17:33:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:26.876 17:33:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:26.876 17:33:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:26.876 17:33:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:26.876 17:33:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:26.876 17:33:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:26.876 17:33:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:26.876 17:33:37 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:26.876 17:33:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:26.876 17:33:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:26.876 17:33:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:26.876 17:33:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:26.876 "name": "raid_bdev1", 00:21:26.876 "uuid": "cee4b8a1-5a26-4943-a324-17efc5f8f90c", 00:21:26.876 "strip_size_kb": 0, 00:21:26.876 "state": "online", 00:21:26.876 "raid_level": "raid1", 00:21:26.876 "superblock": true, 00:21:26.876 "num_base_bdevs": 2, 00:21:26.876 "num_base_bdevs_discovered": 2, 00:21:26.876 "num_base_bdevs_operational": 2, 00:21:26.876 "base_bdevs_list": [ 00:21:26.876 { 00:21:26.876 "name": "BaseBdev1", 00:21:26.876 "uuid": "b755761e-6009-56b2-8fc1-8d3f5b54ef31", 00:21:26.876 "is_configured": true, 00:21:26.876 "data_offset": 2048, 00:21:26.876 "data_size": 63488 00:21:26.876 }, 00:21:26.877 { 00:21:26.877 "name": "BaseBdev2", 00:21:26.877 "uuid": "173d195c-e551-5f06-8152-d831278c5c5e", 00:21:26.877 "is_configured": true, 00:21:26.877 "data_offset": 2048, 00:21:26.877 "data_size": 63488 00:21:26.877 } 00:21:26.877 ] 00:21:26.877 }' 00:21:26.877 17:33:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:26.877 17:33:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:27.448 17:33:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:27.448 17:33:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:21:27.708 [2024-07-15 17:33:38.876666] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:27.708 17:33:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:21:27.708 17:33:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:27.708 17:33:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:21:27.968 17:33:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:21:27.968 17:33:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:21:27.968 17:33:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:21:27.968 17:33:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:27.968 [2024-07-15 17:33:39.166627] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2746260 00:21:27.968 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:27.968 Zero copy mechanism will not be used. 00:21:27.968 Running I/O for 60 seconds... 
00:21:28.229 [2024-07-15 17:33:39.274801] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:28.229 [2024-07-15 17:33:39.281363] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x2746260 00:21:28.229 17:33:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:28.229 17:33:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:28.229 17:33:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:28.229 17:33:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:28.229 17:33:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:28.229 17:33:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:28.229 17:33:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:28.229 17:33:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:28.229 17:33:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:28.229 17:33:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:28.229 17:33:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:28.229 17:33:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:28.229 17:33:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:28.229 "name": "raid_bdev1", 00:21:28.229 "uuid": "cee4b8a1-5a26-4943-a324-17efc5f8f90c", 00:21:28.229 "strip_size_kb": 0, 00:21:28.229 "state": "online", 00:21:28.229 "raid_level": "raid1", 00:21:28.229 "superblock": true, 00:21:28.229 "num_base_bdevs": 2, 00:21:28.229 "num_base_bdevs_discovered": 1, 00:21:28.229 "num_base_bdevs_operational": 1, 00:21:28.229 "base_bdevs_list": [ 00:21:28.229 { 00:21:28.229 "name": null, 00:21:28.229 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:28.229 "is_configured": false, 00:21:28.229 "data_offset": 2048, 00:21:28.229 "data_size": 63488 00:21:28.229 }, 00:21:28.229 { 00:21:28.229 "name": "BaseBdev2", 00:21:28.229 "uuid": "173d195c-e551-5f06-8152-d831278c5c5e", 00:21:28.229 "is_configured": true, 00:21:28.229 "data_offset": 2048, 00:21:28.229 "data_size": 63488 00:21:28.229 } 00:21:28.229 ] 00:21:28.229 }' 00:21:28.229 17:33:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:28.229 17:33:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:28.798 17:33:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:29.059 [2024-07-15 17:33:40.252265] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:29.059 17:33:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:21:29.059 [2024-07-15 17:33:40.272355] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2642e20 00:21:29.059 [2024-07-15 17:33:40.273974] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev 
raid_bdev1 00:21:29.319 [2024-07-15 17:33:40.387583] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:29.319 [2024-07-15 17:33:40.387812] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:29.319 [2024-07-15 17:33:40.610715] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:29.319 [2024-07-15 17:33:40.610820] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:29.889 [2024-07-15 17:33:40.940776] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:21:29.889 [2024-07-15 17:33:41.162690] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:21:30.149 17:33:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:30.149 17:33:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:30.149 17:33:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:30.149 17:33:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:30.149 17:33:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:30.149 17:33:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:30.149 17:33:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:30.410 17:33:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:30.410 "name": "raid_bdev1", 00:21:30.410 "uuid": "cee4b8a1-5a26-4943-a324-17efc5f8f90c", 00:21:30.410 "strip_size_kb": 0, 00:21:30.410 "state": "online", 00:21:30.410 "raid_level": "raid1", 00:21:30.410 "superblock": true, 00:21:30.410 "num_base_bdevs": 2, 00:21:30.410 "num_base_bdevs_discovered": 2, 00:21:30.410 "num_base_bdevs_operational": 2, 00:21:30.410 "process": { 00:21:30.410 "type": "rebuild", 00:21:30.410 "target": "spare", 00:21:30.410 "progress": { 00:21:30.410 "blocks": 12288, 00:21:30.410 "percent": 19 00:21:30.410 } 00:21:30.410 }, 00:21:30.410 "base_bdevs_list": [ 00:21:30.410 { 00:21:30.410 "name": "spare", 00:21:30.410 "uuid": "1866c233-e6fd-5b9d-9f0f-0c94f6845b2a", 00:21:30.410 "is_configured": true, 00:21:30.410 "data_offset": 2048, 00:21:30.410 "data_size": 63488 00:21:30.410 }, 00:21:30.410 { 00:21:30.410 "name": "BaseBdev2", 00:21:30.410 "uuid": "173d195c-e551-5f06-8152-d831278c5c5e", 00:21:30.410 "is_configured": true, 00:21:30.410 "data_offset": 2048, 00:21:30.410 "data_size": 63488 00:21:30.410 } 00:21:30.410 ] 00:21:30.410 }' 00:21:30.410 17:33:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:30.410 17:33:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:30.410 17:33:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:30.410 17:33:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:30.410 17:33:41 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:30.410 [2024-07-15 17:33:41.654924] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:30.671 [2024-07-15 17:33:41.737284] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:30.671 [2024-07-15 17:33:41.793445] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:30.671 [2024-07-15 17:33:41.794835] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:30.671 [2024-07-15 17:33:41.794854] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:30.671 [2024-07-15 17:33:41.794860] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:30.671 [2024-07-15 17:33:41.811754] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x2746260 00:21:30.671 17:33:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:30.671 17:33:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:30.671 17:33:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:30.671 17:33:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:30.671 17:33:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:30.671 17:33:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:30.671 17:33:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:30.671 17:33:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:30.671 17:33:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:30.671 17:33:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:30.671 17:33:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:30.671 17:33:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:30.932 17:33:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:30.932 "name": "raid_bdev1", 00:21:30.932 "uuid": "cee4b8a1-5a26-4943-a324-17efc5f8f90c", 00:21:30.932 "strip_size_kb": 0, 00:21:30.932 "state": "online", 00:21:30.932 "raid_level": "raid1", 00:21:30.932 "superblock": true, 00:21:30.932 "num_base_bdevs": 2, 00:21:30.932 "num_base_bdevs_discovered": 1, 00:21:30.932 "num_base_bdevs_operational": 1, 00:21:30.932 "base_bdevs_list": [ 00:21:30.932 { 00:21:30.932 "name": null, 00:21:30.932 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:30.932 "is_configured": false, 00:21:30.932 "data_offset": 2048, 00:21:30.932 "data_size": 63488 00:21:30.932 }, 00:21:30.932 { 00:21:30.932 "name": "BaseBdev2", 00:21:30.932 "uuid": "173d195c-e551-5f06-8152-d831278c5c5e", 00:21:30.932 "is_configured": true, 00:21:30.932 "data_offset": 2048, 00:21:30.932 "data_size": 63488 00:21:30.932 } 00:21:30.932 ] 00:21:30.932 }' 
00:21:30.932 17:33:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:30.932 17:33:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:31.502 17:33:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:31.502 17:33:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:31.502 17:33:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:31.502 17:33:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:31.502 17:33:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:31.502 17:33:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:31.502 17:33:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:31.772 17:33:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:31.772 "name": "raid_bdev1", 00:21:31.772 "uuid": "cee4b8a1-5a26-4943-a324-17efc5f8f90c", 00:21:31.772 "strip_size_kb": 0, 00:21:31.772 "state": "online", 00:21:31.772 "raid_level": "raid1", 00:21:31.772 "superblock": true, 00:21:31.772 "num_base_bdevs": 2, 00:21:31.772 "num_base_bdevs_discovered": 1, 00:21:31.772 "num_base_bdevs_operational": 1, 00:21:31.772 "base_bdevs_list": [ 00:21:31.772 { 00:21:31.772 "name": null, 00:21:31.772 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:31.772 "is_configured": false, 00:21:31.772 "data_offset": 2048, 00:21:31.772 "data_size": 63488 00:21:31.772 }, 00:21:31.772 { 00:21:31.772 "name": "BaseBdev2", 00:21:31.772 "uuid": "173d195c-e551-5f06-8152-d831278c5c5e", 00:21:31.772 "is_configured": true, 00:21:31.772 "data_offset": 2048, 00:21:31.772 "data_size": 63488 00:21:31.772 } 00:21:31.772 ] 00:21:31.772 }' 00:21:31.772 17:33:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:31.772 17:33:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:31.772 17:33:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:31.772 17:33:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:31.772 17:33:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:32.094 [2024-07-15 17:33:43.075295] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:32.094 17:33:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:21:32.094 [2024-07-15 17:33:43.127809] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25bda70 00:21:32.095 [2024-07-15 17:33:43.128944] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:32.095 [2024-07-15 17:33:43.236944] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:32.095 [2024-07-15 17:33:43.237168] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:32.357 
[2024-07-15 17:33:43.438517] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:32.357 [2024-07-15 17:33:43.438618] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:32.617 [2024-07-15 17:33:43.675431] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:21:32.877 [2024-07-15 17:33:44.041268] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:21:32.877 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:32.877 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:32.877 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:32.877 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:32.877 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:32.877 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:32.877 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:32.877 [2024-07-15 17:33:44.162608] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:32.877 [2024-07-15 17:33:44.162730] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:33.137 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:33.137 "name": "raid_bdev1", 00:21:33.137 "uuid": "cee4b8a1-5a26-4943-a324-17efc5f8f90c", 00:21:33.137 "strip_size_kb": 0, 00:21:33.137 "state": "online", 00:21:33.137 "raid_level": "raid1", 00:21:33.137 "superblock": true, 00:21:33.137 "num_base_bdevs": 2, 00:21:33.137 "num_base_bdevs_discovered": 2, 00:21:33.137 "num_base_bdevs_operational": 2, 00:21:33.137 "process": { 00:21:33.137 "type": "rebuild", 00:21:33.137 "target": "spare", 00:21:33.137 "progress": { 00:21:33.137 "blocks": 16384, 00:21:33.137 "percent": 25 00:21:33.137 } 00:21:33.137 }, 00:21:33.137 "base_bdevs_list": [ 00:21:33.137 { 00:21:33.137 "name": "spare", 00:21:33.137 "uuid": "1866c233-e6fd-5b9d-9f0f-0c94f6845b2a", 00:21:33.137 "is_configured": true, 00:21:33.137 "data_offset": 2048, 00:21:33.137 "data_size": 63488 00:21:33.137 }, 00:21:33.137 { 00:21:33.137 "name": "BaseBdev2", 00:21:33.137 "uuid": "173d195c-e551-5f06-8152-d831278c5c5e", 00:21:33.137 "is_configured": true, 00:21:33.137 "data_offset": 2048, 00:21:33.137 "data_size": 63488 00:21:33.137 } 00:21:33.137 ] 00:21:33.137 }' 00:21:33.137 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:33.137 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:33.137 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:33.137 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:33.137 17:33:44 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:21:33.137 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:21:33.137 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:21:33.137 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:21:33.137 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:21:33.137 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:21:33.137 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=718 00:21:33.137 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:33.137 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:33.137 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:33.137 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:33.137 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:33.137 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:33.137 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:33.137 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:33.396 [2024-07-15 17:33:44.508103] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:21:33.396 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:33.396 "name": "raid_bdev1", 00:21:33.396 "uuid": "cee4b8a1-5a26-4943-a324-17efc5f8f90c", 00:21:33.396 "strip_size_kb": 0, 00:21:33.396 "state": "online", 00:21:33.396 "raid_level": "raid1", 00:21:33.396 "superblock": true, 00:21:33.396 "num_base_bdevs": 2, 00:21:33.396 "num_base_bdevs_discovered": 2, 00:21:33.396 "num_base_bdevs_operational": 2, 00:21:33.396 "process": { 00:21:33.396 "type": "rebuild", 00:21:33.396 "target": "spare", 00:21:33.396 "progress": { 00:21:33.396 "blocks": 20480, 00:21:33.396 "percent": 32 00:21:33.396 } 00:21:33.396 }, 00:21:33.396 "base_bdevs_list": [ 00:21:33.396 { 00:21:33.396 "name": "spare", 00:21:33.396 "uuid": "1866c233-e6fd-5b9d-9f0f-0c94f6845b2a", 00:21:33.396 "is_configured": true, 00:21:33.396 "data_offset": 2048, 00:21:33.396 "data_size": 63488 00:21:33.396 }, 00:21:33.396 { 00:21:33.396 "name": "BaseBdev2", 00:21:33.396 "uuid": "173d195c-e551-5f06-8152-d831278c5c5e", 00:21:33.396 "is_configured": true, 00:21:33.396 "data_offset": 2048, 00:21:33.396 "data_size": 63488 00:21:33.396 } 00:21:33.396 ] 00:21:33.396 }' 00:21:33.396 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:33.396 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:33.396 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:33.656 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ 
spare == \s\p\a\r\e ]] 00:21:33.656 17:33:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:33.656 [2024-07-15 17:33:44.743170] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:21:33.915 [2024-07-15 17:33:45.182909] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:21:34.175 [2024-07-15 17:33:45.391455] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:21:34.435 17:33:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:34.435 17:33:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:34.435 17:33:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:34.435 17:33:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:34.435 17:33:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:34.435 17:33:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:34.435 17:33:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:34.435 17:33:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:34.695 17:33:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:34.695 "name": "raid_bdev1", 00:21:34.695 "uuid": "cee4b8a1-5a26-4943-a324-17efc5f8f90c", 00:21:34.695 "strip_size_kb": 0, 00:21:34.695 "state": "online", 00:21:34.695 "raid_level": "raid1", 00:21:34.695 "superblock": true, 00:21:34.695 "num_base_bdevs": 2, 00:21:34.695 "num_base_bdevs_discovered": 2, 00:21:34.695 "num_base_bdevs_operational": 2, 00:21:34.695 "process": { 00:21:34.695 "type": "rebuild", 00:21:34.695 "target": "spare", 00:21:34.695 "progress": { 00:21:34.695 "blocks": 40960, 00:21:34.695 "percent": 64 00:21:34.695 } 00:21:34.695 }, 00:21:34.695 "base_bdevs_list": [ 00:21:34.695 { 00:21:34.695 "name": "spare", 00:21:34.695 "uuid": "1866c233-e6fd-5b9d-9f0f-0c94f6845b2a", 00:21:34.695 "is_configured": true, 00:21:34.695 "data_offset": 2048, 00:21:34.695 "data_size": 63488 00:21:34.695 }, 00:21:34.695 { 00:21:34.695 "name": "BaseBdev2", 00:21:34.695 "uuid": "173d195c-e551-5f06-8152-d831278c5c5e", 00:21:34.695 "is_configured": true, 00:21:34.695 "data_offset": 2048, 00:21:34.695 "data_size": 63488 00:21:34.695 } 00:21:34.695 ] 00:21:34.695 }' 00:21:34.695 17:33:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:34.695 17:33:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:34.695 17:33:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:34.695 17:33:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:34.695 17:33:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:35.264 [2024-07-15 17:33:46.484970] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:21:35.833 
17:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:35.833 17:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:35.833 17:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:35.833 17:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:35.833 17:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:35.833 17:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:35.833 17:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:35.833 17:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:35.833 [2024-07-15 17:33:47.031783] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:21:36.092 [2024-07-15 17:33:47.132063] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:21:36.092 [2024-07-15 17:33:47.133256] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:36.092 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:36.092 "name": "raid_bdev1", 00:21:36.092 "uuid": "cee4b8a1-5a26-4943-a324-17efc5f8f90c", 00:21:36.092 "strip_size_kb": 0, 00:21:36.092 "state": "online", 00:21:36.092 "raid_level": "raid1", 00:21:36.092 "superblock": true, 00:21:36.092 "num_base_bdevs": 2, 00:21:36.092 "num_base_bdevs_discovered": 2, 00:21:36.092 "num_base_bdevs_operational": 2, 00:21:36.092 "base_bdevs_list": [ 00:21:36.092 { 00:21:36.092 "name": "spare", 00:21:36.092 "uuid": "1866c233-e6fd-5b9d-9f0f-0c94f6845b2a", 00:21:36.092 "is_configured": true, 00:21:36.092 "data_offset": 2048, 00:21:36.092 "data_size": 63488 00:21:36.092 }, 00:21:36.092 { 00:21:36.092 "name": "BaseBdev2", 00:21:36.092 "uuid": "173d195c-e551-5f06-8152-d831278c5c5e", 00:21:36.092 "is_configured": true, 00:21:36.092 "data_offset": 2048, 00:21:36.092 "data_size": 63488 00:21:36.092 } 00:21:36.092 ] 00:21:36.092 }' 00:21:36.092 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:36.092 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:21:36.092 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:36.092 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:21:36.092 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:21:36.092 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:36.092 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:36.092 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:36.092 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:36.092 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:36.092 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:36.092 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:36.352 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:36.352 "name": "raid_bdev1", 00:21:36.352 "uuid": "cee4b8a1-5a26-4943-a324-17efc5f8f90c", 00:21:36.352 "strip_size_kb": 0, 00:21:36.352 "state": "online", 00:21:36.352 "raid_level": "raid1", 00:21:36.352 "superblock": true, 00:21:36.352 "num_base_bdevs": 2, 00:21:36.352 "num_base_bdevs_discovered": 2, 00:21:36.352 "num_base_bdevs_operational": 2, 00:21:36.352 "base_bdevs_list": [ 00:21:36.352 { 00:21:36.352 "name": "spare", 00:21:36.352 "uuid": "1866c233-e6fd-5b9d-9f0f-0c94f6845b2a", 00:21:36.352 "is_configured": true, 00:21:36.352 "data_offset": 2048, 00:21:36.352 "data_size": 63488 00:21:36.352 }, 00:21:36.352 { 00:21:36.352 "name": "BaseBdev2", 00:21:36.352 "uuid": "173d195c-e551-5f06-8152-d831278c5c5e", 00:21:36.352 "is_configured": true, 00:21:36.352 "data_offset": 2048, 00:21:36.352 "data_size": 63488 00:21:36.352 } 00:21:36.352 ] 00:21:36.352 }' 00:21:36.352 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:36.352 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:36.352 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:36.352 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:36.352 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:36.352 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:36.352 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:36.352 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:36.352 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:36.352 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:36.352 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:36.352 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:36.352 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:36.352 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:36.352 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:36.352 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:36.612 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:36.612 "name": "raid_bdev1", 00:21:36.612 "uuid": "cee4b8a1-5a26-4943-a324-17efc5f8f90c", 00:21:36.612 "strip_size_kb": 0, 00:21:36.612 "state": "online", 00:21:36.612 "raid_level": "raid1", 00:21:36.612 "superblock": true, 00:21:36.612 "num_base_bdevs": 2, 00:21:36.612 
"num_base_bdevs_discovered": 2, 00:21:36.612 "num_base_bdevs_operational": 2, 00:21:36.612 "base_bdevs_list": [ 00:21:36.612 { 00:21:36.612 "name": "spare", 00:21:36.612 "uuid": "1866c233-e6fd-5b9d-9f0f-0c94f6845b2a", 00:21:36.612 "is_configured": true, 00:21:36.612 "data_offset": 2048, 00:21:36.612 "data_size": 63488 00:21:36.612 }, 00:21:36.612 { 00:21:36.612 "name": "BaseBdev2", 00:21:36.612 "uuid": "173d195c-e551-5f06-8152-d831278c5c5e", 00:21:36.612 "is_configured": true, 00:21:36.612 "data_offset": 2048, 00:21:36.612 "data_size": 63488 00:21:36.612 } 00:21:36.612 ] 00:21:36.612 }' 00:21:36.612 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:36.612 17:33:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:37.181 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:37.181 [2024-07-15 17:33:48.445619] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:37.181 [2024-07-15 17:33:48.445643] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:37.441 00:21:37.441 Latency(us) 00:21:37.441 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:37.441 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:21:37.441 raid_bdev1 : 9.35 118.85 356.56 0.00 0.00 10907.45 247.34 114536.76 00:21:37.441 =================================================================================================================== 00:21:37.441 Total : 118.85 356.56 0.00 0.00 10907.45 247.34 114536.76 00:21:37.441 [2024-07-15 17:33:48.541060] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:37.441 [2024-07-15 17:33:48.541084] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:37.441 [2024-07-15 17:33:48.541141] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:37.441 [2024-07-15 17:33:48.541147] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2747390 name raid_bdev1, state offline 00:21:37.441 0 00:21:37.441 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:37.441 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@11 -- # local nbd_list 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:21:37.701 /dev/nbd0 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:37.701 1+0 records in 00:21:37.701 1+0 records out 00:21:37.701 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274877 s, 14.9 MB/s 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:21:37.701 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:21:37.702 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:37.702 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:21:37.702 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:37.702 
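Both helpers exercised repeatedly above, verify_raid_bdev_process and verify_raid_bdev_state, reduce to the same query: fetch every raid bdev over the test's RPC socket, narrow the output to raid_bdev1 with jq, and compare individual fields against the expected values. A minimal stand-alone sketch of that query (not itself part of the trace), assuming the rpc.py path and socket from this run are still valid:

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    # Dump every raid bdev over the test socket and keep only raid_bdev1,
    # as bdev_raid.sh@187 and @126 do in the trace above.
    info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    # Rebuild bookkeeping lives under .process; "none" means no rebuild is running.
    echo "$info" | jq -r '.process.type // "none"'
    echo "$info" | jq -r '.process.target // "none"'
    # Top-level fields checked by verify_raid_bdev_state.
    echo "$info" | jq -r '.state, .raid_level, .num_base_bdevs_discovered'
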
17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:21:37.702 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:37.702 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:21:37.702 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:37.702 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:37.702 17:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:21:37.961 /dev/nbd1 00:21:37.961 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:37.961 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:37.961 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:21:37.961 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:21:37.961 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:37.961 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:37.961 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:21:37.961 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:21:37.961 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:37.961 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:37.961 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:37.961 1+0 records in 00:21:37.961 1+0 records out 00:21:37.961 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000273783 s, 15.0 MB/s 00:21:37.961 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:37.961 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:21:37.961 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:37.961 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:37.961 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:21:37.961 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:37.961 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:37.961 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:21:38.221 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:21:38.221 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:38.221 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:21:38.221 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@50 -- # local nbd_list 00:21:38.221 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:21:38.221 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:38.221 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:38.221 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:38.221 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:38.221 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:38.221 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:38.221 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:38.221 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:38.221 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:21:38.221 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:21:38.221 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:38.221 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:38.221 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:38.221 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:38.221 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:21:38.221 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:38.221 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:38.481 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:38.481 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:38.481 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:38.481 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:38.481 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:38.481 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:38.481 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:21:38.481 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:21:38.481 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:21:38.481 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:38.743 17:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:39.004 [2024-07-15 17:33:50.073266] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:39.004 [2024-07-15 17:33:50.073307] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:39.004 [2024-07-15 17:33:50.073321] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25a8d00 00:21:39.004 [2024-07-15 17:33:50.073328] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:39.004 [2024-07-15 17:33:50.074644] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:39.004 [2024-07-15 17:33:50.074679] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:39.004 [2024-07-15 17:33:50.074748] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:21:39.004 [2024-07-15 17:33:50.074770] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:39.004 [2024-07-15 17:33:50.074849] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:39.004 spare 00:21:39.004 17:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:39.004 17:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:39.004 17:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:39.004 17:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:39.004 17:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:39.004 17:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:39.004 17:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:39.004 17:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:39.004 17:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:39.004 17:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:39.004 17:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:39.004 17:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:39.004 [2024-07-15 17:33:50.175138] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25a6c00 00:21:39.004 [2024-07-15 17:33:50.175147] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:39.004 [2024-07-15 17:33:50.175296] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2745db0 00:21:39.004 [2024-07-15 17:33:50.175406] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25a6c00 00:21:39.004 [2024-07-15 17:33:50.175411] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25a6c00 00:21:39.004 [2024-07-15 17:33:50.175487] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:39.004 17:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:39.004 "name": "raid_bdev1", 00:21:39.004 "uuid": "cee4b8a1-5a26-4943-a324-17efc5f8f90c", 00:21:39.004 "strip_size_kb": 0, 00:21:39.004 "state": 
"online", 00:21:39.004 "raid_level": "raid1", 00:21:39.004 "superblock": true, 00:21:39.004 "num_base_bdevs": 2, 00:21:39.004 "num_base_bdevs_discovered": 2, 00:21:39.004 "num_base_bdevs_operational": 2, 00:21:39.004 "base_bdevs_list": [ 00:21:39.004 { 00:21:39.004 "name": "spare", 00:21:39.004 "uuid": "1866c233-e6fd-5b9d-9f0f-0c94f6845b2a", 00:21:39.004 "is_configured": true, 00:21:39.004 "data_offset": 2048, 00:21:39.004 "data_size": 63488 00:21:39.004 }, 00:21:39.004 { 00:21:39.004 "name": "BaseBdev2", 00:21:39.004 "uuid": "173d195c-e551-5f06-8152-d831278c5c5e", 00:21:39.004 "is_configured": true, 00:21:39.004 "data_offset": 2048, 00:21:39.004 "data_size": 63488 00:21:39.004 } 00:21:39.004 ] 00:21:39.004 }' 00:21:39.004 17:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:39.004 17:33:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:39.574 17:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:39.574 17:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:39.574 17:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:39.574 17:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:39.574 17:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:39.574 17:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:39.574 17:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:39.834 17:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:39.834 "name": "raid_bdev1", 00:21:39.834 "uuid": "cee4b8a1-5a26-4943-a324-17efc5f8f90c", 00:21:39.834 "strip_size_kb": 0, 00:21:39.834 "state": "online", 00:21:39.834 "raid_level": "raid1", 00:21:39.834 "superblock": true, 00:21:39.834 "num_base_bdevs": 2, 00:21:39.834 "num_base_bdevs_discovered": 2, 00:21:39.834 "num_base_bdevs_operational": 2, 00:21:39.834 "base_bdevs_list": [ 00:21:39.834 { 00:21:39.834 "name": "spare", 00:21:39.834 "uuid": "1866c233-e6fd-5b9d-9f0f-0c94f6845b2a", 00:21:39.834 "is_configured": true, 00:21:39.834 "data_offset": 2048, 00:21:39.834 "data_size": 63488 00:21:39.834 }, 00:21:39.834 { 00:21:39.834 "name": "BaseBdev2", 00:21:39.834 "uuid": "173d195c-e551-5f06-8152-d831278c5c5e", 00:21:39.834 "is_configured": true, 00:21:39.834 "data_offset": 2048, 00:21:39.834 "data_size": 63488 00:21:39.834 } 00:21:39.834 ] 00:21:39.834 }' 00:21:39.834 17:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:39.834 17:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:39.834 17:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:39.834 17:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:39.834 17:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:39.834 17:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r 
'.[].base_bdevs_list[0].name' 00:21:40.095 17:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:21:40.095 17:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:40.355 [2024-07-15 17:33:51.481066] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:40.355 17:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:40.355 17:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:40.355 17:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:40.355 17:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:40.355 17:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:40.355 17:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:40.355 17:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:40.355 17:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:40.355 17:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:40.355 17:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:40.355 17:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:40.355 17:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:40.615 17:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:40.615 "name": "raid_bdev1", 00:21:40.615 "uuid": "cee4b8a1-5a26-4943-a324-17efc5f8f90c", 00:21:40.615 "strip_size_kb": 0, 00:21:40.615 "state": "online", 00:21:40.615 "raid_level": "raid1", 00:21:40.615 "superblock": true, 00:21:40.615 "num_base_bdevs": 2, 00:21:40.615 "num_base_bdevs_discovered": 1, 00:21:40.615 "num_base_bdevs_operational": 1, 00:21:40.615 "base_bdevs_list": [ 00:21:40.615 { 00:21:40.615 "name": null, 00:21:40.615 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:40.615 "is_configured": false, 00:21:40.615 "data_offset": 2048, 00:21:40.615 "data_size": 63488 00:21:40.615 }, 00:21:40.615 { 00:21:40.615 "name": "BaseBdev2", 00:21:40.615 "uuid": "173d195c-e551-5f06-8152-d831278c5c5e", 00:21:40.615 "is_configured": true, 00:21:40.615 "data_offset": 2048, 00:21:40.615 "data_size": 63488 00:21:40.615 } 00:21:40.615 ] 00:21:40.615 }' 00:21:40.615 17:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:40.615 17:33:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:41.185 17:33:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:41.185 [2024-07-15 17:33:52.391489] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:41.185 [2024-07-15 17:33:52.391600] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev 
spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:21:41.185 [2024-07-15 17:33:52.391609] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:21:41.185 [2024-07-15 17:33:52.391631] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:41.185 [2024-07-15 17:33:52.395272] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2750150 00:21:41.185 [2024-07-15 17:33:52.396845] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:41.185 17:33:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:21:42.125 17:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:42.126 17:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:42.126 17:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:42.126 17:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:42.126 17:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:42.126 17:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.126 17:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:42.387 17:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:42.387 "name": "raid_bdev1", 00:21:42.387 "uuid": "cee4b8a1-5a26-4943-a324-17efc5f8f90c", 00:21:42.387 "strip_size_kb": 0, 00:21:42.387 "state": "online", 00:21:42.387 "raid_level": "raid1", 00:21:42.387 "superblock": true, 00:21:42.387 "num_base_bdevs": 2, 00:21:42.387 "num_base_bdevs_discovered": 2, 00:21:42.387 "num_base_bdevs_operational": 2, 00:21:42.387 "process": { 00:21:42.387 "type": "rebuild", 00:21:42.387 "target": "spare", 00:21:42.387 "progress": { 00:21:42.387 "blocks": 22528, 00:21:42.387 "percent": 35 00:21:42.387 } 00:21:42.387 }, 00:21:42.387 "base_bdevs_list": [ 00:21:42.387 { 00:21:42.387 "name": "spare", 00:21:42.387 "uuid": "1866c233-e6fd-5b9d-9f0f-0c94f6845b2a", 00:21:42.387 "is_configured": true, 00:21:42.387 "data_offset": 2048, 00:21:42.387 "data_size": 63488 00:21:42.387 }, 00:21:42.387 { 00:21:42.387 "name": "BaseBdev2", 00:21:42.387 "uuid": "173d195c-e551-5f06-8152-d831278c5c5e", 00:21:42.387 "is_configured": true, 00:21:42.387 "data_offset": 2048, 00:21:42.387 "data_size": 63488 00:21:42.387 } 00:21:42.387 ] 00:21:42.387 }' 00:21:42.387 17:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:42.387 17:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:42.387 17:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:42.648 17:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:42.648 17:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:42.648 [2024-07-15 17:33:53.889736] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:42.648 
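The sequence traced around this point (bdev_raid.sh@752 through @759) is the core of the rebuild exercise: the spare member is pulled out of raid_bdev1, re-added to start a rebuild, and then its passthru bdev is deleted mid-rebuild to hit the target-removed error path reported just below. A condensed sketch of that cycle, using only RPCs that appear in this trace and the socket path from this run:

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    "$rpc" -s "$sock" bdev_raid_remove_base_bdev spare           # raid_bdev1 degrades to 1 of 2 members
    "$rpc" -s "$sock" bdev_raid_add_base_bdev raid_bdev1 spare   # re-add the member; a rebuild onto "spare" begins
    sleep 1                                                      # let the rebuild make some progress
    "$rpc" -s "$sock" bdev_passthru_delete spare                 # remove the rebuild target mid-rebuild;
                                                                 # expect the "Failed to remove target bdev" error seen below
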
[2024-07-15 17:33:53.905746] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:42.648 [2024-07-15 17:33:53.905778] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:42.648 [2024-07-15 17:33:53.905788] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:42.648 [2024-07-15 17:33:53.905792] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:42.648 17:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:42.648 17:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:42.648 17:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:42.648 17:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:42.648 17:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:42.648 17:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:42.648 17:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:42.648 17:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:42.648 17:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:42.648 17:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:42.648 17:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.648 17:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:42.907 17:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:42.907 "name": "raid_bdev1", 00:21:42.907 "uuid": "cee4b8a1-5a26-4943-a324-17efc5f8f90c", 00:21:42.907 "strip_size_kb": 0, 00:21:42.907 "state": "online", 00:21:42.907 "raid_level": "raid1", 00:21:42.907 "superblock": true, 00:21:42.907 "num_base_bdevs": 2, 00:21:42.907 "num_base_bdevs_discovered": 1, 00:21:42.907 "num_base_bdevs_operational": 1, 00:21:42.907 "base_bdevs_list": [ 00:21:42.907 { 00:21:42.907 "name": null, 00:21:42.907 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:42.907 "is_configured": false, 00:21:42.907 "data_offset": 2048, 00:21:42.907 "data_size": 63488 00:21:42.907 }, 00:21:42.907 { 00:21:42.907 "name": "BaseBdev2", 00:21:42.907 "uuid": "173d195c-e551-5f06-8152-d831278c5c5e", 00:21:42.907 "is_configured": true, 00:21:42.907 "data_offset": 2048, 00:21:42.907 "data_size": 63488 00:21:42.907 } 00:21:42.907 ] 00:21:42.907 }' 00:21:42.907 17:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:42.907 17:33:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:43.478 17:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:43.738 [2024-07-15 17:33:54.808145] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:43.738 [2024-07-15 17:33:54.808182] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:43.738 [2024-07-15 17:33:54.808196] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2643af0 00:21:43.738 [2024-07-15 17:33:54.808202] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:43.738 [2024-07-15 17:33:54.808508] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:43.738 [2024-07-15 17:33:54.808520] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:43.738 [2024-07-15 17:33:54.808581] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:21:43.738 [2024-07-15 17:33:54.808588] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:21:43.738 [2024-07-15 17:33:54.808594] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:21:43.738 [2024-07-15 17:33:54.808606] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:43.738 [2024-07-15 17:33:54.812221] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2642950 00:21:43.738 spare 00:21:43.738 [2024-07-15 17:33:54.813359] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:43.738 17:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:21:44.678 17:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:44.678 17:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:44.678 17:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:44.678 17:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:44.678 17:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:44.678 17:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:44.678 17:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:44.938 17:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:44.938 "name": "raid_bdev1", 00:21:44.938 "uuid": "cee4b8a1-5a26-4943-a324-17efc5f8f90c", 00:21:44.938 "strip_size_kb": 0, 00:21:44.938 "state": "online", 00:21:44.938 "raid_level": "raid1", 00:21:44.938 "superblock": true, 00:21:44.938 "num_base_bdevs": 2, 00:21:44.938 "num_base_bdevs_discovered": 2, 00:21:44.938 "num_base_bdevs_operational": 2, 00:21:44.938 "process": { 00:21:44.938 "type": "rebuild", 00:21:44.938 "target": "spare", 00:21:44.938 "progress": { 00:21:44.938 "blocks": 22528, 00:21:44.938 "percent": 35 00:21:44.938 } 00:21:44.938 }, 00:21:44.938 "base_bdevs_list": [ 00:21:44.938 { 00:21:44.938 "name": "spare", 00:21:44.938 "uuid": "1866c233-e6fd-5b9d-9f0f-0c94f6845b2a", 00:21:44.938 "is_configured": true, 00:21:44.938 "data_offset": 2048, 00:21:44.938 "data_size": 63488 00:21:44.938 }, 00:21:44.938 { 00:21:44.938 "name": "BaseBdev2", 00:21:44.938 "uuid": "173d195c-e551-5f06-8152-d831278c5c5e", 00:21:44.938 "is_configured": true, 00:21:44.938 "data_offset": 2048, 00:21:44.938 "data_size": 63488 00:21:44.938 } 
00:21:44.938 ] 00:21:44.938 }' 00:21:44.938 17:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:44.938 17:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:44.938 17:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:44.938 17:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:44.938 17:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:45.198 [2024-07-15 17:33:56.302277] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:45.198 [2024-07-15 17:33:56.322294] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:45.198 [2024-07-15 17:33:56.322324] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:45.198 [2024-07-15 17:33:56.322333] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:45.198 [2024-07-15 17:33:56.322338] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:45.198 17:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:45.198 17:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:45.198 17:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:45.198 17:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:45.198 17:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:45.198 17:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:45.198 17:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:45.198 17:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:45.198 17:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:45.198 17:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:45.199 17:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:45.199 17:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:45.459 17:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:45.459 "name": "raid_bdev1", 00:21:45.459 "uuid": "cee4b8a1-5a26-4943-a324-17efc5f8f90c", 00:21:45.459 "strip_size_kb": 0, 00:21:45.459 "state": "online", 00:21:45.459 "raid_level": "raid1", 00:21:45.459 "superblock": true, 00:21:45.459 "num_base_bdevs": 2, 00:21:45.459 "num_base_bdevs_discovered": 1, 00:21:45.459 "num_base_bdevs_operational": 1, 00:21:45.459 "base_bdevs_list": [ 00:21:45.459 { 00:21:45.459 "name": null, 00:21:45.459 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:45.459 "is_configured": false, 00:21:45.459 "data_offset": 2048, 00:21:45.459 "data_size": 63488 00:21:45.459 }, 00:21:45.459 { 00:21:45.459 "name": 
"BaseBdev2", 00:21:45.459 "uuid": "173d195c-e551-5f06-8152-d831278c5c5e", 00:21:45.459 "is_configured": true, 00:21:45.459 "data_offset": 2048, 00:21:45.459 "data_size": 63488 00:21:45.459 } 00:21:45.459 ] 00:21:45.459 }' 00:21:45.459 17:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:45.459 17:33:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:46.028 17:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:46.028 17:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:46.028 17:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:46.028 17:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:46.028 17:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:46.028 17:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:46.028 17:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:46.028 17:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:46.028 "name": "raid_bdev1", 00:21:46.028 "uuid": "cee4b8a1-5a26-4943-a324-17efc5f8f90c", 00:21:46.028 "strip_size_kb": 0, 00:21:46.028 "state": "online", 00:21:46.028 "raid_level": "raid1", 00:21:46.028 "superblock": true, 00:21:46.028 "num_base_bdevs": 2, 00:21:46.028 "num_base_bdevs_discovered": 1, 00:21:46.028 "num_base_bdevs_operational": 1, 00:21:46.028 "base_bdevs_list": [ 00:21:46.028 { 00:21:46.028 "name": null, 00:21:46.028 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:46.028 "is_configured": false, 00:21:46.028 "data_offset": 2048, 00:21:46.028 "data_size": 63488 00:21:46.028 }, 00:21:46.028 { 00:21:46.028 "name": "BaseBdev2", 00:21:46.028 "uuid": "173d195c-e551-5f06-8152-d831278c5c5e", 00:21:46.028 "is_configured": true, 00:21:46.028 "data_offset": 2048, 00:21:46.028 "data_size": 63488 00:21:46.028 } 00:21:46.028 ] 00:21:46.028 }' 00:21:46.028 17:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:46.028 17:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:46.028 17:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:46.290 17:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:46.290 17:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:21:46.290 17:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:46.558 [2024-07-15 17:33:57.725992] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:46.558 [2024-07-15 17:33:57.726025] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:46.558 [2024-07-15 17:33:57.726038] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 
0x0x25acf60 00:21:46.558 [2024-07-15 17:33:57.726044] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:46.558 [2024-07-15 17:33:57.726328] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:46.558 [2024-07-15 17:33:57.726341] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:46.558 [2024-07-15 17:33:57.726388] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:21:46.558 [2024-07-15 17:33:57.726396] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:21:46.558 [2024-07-15 17:33:57.726402] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:21:46.558 BaseBdev1 00:21:46.558 17:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:21:47.498 17:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:47.498 17:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:47.498 17:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:47.498 17:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:47.498 17:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:47.498 17:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:47.498 17:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:47.498 17:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:47.498 17:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:47.498 17:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:47.498 17:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.498 17:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:47.758 17:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:47.758 "name": "raid_bdev1", 00:21:47.758 "uuid": "cee4b8a1-5a26-4943-a324-17efc5f8f90c", 00:21:47.758 "strip_size_kb": 0, 00:21:47.758 "state": "online", 00:21:47.758 "raid_level": "raid1", 00:21:47.758 "superblock": true, 00:21:47.758 "num_base_bdevs": 2, 00:21:47.758 "num_base_bdevs_discovered": 1, 00:21:47.758 "num_base_bdevs_operational": 1, 00:21:47.758 "base_bdevs_list": [ 00:21:47.758 { 00:21:47.758 "name": null, 00:21:47.758 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:47.758 "is_configured": false, 00:21:47.758 "data_offset": 2048, 00:21:47.758 "data_size": 63488 00:21:47.758 }, 00:21:47.758 { 00:21:47.758 "name": "BaseBdev2", 00:21:47.758 "uuid": "173d195c-e551-5f06-8152-d831278c5c5e", 00:21:47.758 "is_configured": true, 00:21:47.758 "data_offset": 2048, 00:21:47.758 "data_size": 63488 00:21:47.758 } 00:21:47.758 ] 00:21:47.758 }' 00:21:47.758 17:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:47.758 17:33:58 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@10 -- # set +x 00:21:48.326 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:48.326 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:48.326 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:48.326 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:48.326 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:48.326 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:48.326 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:48.585 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:48.585 "name": "raid_bdev1", 00:21:48.585 "uuid": "cee4b8a1-5a26-4943-a324-17efc5f8f90c", 00:21:48.585 "strip_size_kb": 0, 00:21:48.585 "state": "online", 00:21:48.585 "raid_level": "raid1", 00:21:48.585 "superblock": true, 00:21:48.585 "num_base_bdevs": 2, 00:21:48.585 "num_base_bdevs_discovered": 1, 00:21:48.585 "num_base_bdevs_operational": 1, 00:21:48.585 "base_bdevs_list": [ 00:21:48.585 { 00:21:48.585 "name": null, 00:21:48.585 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:48.585 "is_configured": false, 00:21:48.585 "data_offset": 2048, 00:21:48.585 "data_size": 63488 00:21:48.585 }, 00:21:48.585 { 00:21:48.585 "name": "BaseBdev2", 00:21:48.585 "uuid": "173d195c-e551-5f06-8152-d831278c5c5e", 00:21:48.585 "is_configured": true, 00:21:48.585 "data_offset": 2048, 00:21:48.585 "data_size": 63488 00:21:48.585 } 00:21:48.585 ] 00:21:48.585 }' 00:21:48.585 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:48.585 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:48.585 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:48.585 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:48.585 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:48.585 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:21:48.585 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:48.585 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:48.585 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:48.585 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:48.585 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:48.585 17:33:59 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:48.585 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:48.585 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:48.585 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:48.585 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:48.844 [2024-07-15 17:33:59.935819] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:48.844 [2024-07-15 17:33:59.935909] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:21:48.844 [2024-07-15 17:33:59.935918] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:21:48.844 request: 00:21:48.844 { 00:21:48.844 "base_bdev": "BaseBdev1", 00:21:48.844 "raid_bdev": "raid_bdev1", 00:21:48.844 "method": "bdev_raid_add_base_bdev", 00:21:48.844 "req_id": 1 00:21:48.844 } 00:21:48.844 Got JSON-RPC error response 00:21:48.844 response: 00:21:48.845 { 00:21:48.845 "code": -22, 00:21:48.845 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:21:48.845 } 00:21:48.845 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:21:48.845 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:21:48.845 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:21:48.845 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:21:48.845 17:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:21:49.784 17:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:49.784 17:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:49.784 17:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:49.784 17:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:49.784 17:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:49.784 17:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:49.784 17:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:49.784 17:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:49.784 17:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:49.784 17:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:49.784 17:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:49.784 17:34:00 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:50.044 17:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:50.044 "name": "raid_bdev1", 00:21:50.044 "uuid": "cee4b8a1-5a26-4943-a324-17efc5f8f90c", 00:21:50.044 "strip_size_kb": 0, 00:21:50.044 "state": "online", 00:21:50.044 "raid_level": "raid1", 00:21:50.044 "superblock": true, 00:21:50.044 "num_base_bdevs": 2, 00:21:50.044 "num_base_bdevs_discovered": 1, 00:21:50.044 "num_base_bdevs_operational": 1, 00:21:50.044 "base_bdevs_list": [ 00:21:50.044 { 00:21:50.044 "name": null, 00:21:50.044 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:50.044 "is_configured": false, 00:21:50.044 "data_offset": 2048, 00:21:50.044 "data_size": 63488 00:21:50.044 }, 00:21:50.044 { 00:21:50.044 "name": "BaseBdev2", 00:21:50.044 "uuid": "173d195c-e551-5f06-8152-d831278c5c5e", 00:21:50.044 "is_configured": true, 00:21:50.044 "data_offset": 2048, 00:21:50.044 "data_size": 63488 00:21:50.044 } 00:21:50.044 ] 00:21:50.044 }' 00:21:50.044 17:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:50.044 17:34:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:50.636 17:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:50.636 17:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:50.636 17:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:50.636 17:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:50.636 17:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:50.636 17:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:50.636 17:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:50.636 17:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:50.636 "name": "raid_bdev1", 00:21:50.636 "uuid": "cee4b8a1-5a26-4943-a324-17efc5f8f90c", 00:21:50.636 "strip_size_kb": 0, 00:21:50.636 "state": "online", 00:21:50.636 "raid_level": "raid1", 00:21:50.636 "superblock": true, 00:21:50.636 "num_base_bdevs": 2, 00:21:50.636 "num_base_bdevs_discovered": 1, 00:21:50.636 "num_base_bdevs_operational": 1, 00:21:50.636 "base_bdevs_list": [ 00:21:50.636 { 00:21:50.636 "name": null, 00:21:50.636 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:50.636 "is_configured": false, 00:21:50.636 "data_offset": 2048, 00:21:50.636 "data_size": 63488 00:21:50.636 }, 00:21:50.636 { 00:21:50.636 "name": "BaseBdev2", 00:21:50.636 "uuid": "173d195c-e551-5f06-8152-d831278c5c5e", 00:21:50.636 "is_configured": true, 00:21:50.636 "data_offset": 2048, 00:21:50.636 "data_size": 63488 00:21:50.636 } 00:21:50.636 ] 00:21:50.636 }' 00:21:50.636 17:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:50.896 17:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:50.896 17:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:50.896 17:34:01 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:50.896 17:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 2868461 00:21:50.896 17:34:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 2868461 ']' 00:21:50.896 17:34:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 2868461 00:21:50.896 17:34:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:21:50.896 17:34:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:50.896 17:34:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2868461 00:21:50.896 17:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:50.896 17:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:50.896 17:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2868461' 00:21:50.896 killing process with pid 2868461 00:21:50.896 17:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 2868461 00:21:50.896 Received shutdown signal, test time was about 22.822031 seconds 00:21:50.896 00:21:50.896 Latency(us) 00:21:50.896 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:50.896 =================================================================================================================== 00:21:50.896 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:21:50.896 [2024-07-15 17:34:02.044037] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:50.896 [2024-07-15 17:34:02.044112] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:50.896 [2024-07-15 17:34:02.044144] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:50.896 [2024-07-15 17:34:02.044150] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25a6c00 name raid_bdev1, state offline 00:21:50.896 17:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 2868461 00:21:50.896 [2024-07-15 17:34:02.055924] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:50.896 17:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:21:50.896 00:21:50.896 real 0m26.635s 00:21:50.896 user 0m42.189s 00:21:50.896 sys 0m3.075s 00:21:50.896 17:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:50.896 17:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:50.896 ************************************ 00:21:50.896 END TEST raid_rebuild_test_sb_io 00:21:50.896 ************************************ 00:21:51.156 17:34:02 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:51.156 17:34:02 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:21:51.156 17:34:02 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:21:51.157 17:34:02 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:21:51.157 17:34:02 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:51.157 17:34:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:51.157 ************************************ 00:21:51.157 START TEST 
raid_rebuild_test 00:21:51.157 ************************************ 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false false true 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=2873319 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 2873319 /var/tmp/spdk-raid.sock 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 2873319 ']' 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:51.157 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:51.157 17:34:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:51.157 [2024-07-15 17:34:02.330316] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:21:51.157 [2024-07-15 17:34:02.330372] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2873319 ] 00:21:51.157 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:51.157 Zero copy mechanism will not be used. 00:21:51.157 [2024-07-15 17:34:02.421279] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:51.417 [2024-07-15 17:34:02.488631] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:51.417 [2024-07-15 17:34:02.537906] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:51.417 [2024-07-15 17:34:02.537928] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:51.986 17:34:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:51.986 17:34:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:21:51.986 17:34:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:51.986 17:34:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:52.245 BaseBdev1_malloc 00:21:52.245 17:34:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:52.245 [2024-07-15 17:34:03.516349] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:52.245 [2024-07-15 17:34:03.516382] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:52.245 [2024-07-15 17:34:03.516395] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1948d30 00:21:52.245 [2024-07-15 17:34:03.516402] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:52.245 [2024-07-15 17:34:03.517703] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:52.245 [2024-07-15 17:34:03.517730] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:52.245 BaseBdev1 00:21:52.245 17:34:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:52.245 17:34:03 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:52.504 BaseBdev2_malloc 00:21:52.504 17:34:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:21:52.764 [2024-07-15 17:34:03.863239] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:21:52.764 [2024-07-15 17:34:03.863268] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:52.764 [2024-07-15 17:34:03.863278] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1afbc60 00:21:52.764 [2024-07-15 17:34:03.863285] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:52.764 [2024-07-15 17:34:03.864466] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:52.764 [2024-07-15 17:34:03.864485] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:52.764 BaseBdev2 00:21:52.764 17:34:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:52.764 17:34:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:53.024 BaseBdev3_malloc 00:21:53.024 17:34:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:21:53.024 [2024-07-15 17:34:04.262068] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:21:53.024 [2024-07-15 17:34:04.262096] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:53.024 [2024-07-15 17:34:04.262107] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ae0b90 00:21:53.024 [2024-07-15 17:34:04.262114] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:53.024 [2024-07-15 17:34:04.263292] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:53.024 [2024-07-15 17:34:04.263310] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:53.024 BaseBdev3 00:21:53.024 17:34:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:53.024 17:34:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:53.284 BaseBdev4_malloc 00:21:53.284 17:34:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:21:53.543 [2024-07-15 17:34:04.628919] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:21:53.543 [2024-07-15 17:34:04.628944] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:53.543 [2024-07-15 17:34:04.628955] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19498c0 00:21:53.543 [2024-07-15 17:34:04.628961] 
vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:53.543 [2024-07-15 17:34:04.630143] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:53.543 [2024-07-15 17:34:04.630161] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:53.543 BaseBdev4 00:21:53.543 17:34:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:21:53.543 spare_malloc 00:21:53.543 17:34:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:21:53.803 spare_delay 00:21:53.803 17:34:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:54.062 [2024-07-15 17:34:05.176172] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:54.062 [2024-07-15 17:34:05.176202] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:54.062 [2024-07-15 17:34:05.176213] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1941a80 00:21:54.062 [2024-07-15 17:34:05.176219] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:54.062 [2024-07-15 17:34:05.177407] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:54.062 [2024-07-15 17:34:05.177427] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:54.062 spare 00:21:54.062 17:34:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:21:54.062 [2024-07-15 17:34:05.352635] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:54.062 [2024-07-15 17:34:05.353618] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:54.062 [2024-07-15 17:34:05.353657] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:54.062 [2024-07-15 17:34:05.353689] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:54.062 [2024-07-15 17:34:05.353752] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1942b30 00:21:54.062 [2024-07-15 17:34:05.353758] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:21:54.062 [2024-07-15 17:34:05.353910] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1946bf0 00:21:54.062 [2024-07-15 17:34:05.354024] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1942b30 00:21:54.062 [2024-07-15 17:34:05.354029] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1942b30 00:21:54.062 [2024-07-15 17:34:05.354110] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:54.321 17:34:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:21:54.321 17:34:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:21:54.321 17:34:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:54.321 17:34:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:54.321 17:34:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:54.321 17:34:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:54.321 17:34:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:54.321 17:34:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:54.321 17:34:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:54.321 17:34:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:54.321 17:34:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:54.321 17:34:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:54.321 17:34:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:54.321 "name": "raid_bdev1", 00:21:54.321 "uuid": "5076f140-a391-4752-90a7-7660ebd8222a", 00:21:54.321 "strip_size_kb": 0, 00:21:54.322 "state": "online", 00:21:54.322 "raid_level": "raid1", 00:21:54.322 "superblock": false, 00:21:54.322 "num_base_bdevs": 4, 00:21:54.322 "num_base_bdevs_discovered": 4, 00:21:54.322 "num_base_bdevs_operational": 4, 00:21:54.322 "base_bdevs_list": [ 00:21:54.322 { 00:21:54.322 "name": "BaseBdev1", 00:21:54.322 "uuid": "9946bede-df81-51fd-902f-54f1224782a6", 00:21:54.322 "is_configured": true, 00:21:54.322 "data_offset": 0, 00:21:54.322 "data_size": 65536 00:21:54.322 }, 00:21:54.322 { 00:21:54.322 "name": "BaseBdev2", 00:21:54.322 "uuid": "d7e96e54-d68e-536e-b233-d6cd8c6d0732", 00:21:54.322 "is_configured": true, 00:21:54.322 "data_offset": 0, 00:21:54.322 "data_size": 65536 00:21:54.322 }, 00:21:54.322 { 00:21:54.322 "name": "BaseBdev3", 00:21:54.322 "uuid": "9c19f75f-05dd-50f2-b354-a4f096f1b2cf", 00:21:54.322 "is_configured": true, 00:21:54.322 "data_offset": 0, 00:21:54.322 "data_size": 65536 00:21:54.322 }, 00:21:54.322 { 00:21:54.322 "name": "BaseBdev4", 00:21:54.322 "uuid": "35237bf7-fe9b-5b3a-a401-b7f4c0196e54", 00:21:54.322 "is_configured": true, 00:21:54.322 "data_offset": 0, 00:21:54.322 "data_size": 65536 00:21:54.322 } 00:21:54.322 ] 00:21:54.322 }' 00:21:54.322 17:34:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:54.322 17:34:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:54.890 17:34:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:54.890 17:34:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:21:55.149 [2024-07-15 17:34:06.259149] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:55.149 17:34:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:21:55.149 17:34:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:55.149 17:34:06 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:21:55.409 17:34:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:21:55.409 17:34:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:21:55.409 17:34:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:21:55.409 17:34:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:21:55.409 17:34:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:21:55.409 17:34:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:55.409 17:34:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:21:55.409 17:34:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:55.409 17:34:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:55.409 17:34:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:55.409 17:34:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:21:55.409 17:34:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:55.409 17:34:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:55.409 17:34:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:21:55.409 [2024-07-15 17:34:06.647918] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1943b00 00:21:55.409 /dev/nbd0 00:21:55.410 17:34:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:55.410 17:34:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:55.410 17:34:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:21:55.410 17:34:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:21:55.410 17:34:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:55.410 17:34:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:55.410 17:34:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:21:55.410 17:34:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:21:55.410 17:34:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:55.410 17:34:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:55.410 17:34:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:55.410 1+0 records in 00:21:55.410 1+0 records out 00:21:55.410 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000308379 s, 13.3 MB/s 00:21:55.410 17:34:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:55.410 17:34:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:21:55.410 17:34:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:55.410 17:34:06 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:55.410 17:34:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:21:55.410 17:34:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:55.410 17:34:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:55.410 17:34:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:21:55.410 17:34:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:21:55.410 17:34:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:22:07.636 65536+0 records in 00:22:07.636 65536+0 records out 00:22:07.636 33554432 bytes (34 MB, 32 MiB) copied, 10.255 s, 3.3 MB/s 00:22:07.636 17:34:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:22:07.636 17:34:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:07.636 17:34:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:07.636 17:34:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:07.636 17:34:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:22:07.636 17:34:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:07.636 17:34:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:07.636 17:34:17 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:07.636 [2024-07-15 17:34:17.164090] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:07.636 17:34:17 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:07.636 17:34:17 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:07.636 17:34:17 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:07.636 17:34:17 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:07.636 17:34:17 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:07.636 17:34:17 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:22:07.636 17:34:17 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:22:07.636 17:34:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:07.636 [2024-07-15 17:34:17.344527] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:07.636 17:34:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:07.636 17:34:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:07.636 17:34:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:07.636 17:34:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:07.636 17:34:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:07.636 17:34:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:22:07.636 17:34:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:07.636 17:34:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:07.636 17:34:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:07.636 17:34:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:07.636 17:34:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:07.636 17:34:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:07.636 17:34:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:07.636 "name": "raid_bdev1", 00:22:07.636 "uuid": "5076f140-a391-4752-90a7-7660ebd8222a", 00:22:07.636 "strip_size_kb": 0, 00:22:07.636 "state": "online", 00:22:07.636 "raid_level": "raid1", 00:22:07.636 "superblock": false, 00:22:07.636 "num_base_bdevs": 4, 00:22:07.636 "num_base_bdevs_discovered": 3, 00:22:07.636 "num_base_bdevs_operational": 3, 00:22:07.636 "base_bdevs_list": [ 00:22:07.636 { 00:22:07.636 "name": null, 00:22:07.636 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:07.636 "is_configured": false, 00:22:07.636 "data_offset": 0, 00:22:07.636 "data_size": 65536 00:22:07.636 }, 00:22:07.636 { 00:22:07.636 "name": "BaseBdev2", 00:22:07.636 "uuid": "d7e96e54-d68e-536e-b233-d6cd8c6d0732", 00:22:07.636 "is_configured": true, 00:22:07.636 "data_offset": 0, 00:22:07.636 "data_size": 65536 00:22:07.636 }, 00:22:07.636 { 00:22:07.636 "name": "BaseBdev3", 00:22:07.636 "uuid": "9c19f75f-05dd-50f2-b354-a4f096f1b2cf", 00:22:07.636 "is_configured": true, 00:22:07.636 "data_offset": 0, 00:22:07.636 "data_size": 65536 00:22:07.636 }, 00:22:07.636 { 00:22:07.636 "name": "BaseBdev4", 00:22:07.636 "uuid": "35237bf7-fe9b-5b3a-a401-b7f4c0196e54", 00:22:07.636 "is_configured": true, 00:22:07.636 "data_offset": 0, 00:22:07.636 "data_size": 65536 00:22:07.636 } 00:22:07.636 ] 00:22:07.636 }' 00:22:07.636 17:34:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:07.636 17:34:17 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:07.636 17:34:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:07.636 [2024-07-15 17:34:18.635798] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:07.636 [2024-07-15 17:34:18.638520] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ae28b0 00:22:07.636 [2024-07-15 17:34:18.640119] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:07.636 17:34:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:08.575 17:34:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:08.575 17:34:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:08.575 17:34:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:08.575 17:34:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:08.575 17:34:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 
00:22:08.575 17:34:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:08.575 17:34:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:08.575 17:34:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:08.575 "name": "raid_bdev1", 00:22:08.575 "uuid": "5076f140-a391-4752-90a7-7660ebd8222a", 00:22:08.575 "strip_size_kb": 0, 00:22:08.575 "state": "online", 00:22:08.575 "raid_level": "raid1", 00:22:08.575 "superblock": false, 00:22:08.575 "num_base_bdevs": 4, 00:22:08.575 "num_base_bdevs_discovered": 4, 00:22:08.575 "num_base_bdevs_operational": 4, 00:22:08.575 "process": { 00:22:08.575 "type": "rebuild", 00:22:08.575 "target": "spare", 00:22:08.575 "progress": { 00:22:08.575 "blocks": 22528, 00:22:08.575 "percent": 34 00:22:08.575 } 00:22:08.575 }, 00:22:08.575 "base_bdevs_list": [ 00:22:08.575 { 00:22:08.575 "name": "spare", 00:22:08.575 "uuid": "c69b7732-4bb1-525e-9bef-d2a520ec4b9f", 00:22:08.575 "is_configured": true, 00:22:08.575 "data_offset": 0, 00:22:08.575 "data_size": 65536 00:22:08.575 }, 00:22:08.575 { 00:22:08.575 "name": "BaseBdev2", 00:22:08.575 "uuid": "d7e96e54-d68e-536e-b233-d6cd8c6d0732", 00:22:08.575 "is_configured": true, 00:22:08.575 "data_offset": 0, 00:22:08.575 "data_size": 65536 00:22:08.575 }, 00:22:08.575 { 00:22:08.575 "name": "BaseBdev3", 00:22:08.575 "uuid": "9c19f75f-05dd-50f2-b354-a4f096f1b2cf", 00:22:08.575 "is_configured": true, 00:22:08.575 "data_offset": 0, 00:22:08.575 "data_size": 65536 00:22:08.575 }, 00:22:08.575 { 00:22:08.575 "name": "BaseBdev4", 00:22:08.575 "uuid": "35237bf7-fe9b-5b3a-a401-b7f4c0196e54", 00:22:08.575 "is_configured": true, 00:22:08.575 "data_offset": 0, 00:22:08.575 "data_size": 65536 00:22:08.575 } 00:22:08.575 ] 00:22:08.575 }' 00:22:08.575 17:34:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:08.835 17:34:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:08.835 17:34:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:08.835 17:34:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:08.835 17:34:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:08.835 [2024-07-15 17:34:20.124598] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:09.094 [2024-07-15 17:34:20.149001] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:09.094 [2024-07-15 17:34:20.149034] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:09.094 [2024-07-15 17:34:20.149045] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:09.094 [2024-07-15 17:34:20.149050] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:09.094 17:34:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:09.094 17:34:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:09.094 17:34:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:22:09.094 17:34:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:09.094 17:34:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:09.094 17:34:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:09.094 17:34:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:09.094 17:34:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:09.094 17:34:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:09.094 17:34:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:09.094 17:34:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:09.094 17:34:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:09.094 17:34:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:09.094 "name": "raid_bdev1", 00:22:09.094 "uuid": "5076f140-a391-4752-90a7-7660ebd8222a", 00:22:09.094 "strip_size_kb": 0, 00:22:09.094 "state": "online", 00:22:09.094 "raid_level": "raid1", 00:22:09.094 "superblock": false, 00:22:09.094 "num_base_bdevs": 4, 00:22:09.094 "num_base_bdevs_discovered": 3, 00:22:09.094 "num_base_bdevs_operational": 3, 00:22:09.094 "base_bdevs_list": [ 00:22:09.094 { 00:22:09.094 "name": null, 00:22:09.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:09.094 "is_configured": false, 00:22:09.094 "data_offset": 0, 00:22:09.094 "data_size": 65536 00:22:09.094 }, 00:22:09.094 { 00:22:09.094 "name": "BaseBdev2", 00:22:09.094 "uuid": "d7e96e54-d68e-536e-b233-d6cd8c6d0732", 00:22:09.094 "is_configured": true, 00:22:09.094 "data_offset": 0, 00:22:09.094 "data_size": 65536 00:22:09.094 }, 00:22:09.094 { 00:22:09.094 "name": "BaseBdev3", 00:22:09.094 "uuid": "9c19f75f-05dd-50f2-b354-a4f096f1b2cf", 00:22:09.094 "is_configured": true, 00:22:09.094 "data_offset": 0, 00:22:09.094 "data_size": 65536 00:22:09.094 }, 00:22:09.094 { 00:22:09.094 "name": "BaseBdev4", 00:22:09.094 "uuid": "35237bf7-fe9b-5b3a-a401-b7f4c0196e54", 00:22:09.094 "is_configured": true, 00:22:09.094 "data_offset": 0, 00:22:09.094 "data_size": 65536 00:22:09.094 } 00:22:09.094 ] 00:22:09.094 }' 00:22:09.095 17:34:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:09.095 17:34:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:09.662 17:34:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:09.662 17:34:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:09.662 17:34:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:09.662 17:34:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:09.662 17:34:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:09.662 17:34:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:09.662 17:34:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:09.968 17:34:21 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:09.968 "name": "raid_bdev1", 00:22:09.968 "uuid": "5076f140-a391-4752-90a7-7660ebd8222a", 00:22:09.968 "strip_size_kb": 0, 00:22:09.968 "state": "online", 00:22:09.968 "raid_level": "raid1", 00:22:09.968 "superblock": false, 00:22:09.968 "num_base_bdevs": 4, 00:22:09.968 "num_base_bdevs_discovered": 3, 00:22:09.968 "num_base_bdevs_operational": 3, 00:22:09.968 "base_bdevs_list": [ 00:22:09.968 { 00:22:09.968 "name": null, 00:22:09.968 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:09.968 "is_configured": false, 00:22:09.968 "data_offset": 0, 00:22:09.968 "data_size": 65536 00:22:09.968 }, 00:22:09.968 { 00:22:09.968 "name": "BaseBdev2", 00:22:09.968 "uuid": "d7e96e54-d68e-536e-b233-d6cd8c6d0732", 00:22:09.968 "is_configured": true, 00:22:09.968 "data_offset": 0, 00:22:09.968 "data_size": 65536 00:22:09.968 }, 00:22:09.968 { 00:22:09.968 "name": "BaseBdev3", 00:22:09.968 "uuid": "9c19f75f-05dd-50f2-b354-a4f096f1b2cf", 00:22:09.968 "is_configured": true, 00:22:09.968 "data_offset": 0, 00:22:09.968 "data_size": 65536 00:22:09.968 }, 00:22:09.968 { 00:22:09.968 "name": "BaseBdev4", 00:22:09.968 "uuid": "35237bf7-fe9b-5b3a-a401-b7f4c0196e54", 00:22:09.968 "is_configured": true, 00:22:09.968 "data_offset": 0, 00:22:09.968 "data_size": 65536 00:22:09.968 } 00:22:09.968 ] 00:22:09.968 }' 00:22:09.968 17:34:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:09.968 17:34:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:09.968 17:34:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:09.968 17:34:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:09.968 17:34:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:10.236 [2024-07-15 17:34:21.367749] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:10.236 [2024-07-15 17:34:21.370515] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ae2880 00:22:10.236 [2024-07-15 17:34:21.371677] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:10.236 17:34:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:11.173 17:34:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:11.173 17:34:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:11.173 17:34:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:11.173 17:34:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:11.173 17:34:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:11.173 17:34:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:11.173 17:34:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:11.433 17:34:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:11.433 "name": "raid_bdev1", 00:22:11.433 "uuid": 
"5076f140-a391-4752-90a7-7660ebd8222a", 00:22:11.433 "strip_size_kb": 0, 00:22:11.433 "state": "online", 00:22:11.433 "raid_level": "raid1", 00:22:11.433 "superblock": false, 00:22:11.433 "num_base_bdevs": 4, 00:22:11.433 "num_base_bdevs_discovered": 4, 00:22:11.433 "num_base_bdevs_operational": 4, 00:22:11.433 "process": { 00:22:11.433 "type": "rebuild", 00:22:11.433 "target": "spare", 00:22:11.433 "progress": { 00:22:11.433 "blocks": 22528, 00:22:11.433 "percent": 34 00:22:11.433 } 00:22:11.433 }, 00:22:11.433 "base_bdevs_list": [ 00:22:11.433 { 00:22:11.433 "name": "spare", 00:22:11.433 "uuid": "c69b7732-4bb1-525e-9bef-d2a520ec4b9f", 00:22:11.433 "is_configured": true, 00:22:11.433 "data_offset": 0, 00:22:11.433 "data_size": 65536 00:22:11.433 }, 00:22:11.433 { 00:22:11.433 "name": "BaseBdev2", 00:22:11.433 "uuid": "d7e96e54-d68e-536e-b233-d6cd8c6d0732", 00:22:11.433 "is_configured": true, 00:22:11.433 "data_offset": 0, 00:22:11.433 "data_size": 65536 00:22:11.433 }, 00:22:11.433 { 00:22:11.433 "name": "BaseBdev3", 00:22:11.433 "uuid": "9c19f75f-05dd-50f2-b354-a4f096f1b2cf", 00:22:11.433 "is_configured": true, 00:22:11.433 "data_offset": 0, 00:22:11.433 "data_size": 65536 00:22:11.433 }, 00:22:11.433 { 00:22:11.433 "name": "BaseBdev4", 00:22:11.433 "uuid": "35237bf7-fe9b-5b3a-a401-b7f4c0196e54", 00:22:11.433 "is_configured": true, 00:22:11.433 "data_offset": 0, 00:22:11.433 "data_size": 65536 00:22:11.433 } 00:22:11.433 ] 00:22:11.433 }' 00:22:11.433 17:34:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:11.433 17:34:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:11.433 17:34:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:11.433 17:34:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:11.433 17:34:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:22:11.433 17:34:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:22:11.433 17:34:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:22:11.433 17:34:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:22:11.433 17:34:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:11.692 [2024-07-15 17:34:22.856439] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:11.692 [2024-07-15 17:34:22.880460] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1ae2880 00:22:11.692 17:34:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:22:11.692 17:34:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:22:11.692 17:34:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:11.692 17:34:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:11.692 17:34:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:11.692 17:34:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:11.692 17:34:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:11.692 17:34:22 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:11.692 17:34:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:11.952 17:34:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:11.952 "name": "raid_bdev1", 00:22:11.952 "uuid": "5076f140-a391-4752-90a7-7660ebd8222a", 00:22:11.952 "strip_size_kb": 0, 00:22:11.952 "state": "online", 00:22:11.952 "raid_level": "raid1", 00:22:11.952 "superblock": false, 00:22:11.952 "num_base_bdevs": 4, 00:22:11.953 "num_base_bdevs_discovered": 3, 00:22:11.953 "num_base_bdevs_operational": 3, 00:22:11.953 "process": { 00:22:11.953 "type": "rebuild", 00:22:11.953 "target": "spare", 00:22:11.953 "progress": { 00:22:11.953 "blocks": 32768, 00:22:11.953 "percent": 50 00:22:11.953 } 00:22:11.953 }, 00:22:11.953 "base_bdevs_list": [ 00:22:11.953 { 00:22:11.953 "name": "spare", 00:22:11.953 "uuid": "c69b7732-4bb1-525e-9bef-d2a520ec4b9f", 00:22:11.953 "is_configured": true, 00:22:11.953 "data_offset": 0, 00:22:11.953 "data_size": 65536 00:22:11.953 }, 00:22:11.953 { 00:22:11.953 "name": null, 00:22:11.953 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:11.953 "is_configured": false, 00:22:11.953 "data_offset": 0, 00:22:11.953 "data_size": 65536 00:22:11.953 }, 00:22:11.953 { 00:22:11.953 "name": "BaseBdev3", 00:22:11.953 "uuid": "9c19f75f-05dd-50f2-b354-a4f096f1b2cf", 00:22:11.953 "is_configured": true, 00:22:11.953 "data_offset": 0, 00:22:11.953 "data_size": 65536 00:22:11.953 }, 00:22:11.953 { 00:22:11.953 "name": "BaseBdev4", 00:22:11.953 "uuid": "35237bf7-fe9b-5b3a-a401-b7f4c0196e54", 00:22:11.953 "is_configured": true, 00:22:11.953 "data_offset": 0, 00:22:11.953 "data_size": 65536 00:22:11.953 } 00:22:11.953 ] 00:22:11.953 }' 00:22:11.953 17:34:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:11.953 17:34:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:11.953 17:34:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:11.953 17:34:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:11.953 17:34:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=757 00:22:11.953 17:34:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:11.953 17:34:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:11.953 17:34:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:11.953 17:34:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:11.953 17:34:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:11.953 17:34:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:11.953 17:34:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:11.953 17:34:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:12.212 17:34:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:12.212 "name": "raid_bdev1", 
00:22:12.212 "uuid": "5076f140-a391-4752-90a7-7660ebd8222a", 00:22:12.212 "strip_size_kb": 0, 00:22:12.212 "state": "online", 00:22:12.212 "raid_level": "raid1", 00:22:12.212 "superblock": false, 00:22:12.212 "num_base_bdevs": 4, 00:22:12.212 "num_base_bdevs_discovered": 3, 00:22:12.212 "num_base_bdevs_operational": 3, 00:22:12.212 "process": { 00:22:12.212 "type": "rebuild", 00:22:12.212 "target": "spare", 00:22:12.212 "progress": { 00:22:12.212 "blocks": 38912, 00:22:12.212 "percent": 59 00:22:12.212 } 00:22:12.212 }, 00:22:12.212 "base_bdevs_list": [ 00:22:12.212 { 00:22:12.212 "name": "spare", 00:22:12.212 "uuid": "c69b7732-4bb1-525e-9bef-d2a520ec4b9f", 00:22:12.212 "is_configured": true, 00:22:12.212 "data_offset": 0, 00:22:12.212 "data_size": 65536 00:22:12.212 }, 00:22:12.212 { 00:22:12.212 "name": null, 00:22:12.212 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:12.212 "is_configured": false, 00:22:12.212 "data_offset": 0, 00:22:12.212 "data_size": 65536 00:22:12.212 }, 00:22:12.212 { 00:22:12.212 "name": "BaseBdev3", 00:22:12.212 "uuid": "9c19f75f-05dd-50f2-b354-a4f096f1b2cf", 00:22:12.212 "is_configured": true, 00:22:12.212 "data_offset": 0, 00:22:12.212 "data_size": 65536 00:22:12.212 }, 00:22:12.212 { 00:22:12.212 "name": "BaseBdev4", 00:22:12.212 "uuid": "35237bf7-fe9b-5b3a-a401-b7f4c0196e54", 00:22:12.212 "is_configured": true, 00:22:12.212 "data_offset": 0, 00:22:12.212 "data_size": 65536 00:22:12.213 } 00:22:12.213 ] 00:22:12.213 }' 00:22:12.213 17:34:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:12.213 17:34:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:12.213 17:34:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:12.213 17:34:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:12.213 17:34:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:13.596 17:34:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:13.596 17:34:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:13.596 17:34:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:13.596 17:34:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:13.596 17:34:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:13.596 17:34:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:13.596 17:34:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:13.596 17:34:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:13.596 [2024-07-15 17:34:24.590414] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:13.596 [2024-07-15 17:34:24.590456] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:13.596 [2024-07-15 17:34:24.590482] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:13.596 17:34:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:13.596 "name": "raid_bdev1", 00:22:13.596 "uuid": 
"5076f140-a391-4752-90a7-7660ebd8222a", 00:22:13.596 "strip_size_kb": 0, 00:22:13.596 "state": "online", 00:22:13.596 "raid_level": "raid1", 00:22:13.596 "superblock": false, 00:22:13.596 "num_base_bdevs": 4, 00:22:13.596 "num_base_bdevs_discovered": 3, 00:22:13.596 "num_base_bdevs_operational": 3, 00:22:13.596 "base_bdevs_list": [ 00:22:13.596 { 00:22:13.596 "name": "spare", 00:22:13.596 "uuid": "c69b7732-4bb1-525e-9bef-d2a520ec4b9f", 00:22:13.596 "is_configured": true, 00:22:13.596 "data_offset": 0, 00:22:13.596 "data_size": 65536 00:22:13.596 }, 00:22:13.596 { 00:22:13.596 "name": null, 00:22:13.596 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:13.596 "is_configured": false, 00:22:13.596 "data_offset": 0, 00:22:13.596 "data_size": 65536 00:22:13.596 }, 00:22:13.596 { 00:22:13.596 "name": "BaseBdev3", 00:22:13.596 "uuid": "9c19f75f-05dd-50f2-b354-a4f096f1b2cf", 00:22:13.596 "is_configured": true, 00:22:13.596 "data_offset": 0, 00:22:13.596 "data_size": 65536 00:22:13.596 }, 00:22:13.596 { 00:22:13.596 "name": "BaseBdev4", 00:22:13.596 "uuid": "35237bf7-fe9b-5b3a-a401-b7f4c0196e54", 00:22:13.596 "is_configured": true, 00:22:13.596 "data_offset": 0, 00:22:13.596 "data_size": 65536 00:22:13.596 } 00:22:13.596 ] 00:22:13.596 }' 00:22:13.596 17:34:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:13.596 17:34:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:13.596 17:34:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:13.596 17:34:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:22:13.596 17:34:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:22:13.596 17:34:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:13.596 17:34:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:13.596 17:34:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:13.596 17:34:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:13.596 17:34:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:13.596 17:34:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:13.596 17:34:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:13.856 17:34:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:13.856 "name": "raid_bdev1", 00:22:13.856 "uuid": "5076f140-a391-4752-90a7-7660ebd8222a", 00:22:13.856 "strip_size_kb": 0, 00:22:13.856 "state": "online", 00:22:13.856 "raid_level": "raid1", 00:22:13.856 "superblock": false, 00:22:13.856 "num_base_bdevs": 4, 00:22:13.856 "num_base_bdevs_discovered": 3, 00:22:13.856 "num_base_bdevs_operational": 3, 00:22:13.856 "base_bdevs_list": [ 00:22:13.856 { 00:22:13.856 "name": "spare", 00:22:13.856 "uuid": "c69b7732-4bb1-525e-9bef-d2a520ec4b9f", 00:22:13.856 "is_configured": true, 00:22:13.856 "data_offset": 0, 00:22:13.856 "data_size": 65536 00:22:13.856 }, 00:22:13.856 { 00:22:13.856 "name": null, 00:22:13.856 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:13.856 "is_configured": false, 00:22:13.856 "data_offset": 0, 00:22:13.856 "data_size": 65536 
00:22:13.856 }, 00:22:13.856 { 00:22:13.856 "name": "BaseBdev3", 00:22:13.856 "uuid": "9c19f75f-05dd-50f2-b354-a4f096f1b2cf", 00:22:13.856 "is_configured": true, 00:22:13.856 "data_offset": 0, 00:22:13.856 "data_size": 65536 00:22:13.856 }, 00:22:13.856 { 00:22:13.856 "name": "BaseBdev4", 00:22:13.856 "uuid": "35237bf7-fe9b-5b3a-a401-b7f4c0196e54", 00:22:13.856 "is_configured": true, 00:22:13.856 "data_offset": 0, 00:22:13.856 "data_size": 65536 00:22:13.856 } 00:22:13.856 ] 00:22:13.856 }' 00:22:13.856 17:34:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:13.856 17:34:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:13.856 17:34:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:13.856 17:34:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:13.856 17:34:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:13.856 17:34:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:13.856 17:34:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:13.856 17:34:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:13.856 17:34:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:13.856 17:34:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:13.856 17:34:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:13.856 17:34:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:13.856 17:34:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:13.856 17:34:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:13.856 17:34:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:13.856 17:34:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:14.116 17:34:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:14.116 "name": "raid_bdev1", 00:22:14.116 "uuid": "5076f140-a391-4752-90a7-7660ebd8222a", 00:22:14.116 "strip_size_kb": 0, 00:22:14.116 "state": "online", 00:22:14.116 "raid_level": "raid1", 00:22:14.116 "superblock": false, 00:22:14.116 "num_base_bdevs": 4, 00:22:14.116 "num_base_bdevs_discovered": 3, 00:22:14.116 "num_base_bdevs_operational": 3, 00:22:14.116 "base_bdevs_list": [ 00:22:14.116 { 00:22:14.116 "name": "spare", 00:22:14.116 "uuid": "c69b7732-4bb1-525e-9bef-d2a520ec4b9f", 00:22:14.116 "is_configured": true, 00:22:14.116 "data_offset": 0, 00:22:14.116 "data_size": 65536 00:22:14.116 }, 00:22:14.116 { 00:22:14.116 "name": null, 00:22:14.116 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:14.116 "is_configured": false, 00:22:14.116 "data_offset": 0, 00:22:14.116 "data_size": 65536 00:22:14.116 }, 00:22:14.116 { 00:22:14.116 "name": "BaseBdev3", 00:22:14.116 "uuid": "9c19f75f-05dd-50f2-b354-a4f096f1b2cf", 00:22:14.116 "is_configured": true, 00:22:14.116 "data_offset": 0, 00:22:14.116 "data_size": 65536 00:22:14.116 }, 00:22:14.116 { 00:22:14.116 "name": "BaseBdev4", 00:22:14.116 "uuid": 
"35237bf7-fe9b-5b3a-a401-b7f4c0196e54", 00:22:14.116 "is_configured": true, 00:22:14.116 "data_offset": 0, 00:22:14.116 "data_size": 65536 00:22:14.116 } 00:22:14.116 ] 00:22:14.116 }' 00:22:14.116 17:34:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:14.116 17:34:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:14.687 17:34:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:14.687 [2024-07-15 17:34:25.925606] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:14.687 [2024-07-15 17:34:25.925624] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:14.687 [2024-07-15 17:34:25.925662] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:14.687 [2024-07-15 17:34:25.925721] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:14.687 [2024-07-15 17:34:25.925728] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1942b30 name raid_bdev1, state offline 00:22:14.687 17:34:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:14.687 17:34:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:22:14.948 17:34:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:22:14.948 17:34:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:22:14.948 17:34:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:22:14.948 17:34:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:22:14.948 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:14.948 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:22:14.948 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:14.948 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:14.948 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:14.948 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:22:14.948 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:14.948 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:14.948 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:22:15.209 /dev/nbd0 00:22:15.209 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:15.209 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:15.209 17:34:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:15.209 17:34:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:22:15.209 17:34:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:15.209 17:34:26 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:15.209 17:34:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:15.209 17:34:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:22:15.209 17:34:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:15.209 17:34:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:15.209 17:34:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:15.209 1+0 records in 00:22:15.209 1+0 records out 00:22:15.209 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274837 s, 14.9 MB/s 00:22:15.209 17:34:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:15.209 17:34:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:22:15.209 17:34:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:15.209 17:34:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:15.209 17:34:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:22:15.209 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:15.209 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:15.209 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:22:15.469 /dev/nbd1 00:22:15.469 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:15.469 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:15.469 17:34:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:22:15.469 17:34:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:22:15.469 17:34:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:15.469 17:34:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:15.469 17:34:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:22:15.469 17:34:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:22:15.469 17:34:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:15.469 17:34:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:15.469 17:34:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:15.469 1+0 records in 00:22:15.469 1+0 records out 00:22:15.469 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000285627 s, 14.3 MB/s 00:22:15.469 17:34:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:15.469 17:34:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:22:15.469 17:34:26 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:15.469 17:34:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:15.469 17:34:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:22:15.469 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:15.469 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:15.469 17:34:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:22:15.469 17:34:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:22:15.469 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:15.469 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:15.469 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:15.469 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:22:15.469 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:15.470 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:15.730 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:15.730 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:15.730 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:15.730 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:15.730 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:15.730 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:15.730 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:22:15.730 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:22:15.730 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:15.730 17:34:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:15.990 17:34:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:15.990 17:34:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:15.990 17:34:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:15.990 17:34:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:15.990 17:34:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:15.990 17:34:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:15.990 17:34:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:22:15.990 17:34:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:22:15.990 17:34:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:22:15.990 17:34:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 2873319 00:22:15.990 17:34:27 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 2873319 ']'
00:22:15.990 17:34:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 2873319
00:22:15.990 17:34:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname
00:22:15.990 17:34:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:22:15.990 17:34:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2873319
00:22:15.990 17:34:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:22:15.990 17:34:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:22:15.990 17:34:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2873319'
00:22:15.990 killing process with pid 2873319
00:22:15.990 17:34:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 2873319
00:22:15.990 Received shutdown signal, test time was about 60.000000 seconds
00:22:15.990
00:22:15.990                         Latency(us)
00:22:15.990 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:22:15.990 ===================================================================================================================
00:22:15.990 Total              :       0.00 0.00  0.00  0.00   0.00 0.00    18446744073709551616.00 0.00
00:22:15.990 [2024-07-15 17:34:27.145692] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:22:15.990 17:34:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 2873319
00:22:15.990 [2024-07-15 17:34:27.171144] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:22:16.251 17:34:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0
00:22:16.251
00:22:16.251 real	0m25.037s
00:22:16.251 user	0m32.334s
00:22:16.251 sys	0m4.536s
00:22:16.251 17:34:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable
00:22:16.251 17:34:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x
00:22:16.251 ************************************
00:22:16.251 END TEST raid_rebuild_test
00:22:16.251 ************************************
00:22:16.251 17:34:27 bdev_raid -- common/autotest_common.sh@1142 -- # return 0
00:22:16.251 17:34:27 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true
00:22:16.251 17:34:27 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']'
00:22:16.251 17:34:27 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:22:16.251 17:34:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:22:16.251 ************************************
00:22:16.251 START TEST raid_rebuild_test_sb
00:22:16.251 ************************************
00:22:16.251 17:34:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true false true
00:22:16.251 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1
00:22:16.251 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4
00:22:16.251 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true
00:22:16.251 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false
00:22:16.251 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true
00:22:16.251 17:34:27 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:22:16.251 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=2877713 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 2877713 /var/tmp/spdk-raid.sock 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2877713 ']' 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket 
/var/tmp/spdk-raid.sock...' 00:22:16.252 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:16.252 17:34:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:16.252 [2024-07-15 17:34:27.440064] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:22:16.252 [2024-07-15 17:34:27.440134] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2877713 ] 00:22:16.252 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:16.252 Zero copy mechanism will not be used. 00:22:16.252 [2024-07-15 17:34:27.531447] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:16.512 [2024-07-15 17:34:27.605801] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:16.512 [2024-07-15 17:34:27.652010] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:16.512 [2024-07-15 17:34:27.652040] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:16.772 17:34:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:16.772 17:34:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:22:16.772 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:16.772 17:34:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:17.032 BaseBdev1_malloc 00:22:17.033 17:34:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:17.033 [2024-07-15 17:34:28.310420] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:17.033 [2024-07-15 17:34:28.310458] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:17.033 [2024-07-15 17:34:28.310472] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2109d30 00:22:17.033 [2024-07-15 17:34:28.310478] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:17.033 [2024-07-15 17:34:28.311770] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:17.033 [2024-07-15 17:34:28.311793] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:17.033 BaseBdev1 00:22:17.033 17:34:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:17.033 17:34:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:17.604 BaseBdev2_malloc 00:22:17.605 17:34:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:17.864 [2024-07-15 17:34:29.050174] vbdev_passthru.c: 607:vbdev_passthru_register: 
*NOTICE*: Match on BaseBdev2_malloc 00:22:17.864 [2024-07-15 17:34:29.050206] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:17.864 [2024-07-15 17:34:29.050221] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22bcc60 00:22:17.864 [2024-07-15 17:34:29.050228] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:17.864 [2024-07-15 17:34:29.051414] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:17.864 [2024-07-15 17:34:29.051434] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:17.864 BaseBdev2 00:22:17.864 17:34:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:17.864 17:34:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:18.124 BaseBdev3_malloc 00:22:18.124 17:34:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:22:18.384 [2024-07-15 17:34:29.436935] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:22:18.385 [2024-07-15 17:34:29.436963] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:18.385 [2024-07-15 17:34:29.436974] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22a1b90 00:22:18.385 [2024-07-15 17:34:29.436980] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:18.385 [2024-07-15 17:34:29.438122] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:18.385 [2024-07-15 17:34:29.438140] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:18.385 BaseBdev3 00:22:18.385 17:34:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:18.385 17:34:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:18.385 BaseBdev4_malloc 00:22:18.385 17:34:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:22:18.643 [2024-07-15 17:34:29.807541] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:22:18.643 [2024-07-15 17:34:29.807566] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:18.643 [2024-07-15 17:34:29.807577] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x210a8c0 00:22:18.643 [2024-07-15 17:34:29.807583] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:18.643 [2024-07-15 17:34:29.808721] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:18.643 [2024-07-15 17:34:29.808739] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:18.643 BaseBdev4 00:22:18.643 17:34:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 
32 512 -b spare_malloc 00:22:18.902 spare_malloc 00:22:18.902 17:34:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:19.161 spare_delay 00:22:19.161 17:34:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:19.161 [2024-07-15 17:34:30.394921] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:19.161 [2024-07-15 17:34:30.394954] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:19.161 [2024-07-15 17:34:30.394965] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2102a80 00:22:19.161 [2024-07-15 17:34:30.394971] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:19.161 [2024-07-15 17:34:30.396143] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:19.161 [2024-07-15 17:34:30.396166] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:19.161 spare 00:22:19.161 17:34:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:22:19.421 [2024-07-15 17:34:30.591439] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:19.421 [2024-07-15 17:34:30.592407] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:19.421 [2024-07-15 17:34:30.592447] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:19.421 [2024-07-15 17:34:30.592479] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:19.421 [2024-07-15 17:34:30.592619] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2103b30 00:22:19.421 [2024-07-15 17:34:30.592626] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:19.421 [2024-07-15 17:34:30.592774] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21012e0 00:22:19.421 [2024-07-15 17:34:30.592887] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2103b30 00:22:19.421 [2024-07-15 17:34:30.592892] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2103b30 00:22:19.421 [2024-07-15 17:34:30.592959] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:19.421 17:34:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:19.421 17:34:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:19.421 17:34:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:19.421 17:34:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:19.421 17:34:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:19.421 17:34:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:19.421 17:34:30 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:19.421 17:34:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:19.421 17:34:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:19.421 17:34:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:19.421 17:34:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:19.421 17:34:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:19.680 17:34:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:19.680 "name": "raid_bdev1", 00:22:19.680 "uuid": "80f5e90f-547f-45ab-8c6e-7f8f5a0fd460", 00:22:19.680 "strip_size_kb": 0, 00:22:19.680 "state": "online", 00:22:19.680 "raid_level": "raid1", 00:22:19.680 "superblock": true, 00:22:19.680 "num_base_bdevs": 4, 00:22:19.680 "num_base_bdevs_discovered": 4, 00:22:19.680 "num_base_bdevs_operational": 4, 00:22:19.680 "base_bdevs_list": [ 00:22:19.680 { 00:22:19.680 "name": "BaseBdev1", 00:22:19.680 "uuid": "45bddf8f-49b3-5706-aea8-5c216629c7f5", 00:22:19.680 "is_configured": true, 00:22:19.680 "data_offset": 2048, 00:22:19.680 "data_size": 63488 00:22:19.680 }, 00:22:19.680 { 00:22:19.680 "name": "BaseBdev2", 00:22:19.680 "uuid": "5b68c79a-7eb5-5003-9bfd-296d49081d24", 00:22:19.680 "is_configured": true, 00:22:19.680 "data_offset": 2048, 00:22:19.680 "data_size": 63488 00:22:19.680 }, 00:22:19.680 { 00:22:19.680 "name": "BaseBdev3", 00:22:19.680 "uuid": "18bf7a25-0888-5b45-bf1c-20508a8e3f67", 00:22:19.680 "is_configured": true, 00:22:19.680 "data_offset": 2048, 00:22:19.680 "data_size": 63488 00:22:19.680 }, 00:22:19.680 { 00:22:19.680 "name": "BaseBdev4", 00:22:19.680 "uuid": "934b0e8c-c8bf-5645-949e-6be0f5d62b97", 00:22:19.680 "is_configured": true, 00:22:19.680 "data_offset": 2048, 00:22:19.680 "data_size": 63488 00:22:19.680 } 00:22:19.680 ] 00:22:19.680 }' 00:22:19.680 17:34:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:19.680 17:34:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:20.248 17:34:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:20.248 17:34:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:20.248 [2024-07-15 17:34:31.538060] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:20.508 17:34:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:22:20.508 17:34:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.508 17:34:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:20.508 17:34:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:22:20.508 17:34:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:22:20.508 17:34:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:22:20.508 17:34:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- 
# local write_unit_size 00:22:20.508 17:34:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:22:20.508 17:34:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:20.508 17:34:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:22:20.508 17:34:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:20.508 17:34:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:20.508 17:34:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:20.508 17:34:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:22:20.508 17:34:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:20.508 17:34:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:20.508 17:34:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:22:20.768 [2024-07-15 17:34:31.922818] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2108a60 00:22:20.768 /dev/nbd0 00:22:20.768 17:34:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:20.768 17:34:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:20.768 17:34:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:20.768 17:34:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:22:20.768 17:34:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:20.768 17:34:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:20.768 17:34:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:20.768 17:34:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:22:20.768 17:34:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:20.768 17:34:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:20.768 17:34:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:20.768 1+0 records in 00:22:20.768 1+0 records out 00:22:20.768 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000299383 s, 13.7 MB/s 00:22:20.768 17:34:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:20.768 17:34:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:22:20.768 17:34:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:20.768 17:34:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:20.769 17:34:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:22:20.769 17:34:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:20.769 17:34:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 
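The surrounding trace walks the raid_rebuild_test_sb flow end to end: four passthru-wrapped malloc base bdevs are assembled into a superblock-enabled raid1 bdev, the array is exposed over /dev/nbd0 and filled with random data, then a base bdev is dropped, a delayed "spare" is attached, and the rebuild is watched through bdev_raid_get_bdevs. The sketch below condenses those RPC calls into a standalone script for anyone reproducing the scenario by hand; it is not the test script itself, the ordering is simplified, and the socket path, bdev names and sizes are simply the ones visible in this log.

```bash
#!/usr/bin/env bash
# Condensed, hand-runnable version of the rebuild flow traced above.
# Assumes an SPDK application (the log uses bdevperf) is already serving
# RPCs on /var/tmp/spdk-raid.sock.
set -euo pipefail

rpc() {
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py \
        -s /var/tmp/spdk-raid.sock "$@"
}

# Base bdevs: 32 MiB malloc devices wrapped in passthru bdevs, as in the trace.
for i in 1 2 3 4; do
    rpc bdev_malloc_create 32 512 -b "BaseBdev${i}_malloc"
    rpc bdev_passthru_create -b "BaseBdev${i}_malloc" -p "BaseBdev${i}"
done

# Assemble a raid1 bdev with an on-disk superblock (-s).
rpc bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1

# Expose it over NBD and fill the 63488-block data region with random data.
rpc nbd_start_disk raid_bdev1 /dev/nbd0
dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct
rpc nbd_stop_disk /dev/nbd0

# Drop one base bdev, then attach a spare. The test routes the spare through a
# delay bdev (~100 ms write latency), presumably so the rebuild stays slow
# enough to observe while it is in progress.
rpc bdev_raid_remove_base_bdev BaseBdev1
rpc bdev_malloc_create 32 512 -b spare_malloc
rpc bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
rpc bdev_passthru_create -b spare_delay -p spare
rpc bdev_raid_add_base_bdev raid_bdev1 spare

# Poll the same fields the test's verify_raid_bdev_process helper checks:
# process.type reverts to "none" once the rebuild has finished.
while [ "$(rpc bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "raid_bdev1") | .process.type // "none"')" != none ]; do
    sleep 1
done
```

Elsewhere in the trace the helpers go one step further and export two bdevs over NBD at once, then `cmp -i 0 /dev/nbd0 /dev/nbd1` them (bdev_raid.sh@737) to confirm the rebuilt data actually matches the source.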
00:22:20.769 17:34:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:22:20.769 17:34:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:22:20.769 17:34:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:22:30.788 63488+0 records in 00:22:30.788 63488+0 records out 00:22:30.788 32505856 bytes (33 MB, 31 MiB) copied, 8.74857 s, 3.7 MB/s 00:22:30.788 17:34:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:22:30.788 17:34:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:30.788 17:34:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:30.788 17:34:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:30.788 17:34:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:22:30.788 17:34:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:30.788 17:34:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:30.788 17:34:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:30.788 17:34:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:30.788 17:34:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:30.788 17:34:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:30.788 17:34:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:30.788 17:34:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:30.788 17:34:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:22:30.788 17:34:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:22:30.788 17:34:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:30.788 [2024-07-15 17:34:40.938397] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:30.788 [2024-07-15 17:34:41.114882] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:30.788 17:34:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:30.788 17:34:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:30.788 17:34:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:30.788 17:34:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:30.788 17:34:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:30.788 17:34:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:30.788 17:34:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:30.788 17:34:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:30.788 17:34:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs_discovered 00:22:30.788 17:34:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:30.788 17:34:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:30.788 17:34:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:30.788 17:34:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:30.788 "name": "raid_bdev1", 00:22:30.788 "uuid": "80f5e90f-547f-45ab-8c6e-7f8f5a0fd460", 00:22:30.788 "strip_size_kb": 0, 00:22:30.788 "state": "online", 00:22:30.788 "raid_level": "raid1", 00:22:30.788 "superblock": true, 00:22:30.788 "num_base_bdevs": 4, 00:22:30.788 "num_base_bdevs_discovered": 3, 00:22:30.788 "num_base_bdevs_operational": 3, 00:22:30.788 "base_bdevs_list": [ 00:22:30.788 { 00:22:30.788 "name": null, 00:22:30.788 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:30.788 "is_configured": false, 00:22:30.788 "data_offset": 2048, 00:22:30.788 "data_size": 63488 00:22:30.788 }, 00:22:30.788 { 00:22:30.788 "name": "BaseBdev2", 00:22:30.788 "uuid": "5b68c79a-7eb5-5003-9bfd-296d49081d24", 00:22:30.788 "is_configured": true, 00:22:30.788 "data_offset": 2048, 00:22:30.788 "data_size": 63488 00:22:30.788 }, 00:22:30.788 { 00:22:30.788 "name": "BaseBdev3", 00:22:30.788 "uuid": "18bf7a25-0888-5b45-bf1c-20508a8e3f67", 00:22:30.788 "is_configured": true, 00:22:30.788 "data_offset": 2048, 00:22:30.788 "data_size": 63488 00:22:30.788 }, 00:22:30.788 { 00:22:30.788 "name": "BaseBdev4", 00:22:30.788 "uuid": "934b0e8c-c8bf-5645-949e-6be0f5d62b97", 00:22:30.788 "is_configured": true, 00:22:30.788 "data_offset": 2048, 00:22:30.788 "data_size": 63488 00:22:30.788 } 00:22:30.788 ] 00:22:30.788 }' 00:22:30.788 17:34:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:30.788 17:34:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:30.788 17:34:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:30.788 [2024-07-15 17:34:42.029265] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:30.788 [2024-07-15 17:34:42.032088] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2108e80 00:22:30.788 [2024-07-15 17:34:42.033693] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:30.788 17:34:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:31.827 17:34:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:31.827 17:34:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:31.827 17:34:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:31.827 17:34:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:31.827 17:34:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:31.827 17:34:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:31.827 17:34:43 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:32.087 17:34:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:32.087 "name": "raid_bdev1", 00:22:32.087 "uuid": "80f5e90f-547f-45ab-8c6e-7f8f5a0fd460", 00:22:32.087 "strip_size_kb": 0, 00:22:32.087 "state": "online", 00:22:32.087 "raid_level": "raid1", 00:22:32.087 "superblock": true, 00:22:32.087 "num_base_bdevs": 4, 00:22:32.087 "num_base_bdevs_discovered": 4, 00:22:32.087 "num_base_bdevs_operational": 4, 00:22:32.087 "process": { 00:22:32.087 "type": "rebuild", 00:22:32.087 "target": "spare", 00:22:32.087 "progress": { 00:22:32.087 "blocks": 22528, 00:22:32.087 "percent": 35 00:22:32.087 } 00:22:32.087 }, 00:22:32.087 "base_bdevs_list": [ 00:22:32.087 { 00:22:32.087 "name": "spare", 00:22:32.087 "uuid": "56fc3d8e-eef3-5f1c-81f4-2693ff07b1a5", 00:22:32.087 "is_configured": true, 00:22:32.087 "data_offset": 2048, 00:22:32.087 "data_size": 63488 00:22:32.087 }, 00:22:32.087 { 00:22:32.087 "name": "BaseBdev2", 00:22:32.087 "uuid": "5b68c79a-7eb5-5003-9bfd-296d49081d24", 00:22:32.087 "is_configured": true, 00:22:32.087 "data_offset": 2048, 00:22:32.087 "data_size": 63488 00:22:32.087 }, 00:22:32.087 { 00:22:32.087 "name": "BaseBdev3", 00:22:32.087 "uuid": "18bf7a25-0888-5b45-bf1c-20508a8e3f67", 00:22:32.087 "is_configured": true, 00:22:32.087 "data_offset": 2048, 00:22:32.087 "data_size": 63488 00:22:32.087 }, 00:22:32.087 { 00:22:32.087 "name": "BaseBdev4", 00:22:32.087 "uuid": "934b0e8c-c8bf-5645-949e-6be0f5d62b97", 00:22:32.087 "is_configured": true, 00:22:32.087 "data_offset": 2048, 00:22:32.087 "data_size": 63488 00:22:32.087 } 00:22:32.087 ] 00:22:32.087 }' 00:22:32.087 17:34:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:32.087 17:34:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:32.087 17:34:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:32.087 17:34:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:32.087 17:34:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:32.347 [2024-07-15 17:34:43.502446] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:32.347 [2024-07-15 17:34:43.542534] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:32.347 [2024-07-15 17:34:43.542569] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:32.347 [2024-07-15 17:34:43.542580] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:32.347 [2024-07-15 17:34:43.542585] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:32.348 17:34:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:32.348 17:34:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:32.348 17:34:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:32.348 17:34:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:32.348 17:34:43 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:32.348 17:34:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:32.348 17:34:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:32.348 17:34:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:32.348 17:34:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:32.348 17:34:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:32.348 17:34:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:32.348 17:34:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:32.608 17:34:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:32.608 "name": "raid_bdev1", 00:22:32.608 "uuid": "80f5e90f-547f-45ab-8c6e-7f8f5a0fd460", 00:22:32.608 "strip_size_kb": 0, 00:22:32.608 "state": "online", 00:22:32.608 "raid_level": "raid1", 00:22:32.608 "superblock": true, 00:22:32.608 "num_base_bdevs": 4, 00:22:32.608 "num_base_bdevs_discovered": 3, 00:22:32.608 "num_base_bdevs_operational": 3, 00:22:32.608 "base_bdevs_list": [ 00:22:32.608 { 00:22:32.608 "name": null, 00:22:32.608 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:32.608 "is_configured": false, 00:22:32.608 "data_offset": 2048, 00:22:32.608 "data_size": 63488 00:22:32.608 }, 00:22:32.608 { 00:22:32.608 "name": "BaseBdev2", 00:22:32.608 "uuid": "5b68c79a-7eb5-5003-9bfd-296d49081d24", 00:22:32.608 "is_configured": true, 00:22:32.608 "data_offset": 2048, 00:22:32.608 "data_size": 63488 00:22:32.608 }, 00:22:32.608 { 00:22:32.608 "name": "BaseBdev3", 00:22:32.608 "uuid": "18bf7a25-0888-5b45-bf1c-20508a8e3f67", 00:22:32.608 "is_configured": true, 00:22:32.608 "data_offset": 2048, 00:22:32.608 "data_size": 63488 00:22:32.608 }, 00:22:32.608 { 00:22:32.608 "name": "BaseBdev4", 00:22:32.608 "uuid": "934b0e8c-c8bf-5645-949e-6be0f5d62b97", 00:22:32.608 "is_configured": true, 00:22:32.608 "data_offset": 2048, 00:22:32.608 "data_size": 63488 00:22:32.608 } 00:22:32.608 ] 00:22:32.608 }' 00:22:32.608 17:34:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:32.608 17:34:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:33.178 17:34:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:33.178 17:34:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:33.178 17:34:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:33.178 17:34:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:33.178 17:34:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:33.178 17:34:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:33.178 17:34:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:33.178 17:34:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:33.178 "name": "raid_bdev1", 00:22:33.178 
"uuid": "80f5e90f-547f-45ab-8c6e-7f8f5a0fd460", 00:22:33.178 "strip_size_kb": 0, 00:22:33.178 "state": "online", 00:22:33.178 "raid_level": "raid1", 00:22:33.178 "superblock": true, 00:22:33.178 "num_base_bdevs": 4, 00:22:33.178 "num_base_bdevs_discovered": 3, 00:22:33.178 "num_base_bdevs_operational": 3, 00:22:33.178 "base_bdevs_list": [ 00:22:33.178 { 00:22:33.178 "name": null, 00:22:33.178 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:33.178 "is_configured": false, 00:22:33.178 "data_offset": 2048, 00:22:33.178 "data_size": 63488 00:22:33.178 }, 00:22:33.178 { 00:22:33.178 "name": "BaseBdev2", 00:22:33.178 "uuid": "5b68c79a-7eb5-5003-9bfd-296d49081d24", 00:22:33.178 "is_configured": true, 00:22:33.178 "data_offset": 2048, 00:22:33.178 "data_size": 63488 00:22:33.178 }, 00:22:33.178 { 00:22:33.178 "name": "BaseBdev3", 00:22:33.178 "uuid": "18bf7a25-0888-5b45-bf1c-20508a8e3f67", 00:22:33.178 "is_configured": true, 00:22:33.178 "data_offset": 2048, 00:22:33.178 "data_size": 63488 00:22:33.178 }, 00:22:33.178 { 00:22:33.178 "name": "BaseBdev4", 00:22:33.178 "uuid": "934b0e8c-c8bf-5645-949e-6be0f5d62b97", 00:22:33.178 "is_configured": true, 00:22:33.178 "data_offset": 2048, 00:22:33.178 "data_size": 63488 00:22:33.178 } 00:22:33.178 ] 00:22:33.178 }' 00:22:33.178 17:34:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:33.438 17:34:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:33.438 17:34:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:33.438 17:34:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:33.438 17:34:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:33.438 [2024-07-15 17:34:44.717548] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:33.438 [2024-07-15 17:34:44.720414] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22a3890 00:22:33.438 [2024-07-15 17:34:44.721584] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:33.438 17:34:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:34.820 17:34:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:34.820 17:34:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:34.820 17:34:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:34.820 17:34:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:34.820 17:34:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:34.820 17:34:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:34.820 17:34:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:34.820 17:34:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:34.820 "name": "raid_bdev1", 00:22:34.820 "uuid": "80f5e90f-547f-45ab-8c6e-7f8f5a0fd460", 00:22:34.820 "strip_size_kb": 0, 00:22:34.820 "state": 
"online", 00:22:34.820 "raid_level": "raid1", 00:22:34.820 "superblock": true, 00:22:34.820 "num_base_bdevs": 4, 00:22:34.820 "num_base_bdevs_discovered": 4, 00:22:34.820 "num_base_bdevs_operational": 4, 00:22:34.820 "process": { 00:22:34.820 "type": "rebuild", 00:22:34.820 "target": "spare", 00:22:34.820 "progress": { 00:22:34.820 "blocks": 22528, 00:22:34.820 "percent": 35 00:22:34.820 } 00:22:34.820 }, 00:22:34.820 "base_bdevs_list": [ 00:22:34.820 { 00:22:34.820 "name": "spare", 00:22:34.820 "uuid": "56fc3d8e-eef3-5f1c-81f4-2693ff07b1a5", 00:22:34.820 "is_configured": true, 00:22:34.820 "data_offset": 2048, 00:22:34.820 "data_size": 63488 00:22:34.820 }, 00:22:34.820 { 00:22:34.820 "name": "BaseBdev2", 00:22:34.820 "uuid": "5b68c79a-7eb5-5003-9bfd-296d49081d24", 00:22:34.820 "is_configured": true, 00:22:34.820 "data_offset": 2048, 00:22:34.820 "data_size": 63488 00:22:34.820 }, 00:22:34.820 { 00:22:34.820 "name": "BaseBdev3", 00:22:34.820 "uuid": "18bf7a25-0888-5b45-bf1c-20508a8e3f67", 00:22:34.820 "is_configured": true, 00:22:34.820 "data_offset": 2048, 00:22:34.820 "data_size": 63488 00:22:34.820 }, 00:22:34.820 { 00:22:34.820 "name": "BaseBdev4", 00:22:34.820 "uuid": "934b0e8c-c8bf-5645-949e-6be0f5d62b97", 00:22:34.820 "is_configured": true, 00:22:34.820 "data_offset": 2048, 00:22:34.820 "data_size": 63488 00:22:34.820 } 00:22:34.820 ] 00:22:34.820 }' 00:22:34.820 17:34:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:34.820 17:34:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:34.820 17:34:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:34.820 17:34:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:34.820 17:34:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:22:34.820 17:34:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:22:34.820 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:22:34.820 17:34:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:22:34.820 17:34:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:22:34.820 17:34:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:22:34.820 17:34:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:35.390 [2024-07-15 17:34:46.536566] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:35.650 [2024-07-15 17:34:46.733071] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x22a3890 00:22:35.650 17:34:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:22:35.650 17:34:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:22:35.650 17:34:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:35.650 17:34:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:35.650 17:34:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:35.650 17:34:46 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:35.650 17:34:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:35.650 17:34:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.650 17:34:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:35.910 17:34:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:35.910 "name": "raid_bdev1", 00:22:35.910 "uuid": "80f5e90f-547f-45ab-8c6e-7f8f5a0fd460", 00:22:35.910 "strip_size_kb": 0, 00:22:35.910 "state": "online", 00:22:35.910 "raid_level": "raid1", 00:22:35.910 "superblock": true, 00:22:35.910 "num_base_bdevs": 4, 00:22:35.910 "num_base_bdevs_discovered": 3, 00:22:35.910 "num_base_bdevs_operational": 3, 00:22:35.910 "process": { 00:22:35.910 "type": "rebuild", 00:22:35.910 "target": "spare", 00:22:35.910 "progress": { 00:22:35.910 "blocks": 43008, 00:22:35.910 "percent": 67 00:22:35.910 } 00:22:35.910 }, 00:22:35.910 "base_bdevs_list": [ 00:22:35.910 { 00:22:35.910 "name": "spare", 00:22:35.910 "uuid": "56fc3d8e-eef3-5f1c-81f4-2693ff07b1a5", 00:22:35.910 "is_configured": true, 00:22:35.910 "data_offset": 2048, 00:22:35.910 "data_size": 63488 00:22:35.910 }, 00:22:35.910 { 00:22:35.910 "name": null, 00:22:35.910 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:35.910 "is_configured": false, 00:22:35.910 "data_offset": 2048, 00:22:35.910 "data_size": 63488 00:22:35.910 }, 00:22:35.910 { 00:22:35.910 "name": "BaseBdev3", 00:22:35.910 "uuid": "18bf7a25-0888-5b45-bf1c-20508a8e3f67", 00:22:35.910 "is_configured": true, 00:22:35.910 "data_offset": 2048, 00:22:35.910 "data_size": 63488 00:22:35.910 }, 00:22:35.910 { 00:22:35.910 "name": "BaseBdev4", 00:22:35.910 "uuid": "934b0e8c-c8bf-5645-949e-6be0f5d62b97", 00:22:35.910 "is_configured": true, 00:22:35.910 "data_offset": 2048, 00:22:35.910 "data_size": 63488 00:22:35.910 } 00:22:35.910 ] 00:22:35.910 }' 00:22:35.910 17:34:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:35.910 17:34:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:35.910 17:34:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:35.910 17:34:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:35.910 17:34:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=781 00:22:35.910 17:34:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:35.910 17:34:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:35.910 17:34:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:35.910 17:34:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:35.910 17:34:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:35.910 17:34:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:35.910 17:34:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:35.910 17:34:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:36.170 17:34:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:36.170 "name": "raid_bdev1", 00:22:36.170 "uuid": "80f5e90f-547f-45ab-8c6e-7f8f5a0fd460", 00:22:36.170 "strip_size_kb": 0, 00:22:36.170 "state": "online", 00:22:36.170 "raid_level": "raid1", 00:22:36.170 "superblock": true, 00:22:36.170 "num_base_bdevs": 4, 00:22:36.170 "num_base_bdevs_discovered": 3, 00:22:36.170 "num_base_bdevs_operational": 3, 00:22:36.170 "process": { 00:22:36.170 "type": "rebuild", 00:22:36.170 "target": "spare", 00:22:36.170 "progress": { 00:22:36.170 "blocks": 47104, 00:22:36.170 "percent": 74 00:22:36.170 } 00:22:36.170 }, 00:22:36.170 "base_bdevs_list": [ 00:22:36.170 { 00:22:36.170 "name": "spare", 00:22:36.170 "uuid": "56fc3d8e-eef3-5f1c-81f4-2693ff07b1a5", 00:22:36.170 "is_configured": true, 00:22:36.170 "data_offset": 2048, 00:22:36.170 "data_size": 63488 00:22:36.170 }, 00:22:36.170 { 00:22:36.170 "name": null, 00:22:36.170 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:36.170 "is_configured": false, 00:22:36.170 "data_offset": 2048, 00:22:36.170 "data_size": 63488 00:22:36.170 }, 00:22:36.170 { 00:22:36.170 "name": "BaseBdev3", 00:22:36.170 "uuid": "18bf7a25-0888-5b45-bf1c-20508a8e3f67", 00:22:36.170 "is_configured": true, 00:22:36.170 "data_offset": 2048, 00:22:36.170 "data_size": 63488 00:22:36.170 }, 00:22:36.170 { 00:22:36.170 "name": "BaseBdev4", 00:22:36.170 "uuid": "934b0e8c-c8bf-5645-949e-6be0f5d62b97", 00:22:36.170 "is_configured": true, 00:22:36.170 "data_offset": 2048, 00:22:36.170 "data_size": 63488 00:22:36.170 } 00:22:36.170 ] 00:22:36.170 }' 00:22:36.170 17:34:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:36.170 17:34:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:36.170 17:34:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:36.170 17:34:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:36.170 17:34:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:36.739 [2024-07-15 17:34:47.940108] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:36.739 [2024-07-15 17:34:47.940151] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:36.739 [2024-07-15 17:34:47.940227] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:37.309 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:37.309 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:37.309 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:37.309 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:37.309 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:37.309 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:37.309 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:37.309 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:37.309 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:37.309 "name": "raid_bdev1", 00:22:37.309 "uuid": "80f5e90f-547f-45ab-8c6e-7f8f5a0fd460", 00:22:37.309 "strip_size_kb": 0, 00:22:37.309 "state": "online", 00:22:37.309 "raid_level": "raid1", 00:22:37.309 "superblock": true, 00:22:37.309 "num_base_bdevs": 4, 00:22:37.309 "num_base_bdevs_discovered": 3, 00:22:37.309 "num_base_bdevs_operational": 3, 00:22:37.309 "base_bdevs_list": [ 00:22:37.309 { 00:22:37.309 "name": "spare", 00:22:37.309 "uuid": "56fc3d8e-eef3-5f1c-81f4-2693ff07b1a5", 00:22:37.309 "is_configured": true, 00:22:37.309 "data_offset": 2048, 00:22:37.309 "data_size": 63488 00:22:37.309 }, 00:22:37.309 { 00:22:37.309 "name": null, 00:22:37.309 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:37.309 "is_configured": false, 00:22:37.309 "data_offset": 2048, 00:22:37.309 "data_size": 63488 00:22:37.309 }, 00:22:37.309 { 00:22:37.309 "name": "BaseBdev3", 00:22:37.309 "uuid": "18bf7a25-0888-5b45-bf1c-20508a8e3f67", 00:22:37.309 "is_configured": true, 00:22:37.309 "data_offset": 2048, 00:22:37.309 "data_size": 63488 00:22:37.309 }, 00:22:37.309 { 00:22:37.309 "name": "BaseBdev4", 00:22:37.309 "uuid": "934b0e8c-c8bf-5645-949e-6be0f5d62b97", 00:22:37.309 "is_configured": true, 00:22:37.309 "data_offset": 2048, 00:22:37.309 "data_size": 63488 00:22:37.309 } 00:22:37.309 ] 00:22:37.309 }' 00:22:37.309 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:37.309 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:37.309 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:37.570 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:22:37.570 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:22:37.570 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:37.570 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:37.570 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:37.570 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:37.570 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:37.570 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:37.570 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:37.570 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:37.570 "name": "raid_bdev1", 00:22:37.570 "uuid": "80f5e90f-547f-45ab-8c6e-7f8f5a0fd460", 00:22:37.570 "strip_size_kb": 0, 00:22:37.570 "state": "online", 00:22:37.570 "raid_level": "raid1", 00:22:37.570 "superblock": true, 00:22:37.570 "num_base_bdevs": 4, 00:22:37.570 "num_base_bdevs_discovered": 3, 00:22:37.570 "num_base_bdevs_operational": 3, 00:22:37.570 "base_bdevs_list": [ 00:22:37.570 { 00:22:37.570 "name": "spare", 00:22:37.570 "uuid": 
"56fc3d8e-eef3-5f1c-81f4-2693ff07b1a5", 00:22:37.570 "is_configured": true, 00:22:37.570 "data_offset": 2048, 00:22:37.570 "data_size": 63488 00:22:37.570 }, 00:22:37.570 { 00:22:37.570 "name": null, 00:22:37.570 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:37.570 "is_configured": false, 00:22:37.570 "data_offset": 2048, 00:22:37.570 "data_size": 63488 00:22:37.570 }, 00:22:37.570 { 00:22:37.570 "name": "BaseBdev3", 00:22:37.570 "uuid": "18bf7a25-0888-5b45-bf1c-20508a8e3f67", 00:22:37.570 "is_configured": true, 00:22:37.570 "data_offset": 2048, 00:22:37.570 "data_size": 63488 00:22:37.570 }, 00:22:37.570 { 00:22:37.570 "name": "BaseBdev4", 00:22:37.570 "uuid": "934b0e8c-c8bf-5645-949e-6be0f5d62b97", 00:22:37.570 "is_configured": true, 00:22:37.570 "data_offset": 2048, 00:22:37.570 "data_size": 63488 00:22:37.570 } 00:22:37.570 ] 00:22:37.570 }' 00:22:37.570 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:37.570 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:37.570 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:37.831 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:37.831 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:37.831 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:37.831 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:37.831 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:37.831 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:37.831 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:37.831 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:37.831 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:37.831 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:37.831 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:37.831 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:37.831 17:34:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:37.831 17:34:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:37.831 "name": "raid_bdev1", 00:22:37.831 "uuid": "80f5e90f-547f-45ab-8c6e-7f8f5a0fd460", 00:22:37.831 "strip_size_kb": 0, 00:22:37.831 "state": "online", 00:22:37.831 "raid_level": "raid1", 00:22:37.831 "superblock": true, 00:22:37.831 "num_base_bdevs": 4, 00:22:37.831 "num_base_bdevs_discovered": 3, 00:22:37.831 "num_base_bdevs_operational": 3, 00:22:37.831 "base_bdevs_list": [ 00:22:37.831 { 00:22:37.831 "name": "spare", 00:22:37.831 "uuid": "56fc3d8e-eef3-5f1c-81f4-2693ff07b1a5", 00:22:37.831 "is_configured": true, 00:22:37.831 "data_offset": 2048, 00:22:37.831 "data_size": 63488 00:22:37.831 }, 00:22:37.831 { 00:22:37.831 "name": null, 00:22:37.831 "uuid": "00000000-0000-0000-0000-000000000000", 
00:22:37.831 "is_configured": false, 00:22:37.831 "data_offset": 2048, 00:22:37.831 "data_size": 63488 00:22:37.831 }, 00:22:37.831 { 00:22:37.831 "name": "BaseBdev3", 00:22:37.831 "uuid": "18bf7a25-0888-5b45-bf1c-20508a8e3f67", 00:22:37.831 "is_configured": true, 00:22:37.831 "data_offset": 2048, 00:22:37.831 "data_size": 63488 00:22:37.831 }, 00:22:37.831 { 00:22:37.831 "name": "BaseBdev4", 00:22:37.831 "uuid": "934b0e8c-c8bf-5645-949e-6be0f5d62b97", 00:22:37.831 "is_configured": true, 00:22:37.831 "data_offset": 2048, 00:22:37.831 "data_size": 63488 00:22:37.831 } 00:22:37.831 ] 00:22:37.831 }' 00:22:37.831 17:34:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:37.831 17:34:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:38.401 17:34:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:38.662 [2024-07-15 17:34:49.783251] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:38.662 [2024-07-15 17:34:49.783273] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:38.662 [2024-07-15 17:34:49.783315] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:38.662 [2024-07-15 17:34:49.783371] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:38.662 [2024-07-15 17:34:49.783378] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2103b30 name raid_bdev1, state offline 00:22:38.662 17:34:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:38.662 17:34:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:22:38.922 17:34:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:22:38.922 17:34:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:22:38.922 17:34:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:22:38.922 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:22:38.922 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:38.922 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:22:38.922 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:38.922 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:38.922 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:38.922 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:22:38.922 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:38.922 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:38.922 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:22:38.922 /dev/nbd0 00:22:38.922 17:34:50 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:38.922 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:38.922 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:38.922 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:22:38.922 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:38.922 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:38.922 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:38.922 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:22:38.922 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:38.922 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:38.922 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:39.182 1+0 records in 00:22:39.182 1+0 records out 00:22:39.182 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000235658 s, 17.4 MB/s 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:22:39.182 /dev/nbd1 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:39.182 1+0 records in 00:22:39.182 1+0 records out 00:22:39.182 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000283235 s, 14.5 MB/s 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:39.182 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:22:39.442 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:22:39.442 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:39.442 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:39.442 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:39.442 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:22:39.442 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:39.442 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:39.442 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:39.442 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:39.442 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:39.442 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:39.442 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:39.442 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:39.442 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:22:39.442 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:22:39.442 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:39.442 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:39.700 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:39.700 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:39.700 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:39.700 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- 
# (( i = 1 )) 00:22:39.700 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:39.700 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:39.700 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:22:39.700 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:22:39.700 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:22:39.700 17:34:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:39.959 17:34:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:40.218 [2024-07-15 17:34:51.291594] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:40.218 [2024-07-15 17:34:51.291629] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:40.218 [2024-07-15 17:34:51.291642] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2103830 00:22:40.218 [2024-07-15 17:34:51.291649] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:40.218 [2024-07-15 17:34:51.293052] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:40.218 [2024-07-15 17:34:51.293076] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:40.218 [2024-07-15 17:34:51.293138] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:40.218 [2024-07-15 17:34:51.293159] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:40.218 [2024-07-15 17:34:51.293241] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:40.218 [2024-07-15 17:34:51.293302] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:40.218 spare 00:22:40.218 17:34:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:40.219 17:34:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:40.219 17:34:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:40.219 17:34:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:40.219 17:34:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:40.219 17:34:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:40.219 17:34:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:40.219 17:34:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:40.219 17:34:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:40.219 17:34:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:40.219 17:34:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:40.219 17:34:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
jq -r '.[] | select(.name == "raid_bdev1")' 00:22:40.219 [2024-07-15 17:34:51.393597] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2101070 00:22:40.219 [2024-07-15 17:34:51.393606] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:40.219 [2024-07-15 17:34:51.393770] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22b0e00 00:22:40.219 [2024-07-15 17:34:51.393888] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2101070 00:22:40.219 [2024-07-15 17:34:51.393893] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2101070 00:22:40.219 [2024-07-15 17:34:51.393971] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:40.219 17:34:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:40.219 "name": "raid_bdev1", 00:22:40.219 "uuid": "80f5e90f-547f-45ab-8c6e-7f8f5a0fd460", 00:22:40.219 "strip_size_kb": 0, 00:22:40.219 "state": "online", 00:22:40.219 "raid_level": "raid1", 00:22:40.219 "superblock": true, 00:22:40.219 "num_base_bdevs": 4, 00:22:40.219 "num_base_bdevs_discovered": 3, 00:22:40.219 "num_base_bdevs_operational": 3, 00:22:40.219 "base_bdevs_list": [ 00:22:40.219 { 00:22:40.219 "name": "spare", 00:22:40.219 "uuid": "56fc3d8e-eef3-5f1c-81f4-2693ff07b1a5", 00:22:40.219 "is_configured": true, 00:22:40.219 "data_offset": 2048, 00:22:40.219 "data_size": 63488 00:22:40.219 }, 00:22:40.219 { 00:22:40.219 "name": null, 00:22:40.219 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:40.219 "is_configured": false, 00:22:40.219 "data_offset": 2048, 00:22:40.219 "data_size": 63488 00:22:40.219 }, 00:22:40.219 { 00:22:40.219 "name": "BaseBdev3", 00:22:40.219 "uuid": "18bf7a25-0888-5b45-bf1c-20508a8e3f67", 00:22:40.219 "is_configured": true, 00:22:40.219 "data_offset": 2048, 00:22:40.219 "data_size": 63488 00:22:40.219 }, 00:22:40.219 { 00:22:40.219 "name": "BaseBdev4", 00:22:40.219 "uuid": "934b0e8c-c8bf-5645-949e-6be0f5d62b97", 00:22:40.219 "is_configured": true, 00:22:40.219 "data_offset": 2048, 00:22:40.219 "data_size": 63488 00:22:40.219 } 00:22:40.219 ] 00:22:40.219 }' 00:22:40.219 17:34:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:40.219 17:34:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:40.786 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:40.786 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:40.786 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:40.786 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:40.786 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:40.786 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:40.786 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:41.044 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:41.044 "name": "raid_bdev1", 00:22:41.044 "uuid": "80f5e90f-547f-45ab-8c6e-7f8f5a0fd460", 00:22:41.044 "strip_size_kb": 0, 00:22:41.044 "state": 
"online", 00:22:41.044 "raid_level": "raid1", 00:22:41.044 "superblock": true, 00:22:41.044 "num_base_bdevs": 4, 00:22:41.044 "num_base_bdevs_discovered": 3, 00:22:41.044 "num_base_bdevs_operational": 3, 00:22:41.044 "base_bdevs_list": [ 00:22:41.044 { 00:22:41.044 "name": "spare", 00:22:41.044 "uuid": "56fc3d8e-eef3-5f1c-81f4-2693ff07b1a5", 00:22:41.044 "is_configured": true, 00:22:41.044 "data_offset": 2048, 00:22:41.045 "data_size": 63488 00:22:41.045 }, 00:22:41.045 { 00:22:41.045 "name": null, 00:22:41.045 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:41.045 "is_configured": false, 00:22:41.045 "data_offset": 2048, 00:22:41.045 "data_size": 63488 00:22:41.045 }, 00:22:41.045 { 00:22:41.045 "name": "BaseBdev3", 00:22:41.045 "uuid": "18bf7a25-0888-5b45-bf1c-20508a8e3f67", 00:22:41.045 "is_configured": true, 00:22:41.045 "data_offset": 2048, 00:22:41.045 "data_size": 63488 00:22:41.045 }, 00:22:41.045 { 00:22:41.045 "name": "BaseBdev4", 00:22:41.045 "uuid": "934b0e8c-c8bf-5645-949e-6be0f5d62b97", 00:22:41.045 "is_configured": true, 00:22:41.045 "data_offset": 2048, 00:22:41.045 "data_size": 63488 00:22:41.045 } 00:22:41.045 ] 00:22:41.045 }' 00:22:41.045 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:41.045 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:41.045 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:41.045 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:41.304 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:22:41.304 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:41.304 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:22:41.304 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:41.564 [2024-07-15 17:34:52.683209] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:41.564 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:41.564 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:41.564 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:41.564 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:41.564 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:41.564 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:41.564 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:41.564 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:41.564 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:41.564 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:41.564 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:41.564 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:41.824 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:41.824 "name": "raid_bdev1", 00:22:41.824 "uuid": "80f5e90f-547f-45ab-8c6e-7f8f5a0fd460", 00:22:41.824 "strip_size_kb": 0, 00:22:41.824 "state": "online", 00:22:41.824 "raid_level": "raid1", 00:22:41.824 "superblock": true, 00:22:41.824 "num_base_bdevs": 4, 00:22:41.824 "num_base_bdevs_discovered": 2, 00:22:41.824 "num_base_bdevs_operational": 2, 00:22:41.824 "base_bdevs_list": [ 00:22:41.824 { 00:22:41.824 "name": null, 00:22:41.824 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:41.824 "is_configured": false, 00:22:41.824 "data_offset": 2048, 00:22:41.824 "data_size": 63488 00:22:41.824 }, 00:22:41.824 { 00:22:41.824 "name": null, 00:22:41.824 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:41.824 "is_configured": false, 00:22:41.824 "data_offset": 2048, 00:22:41.824 "data_size": 63488 00:22:41.824 }, 00:22:41.824 { 00:22:41.824 "name": "BaseBdev3", 00:22:41.824 "uuid": "18bf7a25-0888-5b45-bf1c-20508a8e3f67", 00:22:41.824 "is_configured": true, 00:22:41.824 "data_offset": 2048, 00:22:41.824 "data_size": 63488 00:22:41.824 }, 00:22:41.824 { 00:22:41.824 "name": "BaseBdev4", 00:22:41.824 "uuid": "934b0e8c-c8bf-5645-949e-6be0f5d62b97", 00:22:41.824 "is_configured": true, 00:22:41.824 "data_offset": 2048, 00:22:41.824 "data_size": 63488 00:22:41.824 } 00:22:41.824 ] 00:22:41.824 }' 00:22:41.824 17:34:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:41.824 17:34:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:42.395 17:34:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:42.395 [2024-07-15 17:34:53.545386] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:42.395 [2024-07-15 17:34:53.545494] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:22:42.395 [2024-07-15 17:34:53.545504] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
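A minimal sketch of the verification helper being traced throughout this log, reconstructed from the xtrace output above (the function and variable names follow the trace markers at bdev_raid.sh@182-190; the rpc.py path and the /var/tmp/spdk-raid.sock socket are the ones used in this run, and the exact body in SPDK's test/bdev/bdev_raid.sh may differ):

# Reconstructed from the xtrace in this log, not copied from the SPDK repository.
verify_raid_bdev_process() {
        local raid_bdev_name=$1
        local process_type=$2
        local target=$3
        local raid_bdev_info

        # Query all RAID bdevs over the test RPC socket and keep only the one under test.
        raid_bdev_info=$(/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py \
                -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all |
                jq -r ".[] | select(.name == \"$raid_bdev_name\")")

        # A bdev with no background process has no .process object, hence the // "none" fallback.
        [[ $(jq -r '.process.type // "none"' <<<"$raid_bdev_info") == "$process_type" ]]
        [[ $(jq -r '.process.target // "none"' <<<"$raid_bdev_info") == "$target" ]]
}

The "// \"none\"" fallback is what lets the same helper assert both an in-progress rebuild (type "rebuild", target "spare") and a finished one ("none"/"none"), which matches the [[ rebuild == ... ]] and [[ none == ... ]] checks that appear before and after the rebuild completes in this log.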
00:22:42.395 [2024-07-15 17:34:53.545523] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:42.395 [2024-07-15 17:34:53.548260] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2103ac0 00:22:42.395 [2024-07-15 17:34:53.549891] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:42.395 17:34:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:22:43.336 17:34:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:43.336 17:34:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:43.336 17:34:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:43.336 17:34:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:43.336 17:34:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:43.336 17:34:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:43.336 17:34:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:43.596 17:34:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:43.596 "name": "raid_bdev1", 00:22:43.596 "uuid": "80f5e90f-547f-45ab-8c6e-7f8f5a0fd460", 00:22:43.596 "strip_size_kb": 0, 00:22:43.596 "state": "online", 00:22:43.596 "raid_level": "raid1", 00:22:43.596 "superblock": true, 00:22:43.596 "num_base_bdevs": 4, 00:22:43.596 "num_base_bdevs_discovered": 3, 00:22:43.596 "num_base_bdevs_operational": 3, 00:22:43.596 "process": { 00:22:43.596 "type": "rebuild", 00:22:43.596 "target": "spare", 00:22:43.596 "progress": { 00:22:43.596 "blocks": 22528, 00:22:43.596 "percent": 35 00:22:43.596 } 00:22:43.596 }, 00:22:43.596 "base_bdevs_list": [ 00:22:43.596 { 00:22:43.596 "name": "spare", 00:22:43.596 "uuid": "56fc3d8e-eef3-5f1c-81f4-2693ff07b1a5", 00:22:43.596 "is_configured": true, 00:22:43.596 "data_offset": 2048, 00:22:43.596 "data_size": 63488 00:22:43.596 }, 00:22:43.596 { 00:22:43.596 "name": null, 00:22:43.596 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:43.596 "is_configured": false, 00:22:43.596 "data_offset": 2048, 00:22:43.596 "data_size": 63488 00:22:43.596 }, 00:22:43.596 { 00:22:43.596 "name": "BaseBdev3", 00:22:43.596 "uuid": "18bf7a25-0888-5b45-bf1c-20508a8e3f67", 00:22:43.596 "is_configured": true, 00:22:43.596 "data_offset": 2048, 00:22:43.596 "data_size": 63488 00:22:43.596 }, 00:22:43.596 { 00:22:43.596 "name": "BaseBdev4", 00:22:43.596 "uuid": "934b0e8c-c8bf-5645-949e-6be0f5d62b97", 00:22:43.596 "is_configured": true, 00:22:43.596 "data_offset": 2048, 00:22:43.596 "data_size": 63488 00:22:43.596 } 00:22:43.596 ] 00:22:43.596 }' 00:22:43.596 17:34:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:43.596 17:34:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:43.596 17:34:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:43.596 17:34:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:43.596 17:34:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:43.857 [2024-07-15 17:34:54.998339] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:43.857 [2024-07-15 17:34:55.058761] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:43.857 [2024-07-15 17:34:55.058790] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:43.857 [2024-07-15 17:34:55.058800] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:43.857 [2024-07-15 17:34:55.058804] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:43.857 17:34:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:43.857 17:34:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:43.857 17:34:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:43.857 17:34:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:43.857 17:34:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:43.857 17:34:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:43.857 17:34:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:43.857 17:34:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:43.857 17:34:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:43.857 17:34:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:43.857 17:34:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:43.857 17:34:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:44.117 17:34:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:44.117 "name": "raid_bdev1", 00:22:44.117 "uuid": "80f5e90f-547f-45ab-8c6e-7f8f5a0fd460", 00:22:44.117 "strip_size_kb": 0, 00:22:44.117 "state": "online", 00:22:44.117 "raid_level": "raid1", 00:22:44.117 "superblock": true, 00:22:44.117 "num_base_bdevs": 4, 00:22:44.117 "num_base_bdevs_discovered": 2, 00:22:44.117 "num_base_bdevs_operational": 2, 00:22:44.117 "base_bdevs_list": [ 00:22:44.117 { 00:22:44.117 "name": null, 00:22:44.117 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:44.117 "is_configured": false, 00:22:44.117 "data_offset": 2048, 00:22:44.117 "data_size": 63488 00:22:44.117 }, 00:22:44.117 { 00:22:44.117 "name": null, 00:22:44.117 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:44.117 "is_configured": false, 00:22:44.117 "data_offset": 2048, 00:22:44.117 "data_size": 63488 00:22:44.117 }, 00:22:44.117 { 00:22:44.117 "name": "BaseBdev3", 00:22:44.117 "uuid": "18bf7a25-0888-5b45-bf1c-20508a8e3f67", 00:22:44.117 "is_configured": true, 00:22:44.117 "data_offset": 2048, 00:22:44.117 "data_size": 63488 00:22:44.117 }, 00:22:44.117 { 00:22:44.117 "name": "BaseBdev4", 00:22:44.117 "uuid": "934b0e8c-c8bf-5645-949e-6be0f5d62b97", 00:22:44.117 "is_configured": true, 00:22:44.117 "data_offset": 2048, 00:22:44.117 "data_size": 63488 
00:22:44.117 } 00:22:44.117 ] 00:22:44.117 }' 00:22:44.117 17:34:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:44.117 17:34:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:44.688 17:34:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:44.949 [2024-07-15 17:34:56.036997] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:44.949 [2024-07-15 17:34:56.037033] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:44.949 [2024-07-15 17:34:56.037046] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22b0c80 00:22:44.949 [2024-07-15 17:34:56.037053] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:44.949 [2024-07-15 17:34:56.037364] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:44.949 [2024-07-15 17:34:56.037376] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:44.949 [2024-07-15 17:34:56.037438] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:44.949 [2024-07-15 17:34:56.037445] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:22:44.949 [2024-07-15 17:34:56.037451] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:22:44.949 [2024-07-15 17:34:56.037463] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:44.949 [2024-07-15 17:34:56.040166] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x210ab50 00:22:44.949 [2024-07-15 17:34:56.041327] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:44.949 spare 00:22:44.949 17:34:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:22:45.892 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:45.892 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:45.892 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:45.892 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:45.892 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:45.892 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:45.892 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:46.153 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:46.153 "name": "raid_bdev1", 00:22:46.153 "uuid": "80f5e90f-547f-45ab-8c6e-7f8f5a0fd460", 00:22:46.153 "strip_size_kb": 0, 00:22:46.153 "state": "online", 00:22:46.153 "raid_level": "raid1", 00:22:46.153 "superblock": true, 00:22:46.153 "num_base_bdevs": 4, 00:22:46.153 "num_base_bdevs_discovered": 3, 00:22:46.153 "num_base_bdevs_operational": 3, 00:22:46.153 "process": { 00:22:46.153 "type": "rebuild", 00:22:46.153 "target": 
"spare", 00:22:46.153 "progress": { 00:22:46.153 "blocks": 22528, 00:22:46.153 "percent": 35 00:22:46.153 } 00:22:46.153 }, 00:22:46.153 "base_bdevs_list": [ 00:22:46.153 { 00:22:46.153 "name": "spare", 00:22:46.153 "uuid": "56fc3d8e-eef3-5f1c-81f4-2693ff07b1a5", 00:22:46.153 "is_configured": true, 00:22:46.153 "data_offset": 2048, 00:22:46.153 "data_size": 63488 00:22:46.153 }, 00:22:46.153 { 00:22:46.153 "name": null, 00:22:46.153 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:46.153 "is_configured": false, 00:22:46.153 "data_offset": 2048, 00:22:46.153 "data_size": 63488 00:22:46.153 }, 00:22:46.153 { 00:22:46.153 "name": "BaseBdev3", 00:22:46.153 "uuid": "18bf7a25-0888-5b45-bf1c-20508a8e3f67", 00:22:46.153 "is_configured": true, 00:22:46.153 "data_offset": 2048, 00:22:46.153 "data_size": 63488 00:22:46.153 }, 00:22:46.153 { 00:22:46.153 "name": "BaseBdev4", 00:22:46.153 "uuid": "934b0e8c-c8bf-5645-949e-6be0f5d62b97", 00:22:46.153 "is_configured": true, 00:22:46.153 "data_offset": 2048, 00:22:46.153 "data_size": 63488 00:22:46.153 } 00:22:46.153 ] 00:22:46.153 }' 00:22:46.153 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:46.153 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:46.153 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:46.153 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:46.153 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:46.417 [2024-07-15 17:34:57.530149] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:46.417 [2024-07-15 17:34:57.550152] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:46.417 [2024-07-15 17:34:57.550181] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:46.417 [2024-07-15 17:34:57.550191] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:46.417 [2024-07-15 17:34:57.550195] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:46.417 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:46.417 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:46.417 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:46.417 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:46.417 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:46.417 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:46.417 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:46.417 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:46.417 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:46.417 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:46.417 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:46.417 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:46.715 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:46.715 "name": "raid_bdev1", 00:22:46.715 "uuid": "80f5e90f-547f-45ab-8c6e-7f8f5a0fd460", 00:22:46.715 "strip_size_kb": 0, 00:22:46.715 "state": "online", 00:22:46.715 "raid_level": "raid1", 00:22:46.715 "superblock": true, 00:22:46.715 "num_base_bdevs": 4, 00:22:46.715 "num_base_bdevs_discovered": 2, 00:22:46.715 "num_base_bdevs_operational": 2, 00:22:46.715 "base_bdevs_list": [ 00:22:46.715 { 00:22:46.715 "name": null, 00:22:46.715 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:46.715 "is_configured": false, 00:22:46.715 "data_offset": 2048, 00:22:46.715 "data_size": 63488 00:22:46.715 }, 00:22:46.715 { 00:22:46.715 "name": null, 00:22:46.715 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:46.715 "is_configured": false, 00:22:46.715 "data_offset": 2048, 00:22:46.715 "data_size": 63488 00:22:46.715 }, 00:22:46.715 { 00:22:46.715 "name": "BaseBdev3", 00:22:46.715 "uuid": "18bf7a25-0888-5b45-bf1c-20508a8e3f67", 00:22:46.715 "is_configured": true, 00:22:46.715 "data_offset": 2048, 00:22:46.715 "data_size": 63488 00:22:46.715 }, 00:22:46.715 { 00:22:46.715 "name": "BaseBdev4", 00:22:46.715 "uuid": "934b0e8c-c8bf-5645-949e-6be0f5d62b97", 00:22:46.715 "is_configured": true, 00:22:46.715 "data_offset": 2048, 00:22:46.715 "data_size": 63488 00:22:46.715 } 00:22:46.715 ] 00:22:46.715 }' 00:22:46.715 17:34:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:46.715 17:34:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:47.287 17:34:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:47.287 17:34:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:47.287 17:34:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:47.287 17:34:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:47.287 17:34:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:47.287 17:34:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:47.287 17:34:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:47.287 17:34:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:47.287 "name": "raid_bdev1", 00:22:47.287 "uuid": "80f5e90f-547f-45ab-8c6e-7f8f5a0fd460", 00:22:47.287 "strip_size_kb": 0, 00:22:47.287 "state": "online", 00:22:47.287 "raid_level": "raid1", 00:22:47.287 "superblock": true, 00:22:47.287 "num_base_bdevs": 4, 00:22:47.287 "num_base_bdevs_discovered": 2, 00:22:47.287 "num_base_bdevs_operational": 2, 00:22:47.287 "base_bdevs_list": [ 00:22:47.287 { 00:22:47.287 "name": null, 00:22:47.287 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:47.287 "is_configured": false, 00:22:47.287 "data_offset": 2048, 00:22:47.287 "data_size": 63488 00:22:47.287 }, 00:22:47.287 { 00:22:47.287 "name": null, 00:22:47.287 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:22:47.287 "is_configured": false, 00:22:47.287 "data_offset": 2048, 00:22:47.287 "data_size": 63488 00:22:47.287 }, 00:22:47.287 { 00:22:47.287 "name": "BaseBdev3", 00:22:47.287 "uuid": "18bf7a25-0888-5b45-bf1c-20508a8e3f67", 00:22:47.287 "is_configured": true, 00:22:47.287 "data_offset": 2048, 00:22:47.287 "data_size": 63488 00:22:47.287 }, 00:22:47.287 { 00:22:47.287 "name": "BaseBdev4", 00:22:47.287 "uuid": "934b0e8c-c8bf-5645-949e-6be0f5d62b97", 00:22:47.287 "is_configured": true, 00:22:47.287 "data_offset": 2048, 00:22:47.287 "data_size": 63488 00:22:47.287 } 00:22:47.287 ] 00:22:47.287 }' 00:22:47.287 17:34:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:47.287 17:34:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:47.287 17:34:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:47.547 17:34:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:47.547 17:34:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:22:47.547 17:34:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:47.806 [2024-07-15 17:34:58.981801] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:47.806 [2024-07-15 17:34:58.981836] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:47.806 [2024-07-15 17:34:58.981848] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2109f60 00:22:47.806 [2024-07-15 17:34:58.981854] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:47.806 [2024-07-15 17:34:58.982138] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:47.806 [2024-07-15 17:34:58.982150] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:47.806 [2024-07-15 17:34:58.982193] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:22:47.806 [2024-07-15 17:34:58.982202] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:22:47.806 [2024-07-15 17:34:58.982207] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:47.806 BaseBdev1 00:22:47.806 17:34:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:22:48.745 17:35:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:48.745 17:35:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:48.745 17:35:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:48.745 17:35:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:48.745 17:35:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:48.745 17:35:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:48.745 17:35:00 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:48.745 17:35:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:48.745 17:35:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:48.745 17:35:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:48.745 17:35:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:48.746 17:35:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:49.005 17:35:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:49.005 "name": "raid_bdev1", 00:22:49.005 "uuid": "80f5e90f-547f-45ab-8c6e-7f8f5a0fd460", 00:22:49.005 "strip_size_kb": 0, 00:22:49.005 "state": "online", 00:22:49.005 "raid_level": "raid1", 00:22:49.005 "superblock": true, 00:22:49.005 "num_base_bdevs": 4, 00:22:49.005 "num_base_bdevs_discovered": 2, 00:22:49.005 "num_base_bdevs_operational": 2, 00:22:49.005 "base_bdevs_list": [ 00:22:49.005 { 00:22:49.005 "name": null, 00:22:49.005 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:49.005 "is_configured": false, 00:22:49.005 "data_offset": 2048, 00:22:49.005 "data_size": 63488 00:22:49.005 }, 00:22:49.005 { 00:22:49.005 "name": null, 00:22:49.005 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:49.005 "is_configured": false, 00:22:49.005 "data_offset": 2048, 00:22:49.005 "data_size": 63488 00:22:49.005 }, 00:22:49.005 { 00:22:49.005 "name": "BaseBdev3", 00:22:49.005 "uuid": "18bf7a25-0888-5b45-bf1c-20508a8e3f67", 00:22:49.005 "is_configured": true, 00:22:49.005 "data_offset": 2048, 00:22:49.005 "data_size": 63488 00:22:49.005 }, 00:22:49.005 { 00:22:49.005 "name": "BaseBdev4", 00:22:49.005 "uuid": "934b0e8c-c8bf-5645-949e-6be0f5d62b97", 00:22:49.005 "is_configured": true, 00:22:49.005 "data_offset": 2048, 00:22:49.005 "data_size": 63488 00:22:49.005 } 00:22:49.005 ] 00:22:49.005 }' 00:22:49.005 17:35:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:49.005 17:35:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:49.573 17:35:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:49.573 17:35:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:49.573 17:35:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:49.573 17:35:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:49.573 17:35:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:49.573 17:35:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.573 17:35:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:49.833 17:35:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:49.833 "name": "raid_bdev1", 00:22:49.833 "uuid": "80f5e90f-547f-45ab-8c6e-7f8f5a0fd460", 00:22:49.833 "strip_size_kb": 0, 00:22:49.833 "state": "online", 00:22:49.833 "raid_level": "raid1", 00:22:49.833 "superblock": true, 
00:22:49.833 "num_base_bdevs": 4, 00:22:49.833 "num_base_bdevs_discovered": 2, 00:22:49.833 "num_base_bdevs_operational": 2, 00:22:49.833 "base_bdevs_list": [ 00:22:49.833 { 00:22:49.833 "name": null, 00:22:49.833 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:49.833 "is_configured": false, 00:22:49.833 "data_offset": 2048, 00:22:49.833 "data_size": 63488 00:22:49.833 }, 00:22:49.833 { 00:22:49.833 "name": null, 00:22:49.833 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:49.833 "is_configured": false, 00:22:49.833 "data_offset": 2048, 00:22:49.833 "data_size": 63488 00:22:49.833 }, 00:22:49.833 { 00:22:49.833 "name": "BaseBdev3", 00:22:49.833 "uuid": "18bf7a25-0888-5b45-bf1c-20508a8e3f67", 00:22:49.833 "is_configured": true, 00:22:49.833 "data_offset": 2048, 00:22:49.833 "data_size": 63488 00:22:49.833 }, 00:22:49.833 { 00:22:49.833 "name": "BaseBdev4", 00:22:49.833 "uuid": "934b0e8c-c8bf-5645-949e-6be0f5d62b97", 00:22:49.833 "is_configured": true, 00:22:49.833 "data_offset": 2048, 00:22:49.833 "data_size": 63488 00:22:49.833 } 00:22:49.833 ] 00:22:49.833 }' 00:22:49.833 17:35:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:49.833 17:35:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:49.833 17:35:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:49.833 17:35:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:49.833 17:35:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:49.833 17:35:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:22:49.833 17:35:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:49.834 17:35:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:49.834 17:35:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:49.834 17:35:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:49.834 17:35:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:49.834 17:35:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:49.834 17:35:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:49.834 17:35:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:49.834 17:35:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:49.834 17:35:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:50.094 [2024-07-15 17:35:01.199429] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:50.094 [2024-07-15 17:35:01.199533] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:22:50.094 [2024-07-15 17:35:01.199543] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:50.094 request: 00:22:50.094 { 00:22:50.094 "base_bdev": "BaseBdev1", 00:22:50.094 "raid_bdev": "raid_bdev1", 00:22:50.094 "method": "bdev_raid_add_base_bdev", 00:22:50.094 "req_id": 1 00:22:50.094 } 00:22:50.094 Got JSON-RPC error response 00:22:50.094 response: 00:22:50.094 { 00:22:50.094 "code": -22, 00:22:50.094 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:22:50.094 } 00:22:50.094 17:35:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:22:50.094 17:35:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:22:50.094 17:35:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:22:50.094 17:35:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:22:50.094 17:35:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:22:51.033 17:35:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:51.033 17:35:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:51.033 17:35:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:51.033 17:35:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:51.033 17:35:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:51.033 17:35:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:51.033 17:35:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:51.033 17:35:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:51.033 17:35:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:51.033 17:35:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:51.033 17:35:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:51.033 17:35:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:51.293 17:35:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:51.293 "name": "raid_bdev1", 00:22:51.293 "uuid": "80f5e90f-547f-45ab-8c6e-7f8f5a0fd460", 00:22:51.293 "strip_size_kb": 0, 00:22:51.293 "state": "online", 00:22:51.293 "raid_level": "raid1", 00:22:51.293 "superblock": true, 00:22:51.293 "num_base_bdevs": 4, 00:22:51.293 "num_base_bdevs_discovered": 2, 00:22:51.293 "num_base_bdevs_operational": 2, 00:22:51.293 "base_bdevs_list": [ 00:22:51.293 { 00:22:51.293 "name": null, 00:22:51.293 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:51.293 "is_configured": false, 00:22:51.293 "data_offset": 2048, 00:22:51.293 "data_size": 63488 00:22:51.293 }, 00:22:51.293 { 00:22:51.293 "name": null, 00:22:51.293 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:22:51.293 "is_configured": false, 00:22:51.293 "data_offset": 2048, 00:22:51.293 "data_size": 63488 00:22:51.293 }, 00:22:51.293 { 00:22:51.293 "name": "BaseBdev3", 00:22:51.293 "uuid": "18bf7a25-0888-5b45-bf1c-20508a8e3f67", 00:22:51.293 "is_configured": true, 00:22:51.293 "data_offset": 2048, 00:22:51.293 "data_size": 63488 00:22:51.293 }, 00:22:51.293 { 00:22:51.293 "name": "BaseBdev4", 00:22:51.293 "uuid": "934b0e8c-c8bf-5645-949e-6be0f5d62b97", 00:22:51.293 "is_configured": true, 00:22:51.293 "data_offset": 2048, 00:22:51.293 "data_size": 63488 00:22:51.293 } 00:22:51.293 ] 00:22:51.293 }' 00:22:51.293 17:35:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:51.293 17:35:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:51.863 17:35:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:51.863 17:35:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:51.863 17:35:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:51.863 17:35:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:51.863 17:35:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:51.863 17:35:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:51.863 17:35:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:51.863 17:35:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:51.863 "name": "raid_bdev1", 00:22:51.863 "uuid": "80f5e90f-547f-45ab-8c6e-7f8f5a0fd460", 00:22:51.863 "strip_size_kb": 0, 00:22:51.863 "state": "online", 00:22:51.863 "raid_level": "raid1", 00:22:51.863 "superblock": true, 00:22:51.863 "num_base_bdevs": 4, 00:22:51.863 "num_base_bdevs_discovered": 2, 00:22:51.863 "num_base_bdevs_operational": 2, 00:22:51.863 "base_bdevs_list": [ 00:22:51.863 { 00:22:51.863 "name": null, 00:22:51.863 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:51.863 "is_configured": false, 00:22:51.863 "data_offset": 2048, 00:22:51.863 "data_size": 63488 00:22:51.863 }, 00:22:51.863 { 00:22:51.863 "name": null, 00:22:51.863 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:51.863 "is_configured": false, 00:22:51.863 "data_offset": 2048, 00:22:51.863 "data_size": 63488 00:22:51.863 }, 00:22:51.863 { 00:22:51.863 "name": "BaseBdev3", 00:22:51.863 "uuid": "18bf7a25-0888-5b45-bf1c-20508a8e3f67", 00:22:51.863 "is_configured": true, 00:22:51.863 "data_offset": 2048, 00:22:51.863 "data_size": 63488 00:22:51.863 }, 00:22:51.863 { 00:22:51.863 "name": "BaseBdev4", 00:22:51.863 "uuid": "934b0e8c-c8bf-5645-949e-6be0f5d62b97", 00:22:51.863 "is_configured": true, 00:22:51.863 "data_offset": 2048, 00:22:51.863 "data_size": 63488 00:22:51.863 } 00:22:51.863 ] 00:22:51.863 }' 00:22:51.863 17:35:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:52.126 17:35:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:52.126 17:35:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:52.126 17:35:03 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:52.126 17:35:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 2877713 00:22:52.126 17:35:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2877713 ']' 00:22:52.126 17:35:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 2877713 00:22:52.126 17:35:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:22:52.126 17:35:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:52.126 17:35:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2877713 00:22:52.126 17:35:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:52.126 17:35:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:52.126 17:35:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2877713' 00:22:52.126 killing process with pid 2877713 00:22:52.126 17:35:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 2877713 00:22:52.126 Received shutdown signal, test time was about 60.000000 seconds 00:22:52.126 00:22:52.126 Latency(us) 00:22:52.126 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:52.126 =================================================================================================================== 00:22:52.126 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:22:52.126 [2024-07-15 17:35:03.282278] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:52.126 [2024-07-15 17:35:03.282351] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:52.126 [2024-07-15 17:35:03.282398] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:52.127 [2024-07-15 17:35:03.282405] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2101070 name raid_bdev1, state offline 00:22:52.127 17:35:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 2877713 00:22:52.127 [2024-07-15 17:35:03.308968] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:22:52.387 00:22:52.387 real 0m36.061s 00:22:52.387 user 0m51.913s 00:22:52.387 sys 0m5.389s 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:52.387 ************************************ 00:22:52.387 END TEST raid_rebuild_test_sb 00:22:52.387 ************************************ 00:22:52.387 17:35:03 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:52.387 17:35:03 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:22:52.387 17:35:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:22:52.387 17:35:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:52.387 17:35:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:52.387 ************************************ 00:22:52.387 START TEST raid_rebuild_test_io 00:22:52.387 ************************************ 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false true true 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2884211 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2884211 /var/tmp/spdk-raid.sock 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 2884211 ']' 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:52.387 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:52.387 17:35:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:52.387 [2024-07-15 17:35:03.585332] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:22:52.387 [2024-07-15 17:35:03.585390] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2884211 ] 00:22:52.387 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:52.387 Zero copy mechanism will not be used. 00:22:52.387 [2024-07-15 17:35:03.674988] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:52.647 [2024-07-15 17:35:03.751634] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:52.648 [2024-07-15 17:35:03.793754] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:52.648 [2024-07-15 17:35:03.793778] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:53.216 17:35:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:53.216 17:35:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:22:53.216 17:35:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:53.216 17:35:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:53.476 BaseBdev1_malloc 00:22:53.476 17:35:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:53.736 [2024-07-15 17:35:04.788448] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:53.736 [2024-07-15 17:35:04.788482] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:53.736 [2024-07-15 17:35:04.788495] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2286d30 00:22:53.736 [2024-07-15 17:35:04.788501] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:53.736 [2024-07-15 17:35:04.789901] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:53.736 [2024-07-15 17:35:04.789921] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:53.736 BaseBdev1 00:22:53.736 17:35:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in 
"${base_bdevs[@]}" 00:22:53.736 17:35:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:53.736 BaseBdev2_malloc 00:22:53.736 17:35:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:53.996 [2024-07-15 17:35:05.159411] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:53.996 [2024-07-15 17:35:05.159439] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:53.996 [2024-07-15 17:35:05.159450] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2439c60 00:22:53.996 [2024-07-15 17:35:05.159456] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:53.996 [2024-07-15 17:35:05.160672] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:53.996 [2024-07-15 17:35:05.160691] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:53.996 BaseBdev2 00:22:53.996 17:35:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:53.996 17:35:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:54.256 BaseBdev3_malloc 00:22:54.256 17:35:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:22:54.256 [2024-07-15 17:35:05.546314] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:22:54.256 [2024-07-15 17:35:05.546343] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:54.256 [2024-07-15 17:35:05.546354] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x241eb90 00:22:54.256 [2024-07-15 17:35:05.546360] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:54.256 [2024-07-15 17:35:05.547555] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:54.256 [2024-07-15 17:35:05.547574] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:54.256 BaseBdev3 00:22:54.515 17:35:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:54.515 17:35:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:54.515 BaseBdev4_malloc 00:22:54.515 17:35:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:22:54.775 [2024-07-15 17:35:05.933195] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:22:54.775 [2024-07-15 17:35:05.933223] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:54.775 [2024-07-15 17:35:05.933234] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 
0x0x22878c0 00:22:54.775 [2024-07-15 17:35:05.933240] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:54.775 [2024-07-15 17:35:05.934422] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:54.775 [2024-07-15 17:35:05.934440] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:54.775 BaseBdev4 00:22:54.775 17:35:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:22:55.036 spare_malloc 00:22:55.036 17:35:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:55.036 spare_delay 00:22:55.036 17:35:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:55.297 [2024-07-15 17:35:06.496489] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:55.297 [2024-07-15 17:35:06.496518] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:55.297 [2024-07-15 17:35:06.496529] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x227fa80 00:22:55.297 [2024-07-15 17:35:06.496540] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:55.297 [2024-07-15 17:35:06.497742] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:55.297 [2024-07-15 17:35:06.497761] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:55.297 spare 00:22:55.297 17:35:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:22:55.558 [2024-07-15 17:35:06.684984] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:55.558 [2024-07-15 17:35:06.685990] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:55.558 [2024-07-15 17:35:06.686033] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:55.558 [2024-07-15 17:35:06.686065] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:55.558 [2024-07-15 17:35:06.686124] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2280b30 00:22:55.558 [2024-07-15 17:35:06.686130] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:22:55.558 [2024-07-15 17:35:06.686287] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2284bf0 00:22:55.558 [2024-07-15 17:35:06.686401] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2280b30 00:22:55.558 [2024-07-15 17:35:06.686406] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2280b30 00:22:55.558 [2024-07-15 17:35:06.686486] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:55.558 17:35:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:55.558 17:35:06 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:55.558 17:35:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:55.558 17:35:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:55.558 17:35:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:55.558 17:35:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:55.558 17:35:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:55.558 17:35:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:55.558 17:35:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:55.558 17:35:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:55.558 17:35:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:55.558 17:35:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:55.820 17:35:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:55.820 "name": "raid_bdev1", 00:22:55.820 "uuid": "2ddcdb51-c213-4d8f-be80-6474d0b7c279", 00:22:55.820 "strip_size_kb": 0, 00:22:55.820 "state": "online", 00:22:55.820 "raid_level": "raid1", 00:22:55.820 "superblock": false, 00:22:55.820 "num_base_bdevs": 4, 00:22:55.820 "num_base_bdevs_discovered": 4, 00:22:55.820 "num_base_bdevs_operational": 4, 00:22:55.820 "base_bdevs_list": [ 00:22:55.820 { 00:22:55.820 "name": "BaseBdev1", 00:22:55.820 "uuid": "7213591b-e283-59ce-a3af-5dc064757f2a", 00:22:55.820 "is_configured": true, 00:22:55.820 "data_offset": 0, 00:22:55.820 "data_size": 65536 00:22:55.820 }, 00:22:55.820 { 00:22:55.820 "name": "BaseBdev2", 00:22:55.820 "uuid": "3bfad638-8e86-5dff-899e-f89bbc850e70", 00:22:55.820 "is_configured": true, 00:22:55.820 "data_offset": 0, 00:22:55.820 "data_size": 65536 00:22:55.820 }, 00:22:55.820 { 00:22:55.820 "name": "BaseBdev3", 00:22:55.820 "uuid": "d49572f8-669f-5ee3-80f3-fbfab95b8a97", 00:22:55.820 "is_configured": true, 00:22:55.820 "data_offset": 0, 00:22:55.820 "data_size": 65536 00:22:55.820 }, 00:22:55.820 { 00:22:55.820 "name": "BaseBdev4", 00:22:55.820 "uuid": "0a10bc09-be47-5a76-bd45-f420d12b39b2", 00:22:55.820 "is_configured": true, 00:22:55.820 "data_offset": 0, 00:22:55.820 "data_size": 65536 00:22:55.820 } 00:22:55.820 ] 00:22:55.820 }' 00:22:55.820 17:35:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:55.820 17:35:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:56.387 17:35:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:56.387 17:35:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:56.387 [2024-07-15 17:35:07.571455] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:56.387 17:35:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:22:56.387 17:35:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:56.387 17:35:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:56.646 17:35:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:22:56.647 17:35:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:22:56.647 17:35:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:56.647 17:35:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:56.647 [2024-07-15 17:35:07.865412] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22861c0 00:22:56.647 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:56.647 Zero copy mechanism will not be used. 00:22:56.647 Running I/O for 60 seconds... 00:22:56.906 [2024-07-15 17:35:07.979680] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:56.906 [2024-07-15 17:35:07.986102] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x22861c0 00:22:56.906 17:35:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:56.906 17:35:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:56.906 17:35:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:56.906 17:35:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:56.906 17:35:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:56.906 17:35:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:56.906 17:35:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:56.906 17:35:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:56.906 17:35:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:56.906 17:35:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:56.906 17:35:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:56.906 17:35:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:57.168 17:35:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:57.168 "name": "raid_bdev1", 00:22:57.168 "uuid": "2ddcdb51-c213-4d8f-be80-6474d0b7c279", 00:22:57.168 "strip_size_kb": 0, 00:22:57.168 "state": "online", 00:22:57.168 "raid_level": "raid1", 00:22:57.168 "superblock": false, 00:22:57.168 "num_base_bdevs": 4, 00:22:57.168 "num_base_bdevs_discovered": 3, 00:22:57.168 "num_base_bdevs_operational": 3, 00:22:57.168 "base_bdevs_list": [ 00:22:57.168 { 00:22:57.168 "name": null, 00:22:57.168 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:57.168 "is_configured": false, 00:22:57.168 "data_offset": 0, 00:22:57.168 "data_size": 65536 00:22:57.168 }, 00:22:57.168 { 00:22:57.168 
"name": "BaseBdev2", 00:22:57.168 "uuid": "3bfad638-8e86-5dff-899e-f89bbc850e70", 00:22:57.168 "is_configured": true, 00:22:57.168 "data_offset": 0, 00:22:57.168 "data_size": 65536 00:22:57.168 }, 00:22:57.168 { 00:22:57.168 "name": "BaseBdev3", 00:22:57.168 "uuid": "d49572f8-669f-5ee3-80f3-fbfab95b8a97", 00:22:57.168 "is_configured": true, 00:22:57.168 "data_offset": 0, 00:22:57.168 "data_size": 65536 00:22:57.168 }, 00:22:57.168 { 00:22:57.168 "name": "BaseBdev4", 00:22:57.168 "uuid": "0a10bc09-be47-5a76-bd45-f420d12b39b2", 00:22:57.168 "is_configured": true, 00:22:57.168 "data_offset": 0, 00:22:57.168 "data_size": 65536 00:22:57.168 } 00:22:57.168 ] 00:22:57.168 }' 00:22:57.168 17:35:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:57.168 17:35:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:57.739 17:35:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:57.739 [2024-07-15 17:35:08.959524] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:57.739 17:35:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:57.739 [2024-07-15 17:35:08.994414] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22863f0 00:22:57.739 [2024-07-15 17:35:08.996194] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:57.999 [2024-07-15 17:35:09.118148] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:57.999 [2024-07-15 17:35:09.118935] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:58.258 [2024-07-15 17:35:09.320648] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:58.258 [2024-07-15 17:35:09.320755] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:58.518 [2024-07-15 17:35:09.598713] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:22:58.778 [2024-07-15 17:35:09.830733] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:58.778 17:35:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:58.778 17:35:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:58.778 17:35:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:58.778 17:35:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:58.778 17:35:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:58.778 17:35:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:58.778 17:35:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:59.039 [2024-07-15 17:35:10.196188] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 
offset_end: 18432 00:22:59.039 17:35:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:59.039 "name": "raid_bdev1", 00:22:59.039 "uuid": "2ddcdb51-c213-4d8f-be80-6474d0b7c279", 00:22:59.039 "strip_size_kb": 0, 00:22:59.039 "state": "online", 00:22:59.039 "raid_level": "raid1", 00:22:59.039 "superblock": false, 00:22:59.039 "num_base_bdevs": 4, 00:22:59.039 "num_base_bdevs_discovered": 4, 00:22:59.039 "num_base_bdevs_operational": 4, 00:22:59.039 "process": { 00:22:59.039 "type": "rebuild", 00:22:59.039 "target": "spare", 00:22:59.039 "progress": { 00:22:59.039 "blocks": 14336, 00:22:59.039 "percent": 21 00:22:59.039 } 00:22:59.039 }, 00:22:59.039 "base_bdevs_list": [ 00:22:59.039 { 00:22:59.039 "name": "spare", 00:22:59.039 "uuid": "1f871d9a-6955-5cc6-856d-2f9f28be8695", 00:22:59.039 "is_configured": true, 00:22:59.039 "data_offset": 0, 00:22:59.039 "data_size": 65536 00:22:59.039 }, 00:22:59.039 { 00:22:59.039 "name": "BaseBdev2", 00:22:59.039 "uuid": "3bfad638-8e86-5dff-899e-f89bbc850e70", 00:22:59.039 "is_configured": true, 00:22:59.039 "data_offset": 0, 00:22:59.039 "data_size": 65536 00:22:59.039 }, 00:22:59.039 { 00:22:59.039 "name": "BaseBdev3", 00:22:59.039 "uuid": "d49572f8-669f-5ee3-80f3-fbfab95b8a97", 00:22:59.039 "is_configured": true, 00:22:59.039 "data_offset": 0, 00:22:59.039 "data_size": 65536 00:22:59.039 }, 00:22:59.039 { 00:22:59.039 "name": "BaseBdev4", 00:22:59.039 "uuid": "0a10bc09-be47-5a76-bd45-f420d12b39b2", 00:22:59.039 "is_configured": true, 00:22:59.039 "data_offset": 0, 00:22:59.039 "data_size": 65536 00:22:59.039 } 00:22:59.039 ] 00:22:59.039 }' 00:22:59.039 17:35:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:59.039 17:35:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:59.039 17:35:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:59.039 17:35:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:59.039 17:35:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:59.299 [2024-07-15 17:35:10.499402] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:59.299 [2024-07-15 17:35:10.547177] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:22:59.559 [2024-07-15 17:35:10.647784] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:59.559 [2024-07-15 17:35:10.649227] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:59.559 [2024-07-15 17:35:10.649247] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:59.559 [2024-07-15 17:35:10.649253] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:59.559 [2024-07-15 17:35:10.672587] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x22861c0 00:22:59.559 17:35:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:59.559 17:35:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:59.559 17:35:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 
-- # local expected_state=online 00:22:59.559 17:35:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:59.559 17:35:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:59.559 17:35:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:59.560 17:35:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:59.560 17:35:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:59.560 17:35:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:59.560 17:35:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:59.560 17:35:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.560 17:35:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:59.820 17:35:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:59.820 "name": "raid_bdev1", 00:22:59.820 "uuid": "2ddcdb51-c213-4d8f-be80-6474d0b7c279", 00:22:59.820 "strip_size_kb": 0, 00:22:59.820 "state": "online", 00:22:59.820 "raid_level": "raid1", 00:22:59.820 "superblock": false, 00:22:59.820 "num_base_bdevs": 4, 00:22:59.820 "num_base_bdevs_discovered": 3, 00:22:59.820 "num_base_bdevs_operational": 3, 00:22:59.820 "base_bdevs_list": [ 00:22:59.820 { 00:22:59.820 "name": null, 00:22:59.820 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:59.820 "is_configured": false, 00:22:59.820 "data_offset": 0, 00:22:59.820 "data_size": 65536 00:22:59.820 }, 00:22:59.820 { 00:22:59.820 "name": "BaseBdev2", 00:22:59.820 "uuid": "3bfad638-8e86-5dff-899e-f89bbc850e70", 00:22:59.820 "is_configured": true, 00:22:59.820 "data_offset": 0, 00:22:59.820 "data_size": 65536 00:22:59.820 }, 00:22:59.820 { 00:22:59.820 "name": "BaseBdev3", 00:22:59.820 "uuid": "d49572f8-669f-5ee3-80f3-fbfab95b8a97", 00:22:59.820 "is_configured": true, 00:22:59.820 "data_offset": 0, 00:22:59.820 "data_size": 65536 00:22:59.820 }, 00:22:59.820 { 00:22:59.820 "name": "BaseBdev4", 00:22:59.820 "uuid": "0a10bc09-be47-5a76-bd45-f420d12b39b2", 00:22:59.820 "is_configured": true, 00:22:59.820 "data_offset": 0, 00:22:59.820 "data_size": 65536 00:22:59.820 } 00:22:59.820 ] 00:22:59.820 }' 00:22:59.820 17:35:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:59.820 17:35:10 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:00.759 17:35:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:00.759 17:35:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:00.759 17:35:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:00.759 17:35:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:00.759 17:35:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:00.759 17:35:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:00.759 17:35:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r 
'.[] | select(.name == "raid_bdev1")' 00:23:00.759 17:35:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:00.759 "name": "raid_bdev1", 00:23:00.759 "uuid": "2ddcdb51-c213-4d8f-be80-6474d0b7c279", 00:23:00.759 "strip_size_kb": 0, 00:23:00.759 "state": "online", 00:23:00.759 "raid_level": "raid1", 00:23:00.759 "superblock": false, 00:23:00.759 "num_base_bdevs": 4, 00:23:00.759 "num_base_bdevs_discovered": 3, 00:23:00.759 "num_base_bdevs_operational": 3, 00:23:00.759 "base_bdevs_list": [ 00:23:00.759 { 00:23:00.759 "name": null, 00:23:00.759 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:00.759 "is_configured": false, 00:23:00.759 "data_offset": 0, 00:23:00.759 "data_size": 65536 00:23:00.759 }, 00:23:00.759 { 00:23:00.759 "name": "BaseBdev2", 00:23:00.759 "uuid": "3bfad638-8e86-5dff-899e-f89bbc850e70", 00:23:00.759 "is_configured": true, 00:23:00.759 "data_offset": 0, 00:23:00.759 "data_size": 65536 00:23:00.759 }, 00:23:00.759 { 00:23:00.759 "name": "BaseBdev3", 00:23:00.759 "uuid": "d49572f8-669f-5ee3-80f3-fbfab95b8a97", 00:23:00.759 "is_configured": true, 00:23:00.759 "data_offset": 0, 00:23:00.759 "data_size": 65536 00:23:00.759 }, 00:23:00.759 { 00:23:00.759 "name": "BaseBdev4", 00:23:00.759 "uuid": "0a10bc09-be47-5a76-bd45-f420d12b39b2", 00:23:00.759 "is_configured": true, 00:23:00.759 "data_offset": 0, 00:23:00.759 "data_size": 65536 00:23:00.759 } 00:23:00.759 ] 00:23:00.759 }' 00:23:00.759 17:35:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:00.759 17:35:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:00.759 17:35:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:01.021 17:35:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:01.021 17:35:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:01.021 [2024-07-15 17:35:12.272913] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:01.317 17:35:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:01.317 [2024-07-15 17:35:12.336436] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x229ca60 00:23:01.317 [2024-07-15 17:35:12.337620] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:01.317 [2024-07-15 17:35:12.461526] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:01.317 [2024-07-15 17:35:12.462339] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:01.577 [2024-07-15 17:35:12.686958] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:01.577 [2024-07-15 17:35:12.687391] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:02.146 17:35:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:02.146 17:35:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:02.146 17:35:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:23:02.146 17:35:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:02.146 17:35:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:02.146 17:35:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:02.146 17:35:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:02.405 17:35:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:02.405 "name": "raid_bdev1", 00:23:02.405 "uuid": "2ddcdb51-c213-4d8f-be80-6474d0b7c279", 00:23:02.405 "strip_size_kb": 0, 00:23:02.405 "state": "online", 00:23:02.405 "raid_level": "raid1", 00:23:02.405 "superblock": false, 00:23:02.405 "num_base_bdevs": 4, 00:23:02.405 "num_base_bdevs_discovered": 4, 00:23:02.405 "num_base_bdevs_operational": 4, 00:23:02.405 "process": { 00:23:02.405 "type": "rebuild", 00:23:02.405 "target": "spare", 00:23:02.405 "progress": { 00:23:02.405 "blocks": 16384, 00:23:02.405 "percent": 25 00:23:02.405 } 00:23:02.405 }, 00:23:02.405 "base_bdevs_list": [ 00:23:02.405 { 00:23:02.405 "name": "spare", 00:23:02.405 "uuid": "1f871d9a-6955-5cc6-856d-2f9f28be8695", 00:23:02.405 "is_configured": true, 00:23:02.405 "data_offset": 0, 00:23:02.405 "data_size": 65536 00:23:02.405 }, 00:23:02.405 { 00:23:02.405 "name": "BaseBdev2", 00:23:02.405 "uuid": "3bfad638-8e86-5dff-899e-f89bbc850e70", 00:23:02.405 "is_configured": true, 00:23:02.405 "data_offset": 0, 00:23:02.405 "data_size": 65536 00:23:02.405 }, 00:23:02.405 { 00:23:02.405 "name": "BaseBdev3", 00:23:02.405 "uuid": "d49572f8-669f-5ee3-80f3-fbfab95b8a97", 00:23:02.405 "is_configured": true, 00:23:02.405 "data_offset": 0, 00:23:02.405 "data_size": 65536 00:23:02.405 }, 00:23:02.405 { 00:23:02.405 "name": "BaseBdev4", 00:23:02.405 "uuid": "0a10bc09-be47-5a76-bd45-f420d12b39b2", 00:23:02.405 "is_configured": true, 00:23:02.405 "data_offset": 0, 00:23:02.405 "data_size": 65536 00:23:02.405 } 00:23:02.405 ] 00:23:02.405 }' 00:23:02.405 17:35:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:02.405 17:35:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:02.405 17:35:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:02.405 17:35:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:02.405 17:35:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:23:02.405 17:35:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:23:02.405 17:35:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:02.405 17:35:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:23:02.405 17:35:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:23:02.664 [2024-07-15 17:35:13.704600] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:23:02.665 [2024-07-15 17:35:13.791358] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:02.665 
[2024-07-15 17:35:13.821979] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:23:02.665 [2024-07-15 17:35:13.890025] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x22861c0 00:23:02.665 [2024-07-15 17:35:13.890043] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x229ca60 00:23:02.665 17:35:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:23:02.665 17:35:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:23:02.665 17:35:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:02.665 17:35:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:02.665 17:35:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:02.665 17:35:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:02.665 17:35:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:02.665 17:35:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:02.665 17:35:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:02.924 17:35:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:02.924 "name": "raid_bdev1", 00:23:02.924 "uuid": "2ddcdb51-c213-4d8f-be80-6474d0b7c279", 00:23:02.924 "strip_size_kb": 0, 00:23:02.924 "state": "online", 00:23:02.924 "raid_level": "raid1", 00:23:02.924 "superblock": false, 00:23:02.924 "num_base_bdevs": 4, 00:23:02.924 "num_base_bdevs_discovered": 3, 00:23:02.924 "num_base_bdevs_operational": 3, 00:23:02.924 "process": { 00:23:02.924 "type": "rebuild", 00:23:02.924 "target": "spare", 00:23:02.924 "progress": { 00:23:02.924 "blocks": 24576, 00:23:02.924 "percent": 37 00:23:02.924 } 00:23:02.924 }, 00:23:02.924 "base_bdevs_list": [ 00:23:02.924 { 00:23:02.924 "name": "spare", 00:23:02.924 "uuid": "1f871d9a-6955-5cc6-856d-2f9f28be8695", 00:23:02.924 "is_configured": true, 00:23:02.924 "data_offset": 0, 00:23:02.924 "data_size": 65536 00:23:02.924 }, 00:23:02.924 { 00:23:02.924 "name": null, 00:23:02.924 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:02.924 "is_configured": false, 00:23:02.924 "data_offset": 0, 00:23:02.924 "data_size": 65536 00:23:02.924 }, 00:23:02.924 { 00:23:02.924 "name": "BaseBdev3", 00:23:02.924 "uuid": "d49572f8-669f-5ee3-80f3-fbfab95b8a97", 00:23:02.924 "is_configured": true, 00:23:02.924 "data_offset": 0, 00:23:02.924 "data_size": 65536 00:23:02.924 }, 00:23:02.924 { 00:23:02.924 "name": "BaseBdev4", 00:23:02.924 "uuid": "0a10bc09-be47-5a76-bd45-f420d12b39b2", 00:23:02.924 "is_configured": true, 00:23:02.924 "data_offset": 0, 00:23:02.924 "data_size": 65536 00:23:02.924 } 00:23:02.924 ] 00:23:02.924 }' 00:23:02.924 [2024-07-15 17:35:14.129508] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:23:02.924 17:35:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:02.924 17:35:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:02.924 17:35:14 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:02.924 17:35:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:02.924 17:35:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=808 00:23:02.924 17:35:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:02.924 17:35:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:02.924 17:35:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:02.924 17:35:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:02.924 17:35:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:02.924 17:35:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:02.924 17:35:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:02.924 17:35:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:03.184 [2024-07-15 17:35:14.237558] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:23:03.184 17:35:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:03.184 "name": "raid_bdev1", 00:23:03.184 "uuid": "2ddcdb51-c213-4d8f-be80-6474d0b7c279", 00:23:03.184 "strip_size_kb": 0, 00:23:03.184 "state": "online", 00:23:03.184 "raid_level": "raid1", 00:23:03.184 "superblock": false, 00:23:03.184 "num_base_bdevs": 4, 00:23:03.184 "num_base_bdevs_discovered": 3, 00:23:03.184 "num_base_bdevs_operational": 3, 00:23:03.184 "process": { 00:23:03.184 "type": "rebuild", 00:23:03.184 "target": "spare", 00:23:03.184 "progress": { 00:23:03.184 "blocks": 30720, 00:23:03.184 "percent": 46 00:23:03.184 } 00:23:03.184 }, 00:23:03.184 "base_bdevs_list": [ 00:23:03.184 { 00:23:03.184 "name": "spare", 00:23:03.184 "uuid": "1f871d9a-6955-5cc6-856d-2f9f28be8695", 00:23:03.184 "is_configured": true, 00:23:03.184 "data_offset": 0, 00:23:03.184 "data_size": 65536 00:23:03.184 }, 00:23:03.184 { 00:23:03.184 "name": null, 00:23:03.184 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:03.184 "is_configured": false, 00:23:03.184 "data_offset": 0, 00:23:03.184 "data_size": 65536 00:23:03.184 }, 00:23:03.184 { 00:23:03.184 "name": "BaseBdev3", 00:23:03.184 "uuid": "d49572f8-669f-5ee3-80f3-fbfab95b8a97", 00:23:03.184 "is_configured": true, 00:23:03.184 "data_offset": 0, 00:23:03.184 "data_size": 65536 00:23:03.184 }, 00:23:03.184 { 00:23:03.184 "name": "BaseBdev4", 00:23:03.184 "uuid": "0a10bc09-be47-5a76-bd45-f420d12b39b2", 00:23:03.184 "is_configured": true, 00:23:03.184 "data_offset": 0, 00:23:03.184 "data_size": 65536 00:23:03.184 } 00:23:03.184 ] 00:23:03.184 }' 00:23:03.184 17:35:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:03.184 17:35:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:03.184 17:35:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:03.184 17:35:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:03.184 
17:35:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:03.753 [2024-07-15 17:35:14.780288] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:23:03.753 [2024-07-15 17:35:14.780744] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:23:03.754 [2024-07-15 17:35:15.004492] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:23:04.324 17:35:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:04.324 17:35:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:04.324 17:35:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:04.324 17:35:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:04.324 17:35:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:04.324 17:35:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:04.324 17:35:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:04.324 17:35:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:04.584 17:35:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:04.584 "name": "raid_bdev1", 00:23:04.584 "uuid": "2ddcdb51-c213-4d8f-be80-6474d0b7c279", 00:23:04.584 "strip_size_kb": 0, 00:23:04.584 "state": "online", 00:23:04.584 "raid_level": "raid1", 00:23:04.584 "superblock": false, 00:23:04.584 "num_base_bdevs": 4, 00:23:04.584 "num_base_bdevs_discovered": 3, 00:23:04.584 "num_base_bdevs_operational": 3, 00:23:04.584 "process": { 00:23:04.584 "type": "rebuild", 00:23:04.584 "target": "spare", 00:23:04.584 "progress": { 00:23:04.584 "blocks": 51200, 00:23:04.584 "percent": 78 00:23:04.584 } 00:23:04.584 }, 00:23:04.584 "base_bdevs_list": [ 00:23:04.584 { 00:23:04.584 "name": "spare", 00:23:04.584 "uuid": "1f871d9a-6955-5cc6-856d-2f9f28be8695", 00:23:04.584 "is_configured": true, 00:23:04.584 "data_offset": 0, 00:23:04.584 "data_size": 65536 00:23:04.584 }, 00:23:04.584 { 00:23:04.584 "name": null, 00:23:04.584 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:04.584 "is_configured": false, 00:23:04.584 "data_offset": 0, 00:23:04.584 "data_size": 65536 00:23:04.584 }, 00:23:04.584 { 00:23:04.584 "name": "BaseBdev3", 00:23:04.584 "uuid": "d49572f8-669f-5ee3-80f3-fbfab95b8a97", 00:23:04.584 "is_configured": true, 00:23:04.584 "data_offset": 0, 00:23:04.584 "data_size": 65536 00:23:04.584 }, 00:23:04.584 { 00:23:04.584 "name": "BaseBdev4", 00:23:04.584 "uuid": "0a10bc09-be47-5a76-bd45-f420d12b39b2", 00:23:04.584 "is_configured": true, 00:23:04.584 "data_offset": 0, 00:23:04.584 "data_size": 65536 00:23:04.584 } 00:23:04.584 ] 00:23:04.584 }' 00:23:04.584 17:35:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:04.584 [2024-07-15 17:35:15.696776] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:23:04.585 17:35:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild 
== \r\e\b\u\i\l\d ]] 00:23:04.585 17:35:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:04.585 17:35:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:04.585 17:35:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:04.844 [2024-07-15 17:35:16.137271] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:23:05.415 [2024-07-15 17:35:16.578014] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:05.415 [2024-07-15 17:35:16.684716] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:05.415 [2024-07-15 17:35:16.686093] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:05.675 17:35:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:05.675 17:35:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:05.675 17:35:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:05.675 17:35:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:05.675 17:35:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:05.675 17:35:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:05.675 17:35:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:05.675 17:35:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:05.675 17:35:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:05.675 "name": "raid_bdev1", 00:23:05.675 "uuid": "2ddcdb51-c213-4d8f-be80-6474d0b7c279", 00:23:05.675 "strip_size_kb": 0, 00:23:05.675 "state": "online", 00:23:05.675 "raid_level": "raid1", 00:23:05.675 "superblock": false, 00:23:05.675 "num_base_bdevs": 4, 00:23:05.675 "num_base_bdevs_discovered": 3, 00:23:05.675 "num_base_bdevs_operational": 3, 00:23:05.675 "base_bdevs_list": [ 00:23:05.675 { 00:23:05.675 "name": "spare", 00:23:05.675 "uuid": "1f871d9a-6955-5cc6-856d-2f9f28be8695", 00:23:05.675 "is_configured": true, 00:23:05.675 "data_offset": 0, 00:23:05.675 "data_size": 65536 00:23:05.675 }, 00:23:05.675 { 00:23:05.675 "name": null, 00:23:05.675 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:05.675 "is_configured": false, 00:23:05.675 "data_offset": 0, 00:23:05.675 "data_size": 65536 00:23:05.675 }, 00:23:05.675 { 00:23:05.675 "name": "BaseBdev3", 00:23:05.675 "uuid": "d49572f8-669f-5ee3-80f3-fbfab95b8a97", 00:23:05.675 "is_configured": true, 00:23:05.675 "data_offset": 0, 00:23:05.675 "data_size": 65536 00:23:05.675 }, 00:23:05.675 { 00:23:05.675 "name": "BaseBdev4", 00:23:05.675 "uuid": "0a10bc09-be47-5a76-bd45-f420d12b39b2", 00:23:05.675 "is_configured": true, 00:23:05.675 "data_offset": 0, 00:23:05.675 "data_size": 65536 00:23:05.675 } 00:23:05.675 ] 00:23:05.675 }' 00:23:05.675 17:35:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:05.936 17:35:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:05.936 17:35:16 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:05.936 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:05.936 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:23:05.936 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:05.936 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:05.936 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:05.936 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:05.936 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:05.936 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:05.936 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:05.936 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:05.936 "name": "raid_bdev1", 00:23:05.936 "uuid": "2ddcdb51-c213-4d8f-be80-6474d0b7c279", 00:23:05.936 "strip_size_kb": 0, 00:23:05.936 "state": "online", 00:23:05.936 "raid_level": "raid1", 00:23:05.936 "superblock": false, 00:23:05.936 "num_base_bdevs": 4, 00:23:05.936 "num_base_bdevs_discovered": 3, 00:23:05.936 "num_base_bdevs_operational": 3, 00:23:05.936 "base_bdevs_list": [ 00:23:05.936 { 00:23:05.936 "name": "spare", 00:23:05.936 "uuid": "1f871d9a-6955-5cc6-856d-2f9f28be8695", 00:23:05.936 "is_configured": true, 00:23:05.936 "data_offset": 0, 00:23:05.936 "data_size": 65536 00:23:05.936 }, 00:23:05.936 { 00:23:05.936 "name": null, 00:23:05.936 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:05.936 "is_configured": false, 00:23:05.936 "data_offset": 0, 00:23:05.936 "data_size": 65536 00:23:05.936 }, 00:23:05.936 { 00:23:05.936 "name": "BaseBdev3", 00:23:05.936 "uuid": "d49572f8-669f-5ee3-80f3-fbfab95b8a97", 00:23:05.936 "is_configured": true, 00:23:05.936 "data_offset": 0, 00:23:05.936 "data_size": 65536 00:23:05.936 }, 00:23:05.936 { 00:23:05.936 "name": "BaseBdev4", 00:23:05.936 "uuid": "0a10bc09-be47-5a76-bd45-f420d12b39b2", 00:23:05.936 "is_configured": true, 00:23:05.936 "data_offset": 0, 00:23:05.936 "data_size": 65536 00:23:05.936 } 00:23:05.936 ] 00:23:05.936 }' 00:23:05.936 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:06.197 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:06.197 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:06.197 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:06.197 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:06.197 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:06.197 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:06.197 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:06.197 17:35:17 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:06.197 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:06.197 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:06.197 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:06.197 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:06.197 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:06.197 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:06.197 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:06.457 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:06.457 "name": "raid_bdev1", 00:23:06.457 "uuid": "2ddcdb51-c213-4d8f-be80-6474d0b7c279", 00:23:06.457 "strip_size_kb": 0, 00:23:06.457 "state": "online", 00:23:06.457 "raid_level": "raid1", 00:23:06.457 "superblock": false, 00:23:06.457 "num_base_bdevs": 4, 00:23:06.457 "num_base_bdevs_discovered": 3, 00:23:06.457 "num_base_bdevs_operational": 3, 00:23:06.457 "base_bdevs_list": [ 00:23:06.457 { 00:23:06.457 "name": "spare", 00:23:06.457 "uuid": "1f871d9a-6955-5cc6-856d-2f9f28be8695", 00:23:06.457 "is_configured": true, 00:23:06.457 "data_offset": 0, 00:23:06.457 "data_size": 65536 00:23:06.457 }, 00:23:06.457 { 00:23:06.457 "name": null, 00:23:06.457 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:06.457 "is_configured": false, 00:23:06.457 "data_offset": 0, 00:23:06.457 "data_size": 65536 00:23:06.457 }, 00:23:06.457 { 00:23:06.457 "name": "BaseBdev3", 00:23:06.457 "uuid": "d49572f8-669f-5ee3-80f3-fbfab95b8a97", 00:23:06.457 "is_configured": true, 00:23:06.457 "data_offset": 0, 00:23:06.457 "data_size": 65536 00:23:06.457 }, 00:23:06.457 { 00:23:06.457 "name": "BaseBdev4", 00:23:06.458 "uuid": "0a10bc09-be47-5a76-bd45-f420d12b39b2", 00:23:06.458 "is_configured": true, 00:23:06.458 "data_offset": 0, 00:23:06.458 "data_size": 65536 00:23:06.458 } 00:23:06.458 ] 00:23:06.458 }' 00:23:06.458 17:35:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:06.458 17:35:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:07.028 17:35:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:07.288 [2024-07-15 17:35:18.583072] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:07.288 [2024-07-15 17:35:18.583095] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:07.548 00:23:07.548 Latency(us) 00:23:07.548 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:07.548 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:23:07.548 raid_bdev1 : 10.74 112.45 337.34 0.00 0.00 12373.66 244.18 109697.18 00:23:07.548 =================================================================================================================== 00:23:07.548 Total : 112.45 337.34 0.00 0.00 12373.66 244.18 109697.18 00:23:07.548 [2024-07-15 17:35:18.638414] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:07.548 [2024-07-15 17:35:18.638437] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:07.548 [2024-07-15 17:35:18.638516] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:07.548 [2024-07-15 17:35:18.638527] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2280b30 name raid_bdev1, state offline 00:23:07.548 0 00:23:07.548 17:35:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:07.548 17:35:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:23:07.809 17:35:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:07.809 17:35:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:07.809 17:35:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:23:07.809 17:35:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:23:07.809 17:35:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:07.809 17:35:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:23:07.809 17:35:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:07.809 17:35:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:07.809 17:35:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:07.809 17:35:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:23:07.809 17:35:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:07.809 17:35:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:07.809 17:35:18 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:23:07.809 /dev/nbd0 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:07.809 1+0 records in 00:23:07.809 1+0 
records out 00:23:07.809 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274131 s, 14.9 MB/s 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # continue 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:07.809 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:23:08.070 /dev/nbd1 00:23:08.070 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:08.070 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:08.070 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:08.070 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:23:08.070 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:08.070 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:08.070 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:08.070 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:23:08.070 17:35:19 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:08.070 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:08.070 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:08.070 1+0 records in 00:23:08.070 1+0 records out 00:23:08.070 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000256354 s, 16.0 MB/s 00:23:08.070 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:08.070 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:23:08.070 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:08.070 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:08.070 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:23:08.070 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:08.070 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:08.070 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:08.331 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:23:08.591 /dev/nbd1 00:23:08.591 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:08.591 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:08.591 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:08.591 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:23:08.591 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:08.591 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:08.591 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:08.591 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:23:08.591 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:08.591 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:08.591 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:08.591 1+0 records in 00:23:08.591 1+0 records out 00:23:08.591 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000253472 s, 16.2 MB/s 00:23:08.591 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:08.591 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:23:08.591 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:08.591 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:08.591 17:35:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:23:08.591 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:08.591 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:08.591 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:23:08.591 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:23:08.591 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:08.591 
17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:23:08.591 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:08.591 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:23:08.591 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:08.591 17:35:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:08.852 17:35:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:08.852 17:35:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:08.852 17:35:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:08.852 17:35:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:08.852 17:35:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:08.852 17:35:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:08.852 17:35:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:23:08.852 17:35:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:08.852 17:35:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:08.852 17:35:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:08.852 17:35:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:08.852 17:35:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:08.852 17:35:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:23:08.852 17:35:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:08.852 17:35:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:09.112 17:35:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:09.112 17:35:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:09.112 17:35:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:09.112 17:35:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:09.112 17:35:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:09.112 17:35:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:09.112 17:35:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:23:09.112 17:35:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:09.112 17:35:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:23:09.112 17:35:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 2884211 00:23:09.112 17:35:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 2884211 ']' 00:23:09.112 17:35:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 2884211 00:23:09.112 17:35:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- 
# uname 00:23:09.112 17:35:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:09.112 17:35:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2884211 00:23:09.112 17:35:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:09.112 17:35:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:09.112 17:35:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2884211' 00:23:09.112 killing process with pid 2884211 00:23:09.112 17:35:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 2884211 00:23:09.112 Received shutdown signal, test time was about 12.413238 seconds 00:23:09.112 00:23:09.112 Latency(us) 00:23:09.112 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:09.112 =================================================================================================================== 00:23:09.112 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:23:09.112 [2024-07-15 17:35:20.309520] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:09.112 17:35:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 2884211 00:23:09.112 [2024-07-15 17:35:20.332485] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:23:09.374 00:23:09.374 real 0m16.946s 00:23:09.374 user 0m26.757s 00:23:09.374 sys 0m2.357s 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:09.374 ************************************ 00:23:09.374 END TEST raid_rebuild_test_io 00:23:09.374 ************************************ 00:23:09.374 17:35:20 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:09.374 17:35:20 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:23:09.374 17:35:20 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:09.374 17:35:20 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:09.374 17:35:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:09.374 ************************************ 00:23:09.374 START TEST raid_rebuild_test_sb_io 00:23:09.374 ************************************ 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true true true 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2887284 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2887284 /var/tmp/spdk-raid.sock 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 2887284 ']' 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:09.374 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:09.375 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
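
The trace above launches bdevperf against a private RPC socket (/var/tmp/spdk-raid.sock) and blocks until that socket is listening before any further RPCs are issued. Below is a minimal sketch of that launch-and-wait pattern; the bdevperf path and flags are copied from the invocation in the trace, while the polling loop using rpc_get_methods is only an illustrative stand-in for the harness's own waitforlisten helper.

    # Sketch only: start bdevperf on a private RPC socket, then poll until it answers.
    ROOT=/var/jenkins/workspace/crypto-phy-autotest/spdk
    SOCK=/var/tmp/spdk-raid.sock

    "$ROOT"/build/examples/bdevperf -r "$SOCK" -T raid_bdev1 -t 60 \
        -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
    BDEVPERF_PID=$!

    # Poll the socket; rpc_get_methods succeeds once the app is listening
    # (the real script uses waitforlisten instead of this loop).
    for _ in $(seq 1 100); do
        if "$ROOT"/scripts/rpc.py -s "$SOCK" rpc_get_methods >/dev/null 2>&1; then
            break
        fi
        sleep 0.1
    done

With -z the app only starts issuing I/O once perform_tests is requested over the same socket, which is why the trace later calls bdevperf.py perform_tests.
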
00:23:09.375 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:09.375 17:35:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:09.375 [2024-07-15 17:35:20.601044] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:23:09.375 [2024-07-15 17:35:20.601117] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2887284 ] 00:23:09.375 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:09.375 Zero copy mechanism will not be used. 00:23:09.635 [2024-07-15 17:35:20.692565] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:09.635 [2024-07-15 17:35:20.759722] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:09.635 [2024-07-15 17:35:20.800094] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:09.635 [2024-07-15 17:35:20.800116] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:10.204 17:35:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:10.204 17:35:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:23:10.204 17:35:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:10.204 17:35:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:10.465 BaseBdev1_malloc 00:23:10.465 17:35:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:10.725 [2024-07-15 17:35:21.790120] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:10.725 [2024-07-15 17:35:21.790154] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:10.725 [2024-07-15 17:35:21.790167] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2041d30 00:23:10.725 [2024-07-15 17:35:21.790174] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:10.725 [2024-07-15 17:35:21.791468] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:10.725 [2024-07-15 17:35:21.791488] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:10.725 BaseBdev1 00:23:10.725 17:35:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:10.725 17:35:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:10.725 BaseBdev2_malloc 00:23:10.725 17:35:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:10.985 [2024-07-15 17:35:22.157076] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:10.985 [2024-07-15 17:35:22.157105] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:10.986 [2024-07-15 17:35:22.157115] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21f4c60 00:23:10.986 [2024-07-15 17:35:22.157121] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:10.986 [2024-07-15 17:35:22.158321] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:10.986 [2024-07-15 17:35:22.158341] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:10.986 BaseBdev2 00:23:10.986 17:35:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:10.986 17:35:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:23:11.246 BaseBdev3_malloc 00:23:11.246 17:35:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:23:11.246 [2024-07-15 17:35:22.527810] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:23:11.246 [2024-07-15 17:35:22.527835] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:11.246 [2024-07-15 17:35:22.527845] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21d9b90 00:23:11.246 [2024-07-15 17:35:22.527851] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:11.246 [2024-07-15 17:35:22.529025] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:11.246 [2024-07-15 17:35:22.529043] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:11.246 BaseBdev3 00:23:11.246 17:35:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:11.246 17:35:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:23:11.506 BaseBdev4_malloc 00:23:11.506 17:35:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:23:11.766 [2024-07-15 17:35:22.910698] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:23:11.766 [2024-07-15 17:35:22.910728] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:11.766 [2024-07-15 17:35:22.910739] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20428c0 00:23:11.766 [2024-07-15 17:35:22.910745] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:11.766 [2024-07-15 17:35:22.911919] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:11.766 [2024-07-15 17:35:22.911937] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:11.766 BaseBdev4 00:23:11.766 17:35:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:12.028 spare_malloc 00:23:12.028 17:35:23 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:12.028 spare_delay 00:23:12.028 17:35:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:12.288 [2024-07-15 17:35:23.457964] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:12.288 [2024-07-15 17:35:23.457989] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:12.288 [2024-07-15 17:35:23.458000] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x203aa80 00:23:12.288 [2024-07-15 17:35:23.458006] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:12.288 [2024-07-15 17:35:23.459183] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:12.288 [2024-07-15 17:35:23.459202] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:12.288 spare 00:23:12.288 17:35:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:23:12.549 [2024-07-15 17:35:23.646466] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:12.549 [2024-07-15 17:35:23.647470] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:12.549 [2024-07-15 17:35:23.647511] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:12.549 [2024-07-15 17:35:23.647545] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:12.549 [2024-07-15 17:35:23.647684] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x203bb30 00:23:12.549 [2024-07-15 17:35:23.647692] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:12.549 [2024-07-15 17:35:23.647848] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20392e0 00:23:12.549 [2024-07-15 17:35:23.647962] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x203bb30 00:23:12.549 [2024-07-15 17:35:23.647968] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x203bb30 00:23:12.549 [2024-07-15 17:35:23.648036] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:12.549 17:35:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:12.549 17:35:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:12.549 17:35:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:12.549 17:35:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:12.549 17:35:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:12.549 17:35:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:12.549 17:35:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
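
By this point the test has built its bdev stack over RPC: four malloc bdevs wrapped in passthru bdevs (BaseBdev1-4), a delay-backed passthru named spare, and a raid1 bdev created with an on-disk superblock (-s). The sketch below condenses those RPCs, all of which appear verbatim in the trace, and adds the same jq checks that the verify_raid_bdev_state helper performs in the entries that follow; the shell loop and the $RPC shorthand are condensations of mine, not part of the original script.

    # Sketch: rebuild the same bdev stack the trace creates, then verify it.
    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    for i in 1 2 3 4; do
        $RPC bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
        $RPC bdev_passthru_create -b BaseBdev${i}_malloc -p BaseBdev${i}
    done
    $RPC bdev_malloc_create 32 512 -b spare_malloc
    $RPC bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
    $RPC bdev_passthru_create -b spare_delay -p spare

    # raid1 with superblock (-s) over the four passthru base bdevs.
    $RPC bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1

    # Mirror verify_raid_bdev_state: the new array should be online with 4 members.
    info=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    test "$(echo "$info" | jq -r '.state')" = online
    test "$(echo "$info" | jq -r '.num_base_bdevs_discovered')" -eq 4
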
00:23:12.549 17:35:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:12.549 17:35:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:12.549 17:35:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:12.549 17:35:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:12.549 17:35:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:12.809 17:35:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:12.809 "name": "raid_bdev1", 00:23:12.809 "uuid": "85057e46-cc29-4581-89e4-8328f1211f67", 00:23:12.809 "strip_size_kb": 0, 00:23:12.809 "state": "online", 00:23:12.810 "raid_level": "raid1", 00:23:12.810 "superblock": true, 00:23:12.810 "num_base_bdevs": 4, 00:23:12.810 "num_base_bdevs_discovered": 4, 00:23:12.810 "num_base_bdevs_operational": 4, 00:23:12.810 "base_bdevs_list": [ 00:23:12.810 { 00:23:12.810 "name": "BaseBdev1", 00:23:12.810 "uuid": "8dbf68ee-393a-5793-a612-690145cf213d", 00:23:12.810 "is_configured": true, 00:23:12.810 "data_offset": 2048, 00:23:12.810 "data_size": 63488 00:23:12.810 }, 00:23:12.810 { 00:23:12.810 "name": "BaseBdev2", 00:23:12.810 "uuid": "99b5116a-7f08-5d3c-8916-997ce49e4611", 00:23:12.810 "is_configured": true, 00:23:12.810 "data_offset": 2048, 00:23:12.810 "data_size": 63488 00:23:12.810 }, 00:23:12.810 { 00:23:12.810 "name": "BaseBdev3", 00:23:12.810 "uuid": "d0470fb9-68f1-51b6-b36a-741188d31865", 00:23:12.810 "is_configured": true, 00:23:12.810 "data_offset": 2048, 00:23:12.810 "data_size": 63488 00:23:12.810 }, 00:23:12.810 { 00:23:12.810 "name": "BaseBdev4", 00:23:12.810 "uuid": "7d4215ed-4403-5832-b132-8b9fb8fa6457", 00:23:12.810 "is_configured": true, 00:23:12.810 "data_offset": 2048, 00:23:12.810 "data_size": 63488 00:23:12.810 } 00:23:12.810 ] 00:23:12.810 }' 00:23:12.810 17:35:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:12.810 17:35:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:13.070 17:35:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:13.070 17:35:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:13.331 [2024-07-15 17:35:24.512942] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:13.331 17:35:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:23:13.331 17:35:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:13.331 17:35:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:13.592 17:35:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:23:13.592 17:35:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:23:13.592 17:35:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev BaseBdev1 00:23:13.592 17:35:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:13.592 [2024-07-15 17:35:24.822930] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20411a0 00:23:13.592 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:13.592 Zero copy mechanism will not be used. 00:23:13.592 Running I/O for 60 seconds... 00:23:13.852 [2024-07-15 17:35:24.914052] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:13.852 [2024-07-15 17:35:24.920664] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x20411a0 00:23:13.853 17:35:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:13.853 17:35:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:13.853 17:35:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:13.853 17:35:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:13.853 17:35:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:13.853 17:35:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:13.853 17:35:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:13.853 17:35:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:13.853 17:35:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:13.853 17:35:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:13.853 17:35:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:13.853 17:35:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:14.112 17:35:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:14.112 "name": "raid_bdev1", 00:23:14.112 "uuid": "85057e46-cc29-4581-89e4-8328f1211f67", 00:23:14.112 "strip_size_kb": 0, 00:23:14.112 "state": "online", 00:23:14.112 "raid_level": "raid1", 00:23:14.112 "superblock": true, 00:23:14.112 "num_base_bdevs": 4, 00:23:14.112 "num_base_bdevs_discovered": 3, 00:23:14.112 "num_base_bdevs_operational": 3, 00:23:14.112 "base_bdevs_list": [ 00:23:14.112 { 00:23:14.112 "name": null, 00:23:14.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:14.112 "is_configured": false, 00:23:14.112 "data_offset": 2048, 00:23:14.112 "data_size": 63488 00:23:14.112 }, 00:23:14.112 { 00:23:14.112 "name": "BaseBdev2", 00:23:14.112 "uuid": "99b5116a-7f08-5d3c-8916-997ce49e4611", 00:23:14.112 "is_configured": true, 00:23:14.112 "data_offset": 2048, 00:23:14.112 "data_size": 63488 00:23:14.112 }, 00:23:14.112 { 00:23:14.112 "name": "BaseBdev3", 00:23:14.112 "uuid": "d0470fb9-68f1-51b6-b36a-741188d31865", 00:23:14.112 "is_configured": true, 00:23:14.112 "data_offset": 2048, 00:23:14.112 "data_size": 63488 00:23:14.112 }, 00:23:14.112 { 00:23:14.112 "name": "BaseBdev4", 00:23:14.112 "uuid": "7d4215ed-4403-5832-b132-8b9fb8fa6457", 00:23:14.112 "is_configured": 
true, 00:23:14.112 "data_offset": 2048, 00:23:14.112 "data_size": 63488 00:23:14.112 } 00:23:14.112 ] 00:23:14.112 }' 00:23:14.112 17:35:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:14.112 17:35:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:14.725 17:35:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:14.725 [2024-07-15 17:35:25.903174] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:14.725 17:35:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:14.725 [2024-07-15 17:35:25.953942] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20d9090 00:23:14.725 [2024-07-15 17:35:25.955607] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:14.987 [2024-07-15 17:35:26.086223] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:14.987 [2024-07-15 17:35:26.086482] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:14.987 [2024-07-15 17:35:26.230356] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:14.987 [2024-07-15 17:35:26.230509] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:15.556 [2024-07-15 17:35:26.558520] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:15.556 [2024-07-15 17:35:26.559307] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:15.556 [2024-07-15 17:35:26.785645] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:15.557 [2024-07-15 17:35:26.785775] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:15.817 17:35:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:15.817 17:35:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:15.817 17:35:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:15.817 17:35:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:15.817 17:35:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:15.817 17:35:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:15.817 17:35:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:15.817 [2024-07-15 17:35:27.034321] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:16.077 17:35:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:16.077 "name": "raid_bdev1", 00:23:16.077 "uuid": 
"85057e46-cc29-4581-89e4-8328f1211f67", 00:23:16.077 "strip_size_kb": 0, 00:23:16.077 "state": "online", 00:23:16.077 "raid_level": "raid1", 00:23:16.077 "superblock": true, 00:23:16.077 "num_base_bdevs": 4, 00:23:16.077 "num_base_bdevs_discovered": 4, 00:23:16.077 "num_base_bdevs_operational": 4, 00:23:16.077 "process": { 00:23:16.077 "type": "rebuild", 00:23:16.077 "target": "spare", 00:23:16.077 "progress": { 00:23:16.077 "blocks": 14336, 00:23:16.077 "percent": 22 00:23:16.077 } 00:23:16.077 }, 00:23:16.077 "base_bdevs_list": [ 00:23:16.077 { 00:23:16.077 "name": "spare", 00:23:16.077 "uuid": "7736813e-5b92-5aa6-b409-614f993df0fe", 00:23:16.077 "is_configured": true, 00:23:16.077 "data_offset": 2048, 00:23:16.077 "data_size": 63488 00:23:16.077 }, 00:23:16.077 { 00:23:16.077 "name": "BaseBdev2", 00:23:16.077 "uuid": "99b5116a-7f08-5d3c-8916-997ce49e4611", 00:23:16.077 "is_configured": true, 00:23:16.077 "data_offset": 2048, 00:23:16.077 "data_size": 63488 00:23:16.077 }, 00:23:16.077 { 00:23:16.077 "name": "BaseBdev3", 00:23:16.077 "uuid": "d0470fb9-68f1-51b6-b36a-741188d31865", 00:23:16.077 "is_configured": true, 00:23:16.077 "data_offset": 2048, 00:23:16.077 "data_size": 63488 00:23:16.077 }, 00:23:16.077 { 00:23:16.077 "name": "BaseBdev4", 00:23:16.077 "uuid": "7d4215ed-4403-5832-b132-8b9fb8fa6457", 00:23:16.077 "is_configured": true, 00:23:16.077 "data_offset": 2048, 00:23:16.077 "data_size": 63488 00:23:16.077 } 00:23:16.077 ] 00:23:16.077 }' 00:23:16.077 17:35:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:16.077 17:35:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:16.077 17:35:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:16.077 [2024-07-15 17:35:27.252001] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:16.077 17:35:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:16.077 17:35:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:16.337 [2024-07-15 17:35:27.438470] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:16.337 [2024-07-15 17:35:27.564343] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:16.337 [2024-07-15 17:35:27.565636] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:16.337 [2024-07-15 17:35:27.565657] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:16.337 [2024-07-15 17:35:27.565662] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:16.337 [2024-07-15 17:35:27.582877] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x20411a0 00:23:16.337 17:35:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:16.337 17:35:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:16.337 17:35:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:16.337 17:35:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:23:16.337 17:35:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:16.337 17:35:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:16.337 17:35:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:16.338 17:35:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:16.338 17:35:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:16.338 17:35:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:16.338 17:35:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:16.338 17:35:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:16.598 17:35:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:16.598 "name": "raid_bdev1", 00:23:16.598 "uuid": "85057e46-cc29-4581-89e4-8328f1211f67", 00:23:16.598 "strip_size_kb": 0, 00:23:16.598 "state": "online", 00:23:16.598 "raid_level": "raid1", 00:23:16.598 "superblock": true, 00:23:16.598 "num_base_bdevs": 4, 00:23:16.598 "num_base_bdevs_discovered": 3, 00:23:16.598 "num_base_bdevs_operational": 3, 00:23:16.598 "base_bdevs_list": [ 00:23:16.598 { 00:23:16.598 "name": null, 00:23:16.598 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:16.598 "is_configured": false, 00:23:16.598 "data_offset": 2048, 00:23:16.598 "data_size": 63488 00:23:16.598 }, 00:23:16.598 { 00:23:16.598 "name": "BaseBdev2", 00:23:16.598 "uuid": "99b5116a-7f08-5d3c-8916-997ce49e4611", 00:23:16.598 "is_configured": true, 00:23:16.598 "data_offset": 2048, 00:23:16.598 "data_size": 63488 00:23:16.598 }, 00:23:16.598 { 00:23:16.598 "name": "BaseBdev3", 00:23:16.598 "uuid": "d0470fb9-68f1-51b6-b36a-741188d31865", 00:23:16.598 "is_configured": true, 00:23:16.598 "data_offset": 2048, 00:23:16.598 "data_size": 63488 00:23:16.598 }, 00:23:16.598 { 00:23:16.598 "name": "BaseBdev4", 00:23:16.598 "uuid": "7d4215ed-4403-5832-b132-8b9fb8fa6457", 00:23:16.598 "is_configured": true, 00:23:16.598 "data_offset": 2048, 00:23:16.598 "data_size": 63488 00:23:16.598 } 00:23:16.598 ] 00:23:16.598 }' 00:23:16.598 17:35:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:16.598 17:35:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:17.168 17:35:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:17.168 17:35:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:17.168 17:35:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:17.168 17:35:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:17.168 17:35:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:17.168 17:35:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.168 17:35:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:17.429 17:35:28 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:17.429 "name": "raid_bdev1", 00:23:17.429 "uuid": "85057e46-cc29-4581-89e4-8328f1211f67", 00:23:17.429 "strip_size_kb": 0, 00:23:17.429 "state": "online", 00:23:17.429 "raid_level": "raid1", 00:23:17.429 "superblock": true, 00:23:17.429 "num_base_bdevs": 4, 00:23:17.429 "num_base_bdevs_discovered": 3, 00:23:17.429 "num_base_bdevs_operational": 3, 00:23:17.429 "base_bdevs_list": [ 00:23:17.429 { 00:23:17.429 "name": null, 00:23:17.429 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:17.429 "is_configured": false, 00:23:17.429 "data_offset": 2048, 00:23:17.429 "data_size": 63488 00:23:17.429 }, 00:23:17.429 { 00:23:17.429 "name": "BaseBdev2", 00:23:17.429 "uuid": "99b5116a-7f08-5d3c-8916-997ce49e4611", 00:23:17.429 "is_configured": true, 00:23:17.429 "data_offset": 2048, 00:23:17.429 "data_size": 63488 00:23:17.429 }, 00:23:17.429 { 00:23:17.429 "name": "BaseBdev3", 00:23:17.429 "uuid": "d0470fb9-68f1-51b6-b36a-741188d31865", 00:23:17.429 "is_configured": true, 00:23:17.429 "data_offset": 2048, 00:23:17.429 "data_size": 63488 00:23:17.429 }, 00:23:17.429 { 00:23:17.429 "name": "BaseBdev4", 00:23:17.429 "uuid": "7d4215ed-4403-5832-b132-8b9fb8fa6457", 00:23:17.429 "is_configured": true, 00:23:17.429 "data_offset": 2048, 00:23:17.429 "data_size": 63488 00:23:17.429 } 00:23:17.429 ] 00:23:17.429 }' 00:23:17.429 17:35:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:17.429 17:35:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:17.429 17:35:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:17.429 17:35:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:17.429 17:35:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:17.689 [2024-07-15 17:35:28.857240] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:17.689 17:35:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:17.689 [2024-07-15 17:35:28.907320] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20d8f30 00:23:17.689 [2024-07-15 17:35:28.908488] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:17.949 [2024-07-15 17:35:29.015956] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:17.949 [2024-07-15 17:35:29.016555] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:18.210 [2024-07-15 17:35:29.248443] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:18.210 [2024-07-15 17:35:29.248557] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:18.210 [2024-07-15 17:35:29.490063] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:18.210 [2024-07-15 17:35:29.490320] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:18.781 [2024-07-15 17:35:29.858147] 
bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:18.781 17:35:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:18.781 17:35:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:18.781 17:35:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:18.781 17:35:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:18.781 17:35:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:18.781 17:35:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:18.781 17:35:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:19.041 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:19.041 "name": "raid_bdev1", 00:23:19.041 "uuid": "85057e46-cc29-4581-89e4-8328f1211f67", 00:23:19.041 "strip_size_kb": 0, 00:23:19.041 "state": "online", 00:23:19.041 "raid_level": "raid1", 00:23:19.041 "superblock": true, 00:23:19.041 "num_base_bdevs": 4, 00:23:19.041 "num_base_bdevs_discovered": 4, 00:23:19.041 "num_base_bdevs_operational": 4, 00:23:19.041 "process": { 00:23:19.041 "type": "rebuild", 00:23:19.041 "target": "spare", 00:23:19.041 "progress": { 00:23:19.041 "blocks": 18432, 00:23:19.041 "percent": 29 00:23:19.041 } 00:23:19.041 }, 00:23:19.041 "base_bdevs_list": [ 00:23:19.041 { 00:23:19.041 "name": "spare", 00:23:19.041 "uuid": "7736813e-5b92-5aa6-b409-614f993df0fe", 00:23:19.041 "is_configured": true, 00:23:19.041 "data_offset": 2048, 00:23:19.041 "data_size": 63488 00:23:19.041 }, 00:23:19.041 { 00:23:19.042 "name": "BaseBdev2", 00:23:19.042 "uuid": "99b5116a-7f08-5d3c-8916-997ce49e4611", 00:23:19.042 "is_configured": true, 00:23:19.042 "data_offset": 2048, 00:23:19.042 "data_size": 63488 00:23:19.042 }, 00:23:19.042 { 00:23:19.042 "name": "BaseBdev3", 00:23:19.042 "uuid": "d0470fb9-68f1-51b6-b36a-741188d31865", 00:23:19.042 "is_configured": true, 00:23:19.042 "data_offset": 2048, 00:23:19.042 "data_size": 63488 00:23:19.042 }, 00:23:19.042 { 00:23:19.042 "name": "BaseBdev4", 00:23:19.042 "uuid": "7d4215ed-4403-5832-b132-8b9fb8fa6457", 00:23:19.042 "is_configured": true, 00:23:19.042 "data_offset": 2048, 00:23:19.042 "data_size": 63488 00:23:19.042 } 00:23:19.042 ] 00:23:19.042 }' 00:23:19.042 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:19.042 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:19.042 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:19.042 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:19.042 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:23:19.042 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:23:19.042 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:23:19.042 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # 
local num_base_bdevs_operational=4 00:23:19.042 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:19.042 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:23:19.042 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:23:19.302 [2024-07-15 17:35:30.374594] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:19.563 [2024-07-15 17:35:30.649060] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x20411a0 00:23:19.563 [2024-07-15 17:35:30.649081] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x20d8f30 00:23:19.563 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:23:19.563 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:23:19.563 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:19.563 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:19.563 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:19.563 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:19.563 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:19.563 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:19.563 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:19.563 [2024-07-15 17:35:30.773403] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:23:19.824 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:19.824 "name": "raid_bdev1", 00:23:19.824 "uuid": "85057e46-cc29-4581-89e4-8328f1211f67", 00:23:19.824 "strip_size_kb": 0, 00:23:19.824 "state": "online", 00:23:19.824 "raid_level": "raid1", 00:23:19.824 "superblock": true, 00:23:19.824 "num_base_bdevs": 4, 00:23:19.824 "num_base_bdevs_discovered": 3, 00:23:19.824 "num_base_bdevs_operational": 3, 00:23:19.824 "process": { 00:23:19.824 "type": "rebuild", 00:23:19.824 "target": "spare", 00:23:19.824 "progress": { 00:23:19.824 "blocks": 26624, 00:23:19.824 "percent": 41 00:23:19.824 } 00:23:19.824 }, 00:23:19.824 "base_bdevs_list": [ 00:23:19.824 { 00:23:19.824 "name": "spare", 00:23:19.824 "uuid": "7736813e-5b92-5aa6-b409-614f993df0fe", 00:23:19.824 "is_configured": true, 00:23:19.824 "data_offset": 2048, 00:23:19.824 "data_size": 63488 00:23:19.824 }, 00:23:19.824 { 00:23:19.824 "name": null, 00:23:19.824 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:19.824 "is_configured": false, 00:23:19.824 "data_offset": 2048, 00:23:19.824 "data_size": 63488 00:23:19.824 }, 00:23:19.824 { 00:23:19.824 "name": "BaseBdev3", 00:23:19.824 "uuid": "d0470fb9-68f1-51b6-b36a-741188d31865", 00:23:19.824 "is_configured": true, 00:23:19.824 "data_offset": 2048, 00:23:19.824 "data_size": 63488 00:23:19.824 }, 00:23:19.824 { 00:23:19.824 "name": "BaseBdev4", 
00:23:19.824 "uuid": "7d4215ed-4403-5832-b132-8b9fb8fa6457", 00:23:19.824 "is_configured": true, 00:23:19.824 "data_offset": 2048, 00:23:19.824 "data_size": 63488 00:23:19.824 } 00:23:19.824 ] 00:23:19.824 }' 00:23:19.824 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:19.824 [2024-07-15 17:35:30.882382] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:23:19.824 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:19.824 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:19.824 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:19.824 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=824 00:23:19.824 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:19.824 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:19.824 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:19.824 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:19.824 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:19.824 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:19.824 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:19.824 17:35:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:20.085 17:35:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:20.085 "name": "raid_bdev1", 00:23:20.085 "uuid": "85057e46-cc29-4581-89e4-8328f1211f67", 00:23:20.085 "strip_size_kb": 0, 00:23:20.085 "state": "online", 00:23:20.085 "raid_level": "raid1", 00:23:20.085 "superblock": true, 00:23:20.085 "num_base_bdevs": 4, 00:23:20.085 "num_base_bdevs_discovered": 3, 00:23:20.085 "num_base_bdevs_operational": 3, 00:23:20.085 "process": { 00:23:20.085 "type": "rebuild", 00:23:20.085 "target": "spare", 00:23:20.085 "progress": { 00:23:20.085 "blocks": 30720, 00:23:20.085 "percent": 48 00:23:20.085 } 00:23:20.085 }, 00:23:20.085 "base_bdevs_list": [ 00:23:20.085 { 00:23:20.085 "name": "spare", 00:23:20.085 "uuid": "7736813e-5b92-5aa6-b409-614f993df0fe", 00:23:20.085 "is_configured": true, 00:23:20.085 "data_offset": 2048, 00:23:20.085 "data_size": 63488 00:23:20.085 }, 00:23:20.085 { 00:23:20.085 "name": null, 00:23:20.085 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:20.085 "is_configured": false, 00:23:20.085 "data_offset": 2048, 00:23:20.085 "data_size": 63488 00:23:20.085 }, 00:23:20.085 { 00:23:20.085 "name": "BaseBdev3", 00:23:20.085 "uuid": "d0470fb9-68f1-51b6-b36a-741188d31865", 00:23:20.085 "is_configured": true, 00:23:20.085 "data_offset": 2048, 00:23:20.085 "data_size": 63488 00:23:20.085 }, 00:23:20.085 { 00:23:20.085 "name": "BaseBdev4", 00:23:20.085 "uuid": "7d4215ed-4403-5832-b132-8b9fb8fa6457", 00:23:20.085 "is_configured": true, 00:23:20.085 "data_offset": 2048, 00:23:20.085 "data_size": 63488 
00:23:20.086 } 00:23:20.086 ] 00:23:20.086 }' 00:23:20.086 17:35:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:20.086 17:35:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:20.086 17:35:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:20.086 17:35:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:20.086 17:35:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:20.346 [2024-07-15 17:35:31.629252] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:23:20.919 [2024-07-15 17:35:31.967258] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:23:21.179 17:35:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:21.179 17:35:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:21.179 17:35:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:21.179 17:35:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:21.179 17:35:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:21.179 17:35:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:21.179 17:35:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:21.179 17:35:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:21.179 17:35:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:21.179 "name": "raid_bdev1", 00:23:21.179 "uuid": "85057e46-cc29-4581-89e4-8328f1211f67", 00:23:21.179 "strip_size_kb": 0, 00:23:21.179 "state": "online", 00:23:21.179 "raid_level": "raid1", 00:23:21.179 "superblock": true, 00:23:21.179 "num_base_bdevs": 4, 00:23:21.179 "num_base_bdevs_discovered": 3, 00:23:21.179 "num_base_bdevs_operational": 3, 00:23:21.179 "process": { 00:23:21.179 "type": "rebuild", 00:23:21.179 "target": "spare", 00:23:21.179 "progress": { 00:23:21.179 "blocks": 53248, 00:23:21.179 "percent": 83 00:23:21.179 } 00:23:21.179 }, 00:23:21.179 "base_bdevs_list": [ 00:23:21.179 { 00:23:21.179 "name": "spare", 00:23:21.179 "uuid": "7736813e-5b92-5aa6-b409-614f993df0fe", 00:23:21.179 "is_configured": true, 00:23:21.179 "data_offset": 2048, 00:23:21.179 "data_size": 63488 00:23:21.179 }, 00:23:21.179 { 00:23:21.179 "name": null, 00:23:21.179 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:21.179 "is_configured": false, 00:23:21.179 "data_offset": 2048, 00:23:21.179 "data_size": 63488 00:23:21.179 }, 00:23:21.179 { 00:23:21.179 "name": "BaseBdev3", 00:23:21.179 "uuid": "d0470fb9-68f1-51b6-b36a-741188d31865", 00:23:21.179 "is_configured": true, 00:23:21.179 "data_offset": 2048, 00:23:21.179 "data_size": 63488 00:23:21.179 }, 00:23:21.179 { 00:23:21.179 "name": "BaseBdev4", 00:23:21.179 "uuid": "7d4215ed-4403-5832-b132-8b9fb8fa6457", 00:23:21.179 "is_configured": true, 00:23:21.179 "data_offset": 2048, 00:23:21.179 "data_size": 63488 00:23:21.179 } 
00:23:21.179 ] 00:23:21.179 }' 00:23:21.179 17:35:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:21.440 17:35:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:21.440 17:35:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:21.440 17:35:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:21.440 17:35:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:21.440 [2024-07-15 17:35:32.624565] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:23:21.700 [2024-07-15 17:35:32.956645] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:21.961 [2024-07-15 17:35:33.056903] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:21.961 [2024-07-15 17:35:33.065284] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:22.531 17:35:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:22.531 17:35:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:22.531 17:35:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:22.531 17:35:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:22.531 17:35:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:22.531 17:35:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:22.531 17:35:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:22.531 17:35:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:22.531 17:35:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:22.531 "name": "raid_bdev1", 00:23:22.531 "uuid": "85057e46-cc29-4581-89e4-8328f1211f67", 00:23:22.531 "strip_size_kb": 0, 00:23:22.531 "state": "online", 00:23:22.531 "raid_level": "raid1", 00:23:22.531 "superblock": true, 00:23:22.531 "num_base_bdevs": 4, 00:23:22.531 "num_base_bdevs_discovered": 3, 00:23:22.531 "num_base_bdevs_operational": 3, 00:23:22.531 "base_bdevs_list": [ 00:23:22.531 { 00:23:22.531 "name": "spare", 00:23:22.531 "uuid": "7736813e-5b92-5aa6-b409-614f993df0fe", 00:23:22.531 "is_configured": true, 00:23:22.531 "data_offset": 2048, 00:23:22.531 "data_size": 63488 00:23:22.531 }, 00:23:22.531 { 00:23:22.531 "name": null, 00:23:22.531 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:22.531 "is_configured": false, 00:23:22.531 "data_offset": 2048, 00:23:22.531 "data_size": 63488 00:23:22.531 }, 00:23:22.531 { 00:23:22.531 "name": "BaseBdev3", 00:23:22.531 "uuid": "d0470fb9-68f1-51b6-b36a-741188d31865", 00:23:22.531 "is_configured": true, 00:23:22.531 "data_offset": 2048, 00:23:22.531 "data_size": 63488 00:23:22.531 }, 00:23:22.531 { 00:23:22.531 "name": "BaseBdev4", 00:23:22.531 "uuid": "7d4215ed-4403-5832-b132-8b9fb8fa6457", 00:23:22.531 "is_configured": true, 00:23:22.531 "data_offset": 2048, 00:23:22.531 "data_size": 63488 00:23:22.531 } 
00:23:22.531 ] 00:23:22.531 }' 00:23:22.531 17:35:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:22.531 17:35:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:22.531 17:35:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:22.793 17:35:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:22.793 17:35:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:23:22.793 17:35:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:22.793 17:35:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:22.793 17:35:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:22.793 17:35:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:22.793 17:35:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:22.793 17:35:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:22.793 17:35:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:22.793 17:35:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:22.793 "name": "raid_bdev1", 00:23:22.793 "uuid": "85057e46-cc29-4581-89e4-8328f1211f67", 00:23:22.793 "strip_size_kb": 0, 00:23:22.793 "state": "online", 00:23:22.793 "raid_level": "raid1", 00:23:22.793 "superblock": true, 00:23:22.793 "num_base_bdevs": 4, 00:23:22.793 "num_base_bdevs_discovered": 3, 00:23:22.793 "num_base_bdevs_operational": 3, 00:23:22.793 "base_bdevs_list": [ 00:23:22.793 { 00:23:22.793 "name": "spare", 00:23:22.793 "uuid": "7736813e-5b92-5aa6-b409-614f993df0fe", 00:23:22.793 "is_configured": true, 00:23:22.793 "data_offset": 2048, 00:23:22.793 "data_size": 63488 00:23:22.793 }, 00:23:22.793 { 00:23:22.793 "name": null, 00:23:22.793 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:22.793 "is_configured": false, 00:23:22.793 "data_offset": 2048, 00:23:22.793 "data_size": 63488 00:23:22.793 }, 00:23:22.793 { 00:23:22.793 "name": "BaseBdev3", 00:23:22.793 "uuid": "d0470fb9-68f1-51b6-b36a-741188d31865", 00:23:22.793 "is_configured": true, 00:23:22.793 "data_offset": 2048, 00:23:22.793 "data_size": 63488 00:23:22.793 }, 00:23:22.793 { 00:23:22.793 "name": "BaseBdev4", 00:23:22.793 "uuid": "7d4215ed-4403-5832-b132-8b9fb8fa6457", 00:23:22.793 "is_configured": true, 00:23:22.793 "data_offset": 2048, 00:23:22.793 "data_size": 63488 00:23:22.793 } 00:23:22.793 ] 00:23:22.793 }' 00:23:22.793 17:35:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:22.793 17:35:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:22.793 17:35:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:23.054 17:35:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:23.054 17:35:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:23.054 17:35:34 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:23.054 17:35:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:23.054 17:35:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:23.054 17:35:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:23.054 17:35:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:23.054 17:35:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:23.054 17:35:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:23.054 17:35:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:23.054 17:35:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:23.054 17:35:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:23.054 17:35:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:23.054 17:35:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:23.054 "name": "raid_bdev1", 00:23:23.054 "uuid": "85057e46-cc29-4581-89e4-8328f1211f67", 00:23:23.054 "strip_size_kb": 0, 00:23:23.054 "state": "online", 00:23:23.054 "raid_level": "raid1", 00:23:23.054 "superblock": true, 00:23:23.054 "num_base_bdevs": 4, 00:23:23.054 "num_base_bdevs_discovered": 3, 00:23:23.054 "num_base_bdevs_operational": 3, 00:23:23.054 "base_bdevs_list": [ 00:23:23.054 { 00:23:23.054 "name": "spare", 00:23:23.054 "uuid": "7736813e-5b92-5aa6-b409-614f993df0fe", 00:23:23.054 "is_configured": true, 00:23:23.054 "data_offset": 2048, 00:23:23.054 "data_size": 63488 00:23:23.054 }, 00:23:23.054 { 00:23:23.054 "name": null, 00:23:23.054 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:23.054 "is_configured": false, 00:23:23.054 "data_offset": 2048, 00:23:23.054 "data_size": 63488 00:23:23.054 }, 00:23:23.054 { 00:23:23.054 "name": "BaseBdev3", 00:23:23.054 "uuid": "d0470fb9-68f1-51b6-b36a-741188d31865", 00:23:23.054 "is_configured": true, 00:23:23.054 "data_offset": 2048, 00:23:23.054 "data_size": 63488 00:23:23.054 }, 00:23:23.054 { 00:23:23.054 "name": "BaseBdev4", 00:23:23.054 "uuid": "7d4215ed-4403-5832-b132-8b9fb8fa6457", 00:23:23.054 "is_configured": true, 00:23:23.054 "data_offset": 2048, 00:23:23.054 "data_size": 63488 00:23:23.054 } 00:23:23.054 ] 00:23:23.054 }' 00:23:23.054 17:35:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:23.054 17:35:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:23.623 17:35:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:23.884 [2024-07-15 17:35:35.016483] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:23.884 [2024-07-15 17:35:35.016503] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:23.884 00:23:23.884 Latency(us) 00:23:23.884 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:23.884 Job: raid_bdev1 (Core Mask 0x1, 
workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:23:23.884 raid_bdev1 : 10.27 111.22 333.66 0.00 0.00 11884.80 253.64 114536.76 00:23:23.885 =================================================================================================================== 00:23:23.885 Total : 111.22 333.66 0.00 0.00 11884.80 253.64 114536.76 00:23:23.885 [2024-07-15 17:35:35.119955] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:23.885 [2024-07-15 17:35:35.119978] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:23.885 [2024-07-15 17:35:35.120056] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:23.885 [2024-07-15 17:35:35.120063] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x203bb30 name raid_bdev1, state offline 00:23:23.885 0 00:23:23.885 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:23.885 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:23:24.455 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:24.455 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:24.455 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:23:24.455 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:23:24.455 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:24.455 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:23:24.455 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:24.455 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:24.455 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:24.455 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:23:24.455 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:24.455 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:24.455 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:23:24.715 /dev/nbd0 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- 
# break 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:24.716 1+0 records in 00:23:24.716 1+0 records out 00:23:24.716 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271509 s, 15.1 MB/s 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # continue 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:24.716 17:35:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:23:24.977 /dev/nbd1 00:23:24.977 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:24.977 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:24.977 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:24.977 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@867 -- # local i 00:23:24.977 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:24.977 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:24.977 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:24.977 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:23:24.978 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:24.978 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:24.978 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:24.978 1+0 records in 00:23:24.978 1+0 records out 00:23:24.978 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000253913 s, 16.1 MB/s 00:23:24.978 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:24.978 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:23:24.978 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:24.978 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:24.978 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:23:24.978 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:24.978 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:24.978 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:23:24.978 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:23:24.978 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:24.978 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:23:24.978 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:24.978 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:23:24.978 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:24.978 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:25.238 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:25.238 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:25.238 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:25.238 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:25.238 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:25.238 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:25.238 17:35:36 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:23:25.238 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:25.238 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:23:25.239 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:23:25.239 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:23:25.239 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:25.239 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:23:25.239 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:25.239 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:23:25.239 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:25.239 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:23:25.239 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:25.239 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:25.239 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:23:25.499 /dev/nbd1 00:23:25.499 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:25.499 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:25.499 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:25.499 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:23:25.499 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:25.499 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:25.499 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:25.499 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:23:25.499 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:25.499 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:25.499 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:25.499 1+0 records in 00:23:25.499 1+0 records out 00:23:25.499 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000243108 s, 16.8 MB/s 00:23:25.499 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:25.499 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:23:25.499 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:25.499 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:25.499 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:23:25.499 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:25.499 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:25.499 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:23:25.499 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:23:25.499 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:25.499 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:23:25.499 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:25.499 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:23:25.499 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:25.499 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:25.760 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:25.760 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:25.760 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:25.760 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:25.760 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:25.760 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:25.760 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:23:25.760 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:25.760 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:25.760 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:25.760 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:25.760 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:25.760 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:23:25.760 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:25.760 17:35:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:25.760 17:35:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:25.760 17:35:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:25.760 17:35:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:25.760 17:35:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:25.760 17:35:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # 
(( i <= 20 )) 00:23:25.760 17:35:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:25.760 17:35:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:23:25.760 17:35:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:25.760 17:35:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:23:25.760 17:35:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:26.331 17:35:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:26.591 [2024-07-15 17:35:37.773820] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:26.591 [2024-07-15 17:35:37.773853] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:26.591 [2024-07-15 17:35:37.773866] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x203c3b0 00:23:26.591 [2024-07-15 17:35:37.773873] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:26.591 [2024-07-15 17:35:37.775172] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:26.591 [2024-07-15 17:35:37.775195] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:26.591 [2024-07-15 17:35:37.775253] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:26.591 [2024-07-15 17:35:37.775275] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:26.591 [2024-07-15 17:35:37.775354] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:26.591 [2024-07-15 17:35:37.775409] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:26.591 spare 00:23:26.591 17:35:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:26.591 17:35:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:26.591 17:35:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:26.591 17:35:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:26.591 17:35:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:26.591 17:35:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:26.591 17:35:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:26.591 17:35:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:26.591 17:35:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:26.591 17:35:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:26.591 17:35:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:26.591 17:35:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
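At this point the test swaps the spare passthru bdev out and back in: bdev_passthru_delete drops 'spare', bdev_passthru_create recreates it on top of spare_delay, and the raid module's examine path finds the raid superblock on the new bdev and claims it, after which the usual state check runs. Condensed below from the trace (the rpc.py subcommands are verbatim; how spare_delay itself was created is not shown in this part of the log):

    # bash -- commands as they appear in the trace, run against the raid test app's RPC socket.
    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    $rpc bdev_passthru_delete spare                    # tear down the old 'spare' vbdev
    $rpc bdev_passthru_create -b spare_delay -p spare  # recreate it on top of spare_delay
    # examine finds the raid superblock on 'spare' and claims it for the raid bdev,
    # then the state is re-verified:
    $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'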
00:23:26.591 [2024-07-15 17:35:37.875702] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2040190 00:23:26.591 [2024-07-15 17:35:37.875715] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:26.591 [2024-07-15 17:35:37.875867] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x203f080 00:23:26.591 [2024-07-15 17:35:37.875980] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2040190 00:23:26.591 [2024-07-15 17:35:37.875986] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2040190 00:23:26.591 [2024-07-15 17:35:37.876067] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:26.851 17:35:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:26.851 "name": "raid_bdev1", 00:23:26.851 "uuid": "85057e46-cc29-4581-89e4-8328f1211f67", 00:23:26.851 "strip_size_kb": 0, 00:23:26.851 "state": "online", 00:23:26.851 "raid_level": "raid1", 00:23:26.851 "superblock": true, 00:23:26.851 "num_base_bdevs": 4, 00:23:26.851 "num_base_bdevs_discovered": 3, 00:23:26.851 "num_base_bdevs_operational": 3, 00:23:26.851 "base_bdevs_list": [ 00:23:26.851 { 00:23:26.851 "name": "spare", 00:23:26.851 "uuid": "7736813e-5b92-5aa6-b409-614f993df0fe", 00:23:26.851 "is_configured": true, 00:23:26.851 "data_offset": 2048, 00:23:26.851 "data_size": 63488 00:23:26.851 }, 00:23:26.851 { 00:23:26.851 "name": null, 00:23:26.851 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:26.851 "is_configured": false, 00:23:26.851 "data_offset": 2048, 00:23:26.852 "data_size": 63488 00:23:26.852 }, 00:23:26.852 { 00:23:26.852 "name": "BaseBdev3", 00:23:26.852 "uuid": "d0470fb9-68f1-51b6-b36a-741188d31865", 00:23:26.852 "is_configured": true, 00:23:26.852 "data_offset": 2048, 00:23:26.852 "data_size": 63488 00:23:26.852 }, 00:23:26.852 { 00:23:26.852 "name": "BaseBdev4", 00:23:26.852 "uuid": "7d4215ed-4403-5832-b132-8b9fb8fa6457", 00:23:26.852 "is_configured": true, 00:23:26.852 "data_offset": 2048, 00:23:26.852 "data_size": 63488 00:23:26.852 } 00:23:26.852 ] 00:23:26.852 }' 00:23:26.852 17:35:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:26.852 17:35:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:27.422 17:35:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:27.422 17:35:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:27.422 17:35:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:27.422 17:35:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:27.422 17:35:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:27.422 17:35:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:27.422 17:35:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:27.422 17:35:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:27.422 "name": "raid_bdev1", 00:23:27.422 "uuid": "85057e46-cc29-4581-89e4-8328f1211f67", 00:23:27.422 "strip_size_kb": 0, 00:23:27.422 "state": "online", 
00:23:27.422 "raid_level": "raid1", 00:23:27.422 "superblock": true, 00:23:27.422 "num_base_bdevs": 4, 00:23:27.422 "num_base_bdevs_discovered": 3, 00:23:27.422 "num_base_bdevs_operational": 3, 00:23:27.422 "base_bdevs_list": [ 00:23:27.422 { 00:23:27.422 "name": "spare", 00:23:27.422 "uuid": "7736813e-5b92-5aa6-b409-614f993df0fe", 00:23:27.422 "is_configured": true, 00:23:27.422 "data_offset": 2048, 00:23:27.422 "data_size": 63488 00:23:27.422 }, 00:23:27.422 { 00:23:27.422 "name": null, 00:23:27.422 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:27.422 "is_configured": false, 00:23:27.422 "data_offset": 2048, 00:23:27.422 "data_size": 63488 00:23:27.422 }, 00:23:27.422 { 00:23:27.422 "name": "BaseBdev3", 00:23:27.422 "uuid": "d0470fb9-68f1-51b6-b36a-741188d31865", 00:23:27.422 "is_configured": true, 00:23:27.422 "data_offset": 2048, 00:23:27.422 "data_size": 63488 00:23:27.422 }, 00:23:27.422 { 00:23:27.422 "name": "BaseBdev4", 00:23:27.422 "uuid": "7d4215ed-4403-5832-b132-8b9fb8fa6457", 00:23:27.422 "is_configured": true, 00:23:27.422 "data_offset": 2048, 00:23:27.422 "data_size": 63488 00:23:27.422 } 00:23:27.422 ] 00:23:27.422 }' 00:23:27.422 17:35:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:27.683 17:35:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:27.683 17:35:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:27.683 17:35:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:27.683 17:35:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:27.683 17:35:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:23:27.944 17:35:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:23:27.944 17:35:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:27.944 [2024-07-15 17:35:39.177624] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:27.944 17:35:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:27.944 17:35:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:27.944 17:35:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:27.944 17:35:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:27.944 17:35:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:27.944 17:35:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:27.944 17:35:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:27.944 17:35:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:27.944 17:35:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:27.944 17:35:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:27.944 17:35:39 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:27.944 17:35:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:28.204 17:35:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:28.204 "name": "raid_bdev1", 00:23:28.204 "uuid": "85057e46-cc29-4581-89e4-8328f1211f67", 00:23:28.204 "strip_size_kb": 0, 00:23:28.204 "state": "online", 00:23:28.204 "raid_level": "raid1", 00:23:28.204 "superblock": true, 00:23:28.204 "num_base_bdevs": 4, 00:23:28.204 "num_base_bdevs_discovered": 2, 00:23:28.204 "num_base_bdevs_operational": 2, 00:23:28.204 "base_bdevs_list": [ 00:23:28.204 { 00:23:28.204 "name": null, 00:23:28.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:28.204 "is_configured": false, 00:23:28.204 "data_offset": 2048, 00:23:28.204 "data_size": 63488 00:23:28.204 }, 00:23:28.204 { 00:23:28.204 "name": null, 00:23:28.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:28.204 "is_configured": false, 00:23:28.204 "data_offset": 2048, 00:23:28.204 "data_size": 63488 00:23:28.204 }, 00:23:28.204 { 00:23:28.204 "name": "BaseBdev3", 00:23:28.204 "uuid": "d0470fb9-68f1-51b6-b36a-741188d31865", 00:23:28.204 "is_configured": true, 00:23:28.204 "data_offset": 2048, 00:23:28.204 "data_size": 63488 00:23:28.204 }, 00:23:28.204 { 00:23:28.204 "name": "BaseBdev4", 00:23:28.204 "uuid": "7d4215ed-4403-5832-b132-8b9fb8fa6457", 00:23:28.204 "is_configured": true, 00:23:28.204 "data_offset": 2048, 00:23:28.204 "data_size": 63488 00:23:28.204 } 00:23:28.204 ] 00:23:28.204 }' 00:23:28.204 17:35:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:28.204 17:35:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:28.774 17:35:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:29.036 [2024-07-15 17:35:40.112111] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:29.036 [2024-07-15 17:35:40.112231] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:23:29.036 [2024-07-15 17:35:40.112241] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
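At this point the spare base bdev has just been re-added to raid_bdev1 and the raid module starts a rebuild; the checks that follow (verify_raid_bdev_process, verify_raid_bdev_state) all reduce to dumping the raid bdevs over the test's RPC socket and filtering the JSON with jq. A minimal sketch of that pattern, reusing only the rpc.py calls and jq filters visible in this trace (the socket path and bdev names are taken from the log; this is not the test script itself):

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # re-add the removed base bdev; the raid module re-reads its superblock and starts a rebuild
  $rpc bdev_raid_add_base_bdev raid_bdev1 spare
  # dump all raid bdevs and keep only raid_bdev1
  info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
  # while the rebuild runs, process.type is "rebuild" and process.target is "spare";
  # once it finishes (or is aborted), both fall back to "none"
  echo "$info" | jq -r '.process.type // "none"'
  echo "$info" | jq -r '.process.target // "none"'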
00:23:29.036 [2024-07-15 17:35:40.112261] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:29.036 [2024-07-15 17:35:40.115197] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21dabf0 00:23:29.036 [2024-07-15 17:35:40.116769] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:29.036 17:35:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:23:30.012 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:30.012 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:30.012 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:30.012 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:30.012 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:30.012 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:30.012 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:30.272 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:30.272 "name": "raid_bdev1", 00:23:30.272 "uuid": "85057e46-cc29-4581-89e4-8328f1211f67", 00:23:30.272 "strip_size_kb": 0, 00:23:30.272 "state": "online", 00:23:30.272 "raid_level": "raid1", 00:23:30.272 "superblock": true, 00:23:30.272 "num_base_bdevs": 4, 00:23:30.272 "num_base_bdevs_discovered": 3, 00:23:30.272 "num_base_bdevs_operational": 3, 00:23:30.272 "process": { 00:23:30.272 "type": "rebuild", 00:23:30.272 "target": "spare", 00:23:30.272 "progress": { 00:23:30.272 "blocks": 22528, 00:23:30.272 "percent": 35 00:23:30.272 } 00:23:30.272 }, 00:23:30.272 "base_bdevs_list": [ 00:23:30.272 { 00:23:30.272 "name": "spare", 00:23:30.272 "uuid": "7736813e-5b92-5aa6-b409-614f993df0fe", 00:23:30.272 "is_configured": true, 00:23:30.272 "data_offset": 2048, 00:23:30.272 "data_size": 63488 00:23:30.272 }, 00:23:30.272 { 00:23:30.272 "name": null, 00:23:30.272 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:30.272 "is_configured": false, 00:23:30.272 "data_offset": 2048, 00:23:30.272 "data_size": 63488 00:23:30.272 }, 00:23:30.272 { 00:23:30.272 "name": "BaseBdev3", 00:23:30.272 "uuid": "d0470fb9-68f1-51b6-b36a-741188d31865", 00:23:30.272 "is_configured": true, 00:23:30.272 "data_offset": 2048, 00:23:30.272 "data_size": 63488 00:23:30.272 }, 00:23:30.272 { 00:23:30.272 "name": "BaseBdev4", 00:23:30.272 "uuid": "7d4215ed-4403-5832-b132-8b9fb8fa6457", 00:23:30.272 "is_configured": true, 00:23:30.272 "data_offset": 2048, 00:23:30.272 "data_size": 63488 00:23:30.272 } 00:23:30.272 ] 00:23:30.272 }' 00:23:30.272 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:30.272 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:30.273 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:30.273 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:30.273 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:30.532 [2024-07-15 17:35:41.597727] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:30.532 [2024-07-15 17:35:41.625759] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:30.532 [2024-07-15 17:35:41.625791] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:30.532 [2024-07-15 17:35:41.625802] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:30.532 [2024-07-15 17:35:41.625806] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:30.532 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:30.532 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:30.532 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:30.532 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:30.532 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:30.532 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:30.532 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:30.532 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:30.532 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:30.532 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:30.532 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:30.532 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:30.792 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:30.792 "name": "raid_bdev1", 00:23:30.792 "uuid": "85057e46-cc29-4581-89e4-8328f1211f67", 00:23:30.792 "strip_size_kb": 0, 00:23:30.792 "state": "online", 00:23:30.792 "raid_level": "raid1", 00:23:30.792 "superblock": true, 00:23:30.792 "num_base_bdevs": 4, 00:23:30.792 "num_base_bdevs_discovered": 2, 00:23:30.792 "num_base_bdevs_operational": 2, 00:23:30.792 "base_bdevs_list": [ 00:23:30.792 { 00:23:30.792 "name": null, 00:23:30.792 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:30.793 "is_configured": false, 00:23:30.793 "data_offset": 2048, 00:23:30.793 "data_size": 63488 00:23:30.793 }, 00:23:30.793 { 00:23:30.793 "name": null, 00:23:30.793 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:30.793 "is_configured": false, 00:23:30.793 "data_offset": 2048, 00:23:30.793 "data_size": 63488 00:23:30.793 }, 00:23:30.793 { 00:23:30.793 "name": "BaseBdev3", 00:23:30.793 "uuid": "d0470fb9-68f1-51b6-b36a-741188d31865", 00:23:30.793 "is_configured": true, 00:23:30.793 "data_offset": 2048, 00:23:30.793 "data_size": 63488 00:23:30.793 }, 00:23:30.793 { 00:23:30.793 "name": "BaseBdev4", 00:23:30.793 "uuid": "7d4215ed-4403-5832-b132-8b9fb8fa6457", 00:23:30.793 "is_configured": true, 
00:23:30.793 "data_offset": 2048, 00:23:30.793 "data_size": 63488 00:23:30.793 } 00:23:30.793 ] 00:23:30.793 }' 00:23:30.793 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:30.793 17:35:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:31.361 17:35:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:31.361 [2024-07-15 17:35:42.548193] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:31.361 [2024-07-15 17:35:42.548226] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:31.361 [2024-07-15 17:35:42.548241] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2058170 00:23:31.361 [2024-07-15 17:35:42.548248] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:31.361 [2024-07-15 17:35:42.548547] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:31.361 [2024-07-15 17:35:42.548560] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:31.361 [2024-07-15 17:35:42.548619] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:31.361 [2024-07-15 17:35:42.548626] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:23:31.361 [2024-07-15 17:35:42.548632] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:23:31.361 [2024-07-15 17:35:42.548645] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:31.361 [2024-07-15 17:35:42.551610] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20424f0 00:23:31.361 [2024-07-15 17:35:42.552768] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:31.361 spare 00:23:31.361 17:35:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:23:32.298 17:35:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:32.298 17:35:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:32.298 17:35:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:32.298 17:35:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:32.298 17:35:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:32.298 17:35:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:32.298 17:35:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:32.558 17:35:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:32.558 "name": "raid_bdev1", 00:23:32.558 "uuid": "85057e46-cc29-4581-89e4-8328f1211f67", 00:23:32.558 "strip_size_kb": 0, 00:23:32.558 "state": "online", 00:23:32.558 "raid_level": "raid1", 00:23:32.558 "superblock": true, 00:23:32.558 "num_base_bdevs": 4, 00:23:32.558 "num_base_bdevs_discovered": 3, 00:23:32.558 
"num_base_bdevs_operational": 3, 00:23:32.558 "process": { 00:23:32.558 "type": "rebuild", 00:23:32.558 "target": "spare", 00:23:32.558 "progress": { 00:23:32.558 "blocks": 22528, 00:23:32.558 "percent": 35 00:23:32.558 } 00:23:32.558 }, 00:23:32.558 "base_bdevs_list": [ 00:23:32.558 { 00:23:32.558 "name": "spare", 00:23:32.558 "uuid": "7736813e-5b92-5aa6-b409-614f993df0fe", 00:23:32.558 "is_configured": true, 00:23:32.558 "data_offset": 2048, 00:23:32.558 "data_size": 63488 00:23:32.558 }, 00:23:32.558 { 00:23:32.558 "name": null, 00:23:32.558 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:32.558 "is_configured": false, 00:23:32.558 "data_offset": 2048, 00:23:32.558 "data_size": 63488 00:23:32.558 }, 00:23:32.558 { 00:23:32.558 "name": "BaseBdev3", 00:23:32.558 "uuid": "d0470fb9-68f1-51b6-b36a-741188d31865", 00:23:32.558 "is_configured": true, 00:23:32.558 "data_offset": 2048, 00:23:32.558 "data_size": 63488 00:23:32.558 }, 00:23:32.558 { 00:23:32.558 "name": "BaseBdev4", 00:23:32.558 "uuid": "7d4215ed-4403-5832-b132-8b9fb8fa6457", 00:23:32.558 "is_configured": true, 00:23:32.558 "data_offset": 2048, 00:23:32.558 "data_size": 63488 00:23:32.558 } 00:23:32.558 ] 00:23:32.558 }' 00:23:32.558 17:35:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:32.558 17:35:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:32.558 17:35:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:32.558 17:35:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:32.558 17:35:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:32.817 [2024-07-15 17:35:44.009584] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:32.817 [2024-07-15 17:35:44.061681] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:32.817 [2024-07-15 17:35:44.061718] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:32.817 [2024-07-15 17:35:44.061728] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:32.817 [2024-07-15 17:35:44.061733] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:32.817 17:35:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:32.817 17:35:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:32.817 17:35:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:32.817 17:35:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:32.817 17:35:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:32.817 17:35:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:32.817 17:35:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:32.817 17:35:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:32.817 17:35:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:32.817 
17:35:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:32.817 17:35:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:32.817 17:35:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:33.077 17:35:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:33.077 "name": "raid_bdev1", 00:23:33.077 "uuid": "85057e46-cc29-4581-89e4-8328f1211f67", 00:23:33.077 "strip_size_kb": 0, 00:23:33.077 "state": "online", 00:23:33.077 "raid_level": "raid1", 00:23:33.077 "superblock": true, 00:23:33.077 "num_base_bdevs": 4, 00:23:33.077 "num_base_bdevs_discovered": 2, 00:23:33.077 "num_base_bdevs_operational": 2, 00:23:33.077 "base_bdevs_list": [ 00:23:33.077 { 00:23:33.077 "name": null, 00:23:33.077 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:33.077 "is_configured": false, 00:23:33.077 "data_offset": 2048, 00:23:33.077 "data_size": 63488 00:23:33.077 }, 00:23:33.077 { 00:23:33.077 "name": null, 00:23:33.077 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:33.077 "is_configured": false, 00:23:33.077 "data_offset": 2048, 00:23:33.077 "data_size": 63488 00:23:33.077 }, 00:23:33.077 { 00:23:33.077 "name": "BaseBdev3", 00:23:33.077 "uuid": "d0470fb9-68f1-51b6-b36a-741188d31865", 00:23:33.077 "is_configured": true, 00:23:33.077 "data_offset": 2048, 00:23:33.077 "data_size": 63488 00:23:33.077 }, 00:23:33.077 { 00:23:33.077 "name": "BaseBdev4", 00:23:33.077 "uuid": "7d4215ed-4403-5832-b132-8b9fb8fa6457", 00:23:33.077 "is_configured": true, 00:23:33.077 "data_offset": 2048, 00:23:33.077 "data_size": 63488 00:23:33.077 } 00:23:33.077 ] 00:23:33.077 }' 00:23:33.077 17:35:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:33.077 17:35:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:33.645 17:35:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:33.645 17:35:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:33.645 17:35:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:33.645 17:35:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:33.645 17:35:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:33.645 17:35:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:33.645 17:35:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:33.928 17:35:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:33.928 "name": "raid_bdev1", 00:23:33.928 "uuid": "85057e46-cc29-4581-89e4-8328f1211f67", 00:23:33.928 "strip_size_kb": 0, 00:23:33.928 "state": "online", 00:23:33.928 "raid_level": "raid1", 00:23:33.928 "superblock": true, 00:23:33.928 "num_base_bdevs": 4, 00:23:33.928 "num_base_bdevs_discovered": 2, 00:23:33.928 "num_base_bdevs_operational": 2, 00:23:33.928 "base_bdevs_list": [ 00:23:33.928 { 00:23:33.928 "name": null, 00:23:33.928 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:33.928 
"is_configured": false, 00:23:33.928 "data_offset": 2048, 00:23:33.928 "data_size": 63488 00:23:33.928 }, 00:23:33.928 { 00:23:33.928 "name": null, 00:23:33.928 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:33.928 "is_configured": false, 00:23:33.928 "data_offset": 2048, 00:23:33.928 "data_size": 63488 00:23:33.928 }, 00:23:33.928 { 00:23:33.928 "name": "BaseBdev3", 00:23:33.928 "uuid": "d0470fb9-68f1-51b6-b36a-741188d31865", 00:23:33.928 "is_configured": true, 00:23:33.928 "data_offset": 2048, 00:23:33.928 "data_size": 63488 00:23:33.928 }, 00:23:33.928 { 00:23:33.928 "name": "BaseBdev4", 00:23:33.928 "uuid": "7d4215ed-4403-5832-b132-8b9fb8fa6457", 00:23:33.928 "is_configured": true, 00:23:33.928 "data_offset": 2048, 00:23:33.928 "data_size": 63488 00:23:33.928 } 00:23:33.928 ] 00:23:33.928 }' 00:23:33.928 17:35:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:33.928 17:35:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:33.928 17:35:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:33.928 17:35:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:33.928 17:35:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:23:33.928 17:35:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:34.188 [2024-07-15 17:35:45.353109] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:34.188 [2024-07-15 17:35:45.353140] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:34.188 [2024-07-15 17:35:45.353152] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2041f60 00:23:34.188 [2024-07-15 17:35:45.353159] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:34.188 [2024-07-15 17:35:45.353423] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:34.188 [2024-07-15 17:35:45.353434] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:34.188 [2024-07-15 17:35:45.353479] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:23:34.188 [2024-07-15 17:35:45.353491] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:23:34.189 [2024-07-15 17:35:45.353497] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:34.189 BaseBdev1 00:23:34.189 17:35:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:23:35.126 17:35:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:35.127 17:35:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:35.127 17:35:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:35.127 17:35:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:35.127 17:35:46 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:35.127 17:35:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:35.127 17:35:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:35.127 17:35:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:35.127 17:35:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:35.127 17:35:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:35.127 17:35:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:35.127 17:35:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:35.386 17:35:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:35.386 "name": "raid_bdev1", 00:23:35.386 "uuid": "85057e46-cc29-4581-89e4-8328f1211f67", 00:23:35.386 "strip_size_kb": 0, 00:23:35.386 "state": "online", 00:23:35.386 "raid_level": "raid1", 00:23:35.386 "superblock": true, 00:23:35.386 "num_base_bdevs": 4, 00:23:35.386 "num_base_bdevs_discovered": 2, 00:23:35.386 "num_base_bdevs_operational": 2, 00:23:35.386 "base_bdevs_list": [ 00:23:35.386 { 00:23:35.386 "name": null, 00:23:35.386 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:35.386 "is_configured": false, 00:23:35.386 "data_offset": 2048, 00:23:35.386 "data_size": 63488 00:23:35.386 }, 00:23:35.386 { 00:23:35.386 "name": null, 00:23:35.386 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:35.386 "is_configured": false, 00:23:35.386 "data_offset": 2048, 00:23:35.386 "data_size": 63488 00:23:35.386 }, 00:23:35.386 { 00:23:35.386 "name": "BaseBdev3", 00:23:35.386 "uuid": "d0470fb9-68f1-51b6-b36a-741188d31865", 00:23:35.386 "is_configured": true, 00:23:35.386 "data_offset": 2048, 00:23:35.386 "data_size": 63488 00:23:35.386 }, 00:23:35.386 { 00:23:35.386 "name": "BaseBdev4", 00:23:35.386 "uuid": "7d4215ed-4403-5832-b132-8b9fb8fa6457", 00:23:35.386 "is_configured": true, 00:23:35.386 "data_offset": 2048, 00:23:35.386 "data_size": 63488 00:23:35.386 } 00:23:35.386 ] 00:23:35.386 }' 00:23:35.386 17:35:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:35.386 17:35:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:35.955 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:35.955 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:35.955 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:35.955 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:35.955 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:35.955 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:35.955 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:36.214 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:23:36.214 "name": "raid_bdev1", 00:23:36.214 "uuid": "85057e46-cc29-4581-89e4-8328f1211f67", 00:23:36.214 "strip_size_kb": 0, 00:23:36.214 "state": "online", 00:23:36.214 "raid_level": "raid1", 00:23:36.214 "superblock": true, 00:23:36.214 "num_base_bdevs": 4, 00:23:36.214 "num_base_bdevs_discovered": 2, 00:23:36.214 "num_base_bdevs_operational": 2, 00:23:36.214 "base_bdevs_list": [ 00:23:36.214 { 00:23:36.214 "name": null, 00:23:36.214 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:36.214 "is_configured": false, 00:23:36.214 "data_offset": 2048, 00:23:36.214 "data_size": 63488 00:23:36.214 }, 00:23:36.214 { 00:23:36.214 "name": null, 00:23:36.214 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:36.214 "is_configured": false, 00:23:36.214 "data_offset": 2048, 00:23:36.214 "data_size": 63488 00:23:36.214 }, 00:23:36.214 { 00:23:36.214 "name": "BaseBdev3", 00:23:36.214 "uuid": "d0470fb9-68f1-51b6-b36a-741188d31865", 00:23:36.214 "is_configured": true, 00:23:36.214 "data_offset": 2048, 00:23:36.214 "data_size": 63488 00:23:36.214 }, 00:23:36.214 { 00:23:36.214 "name": "BaseBdev4", 00:23:36.214 "uuid": "7d4215ed-4403-5832-b132-8b9fb8fa6457", 00:23:36.214 "is_configured": true, 00:23:36.214 "data_offset": 2048, 00:23:36.214 "data_size": 63488 00:23:36.214 } 00:23:36.214 ] 00:23:36.214 }' 00:23:36.214 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:36.214 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:36.214 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:36.214 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:36.214 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:36.214 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:23:36.215 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:36.215 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:36.215 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:36.215 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:36.215 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:36.215 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:36.215 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:36.215 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:36.215 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:36.215 
17:35:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:36.474 [2024-07-15 17:35:47.591059] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:36.474 [2024-07-15 17:35:47.591145] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:23:36.474 [2024-07-15 17:35:47.591154] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:36.474 request: 00:23:36.474 { 00:23:36.474 "base_bdev": "BaseBdev1", 00:23:36.474 "raid_bdev": "raid_bdev1", 00:23:36.474 "method": "bdev_raid_add_base_bdev", 00:23:36.474 "req_id": 1 00:23:36.474 } 00:23:36.474 Got JSON-RPC error response 00:23:36.475 response: 00:23:36.475 { 00:23:36.475 "code": -22, 00:23:36.475 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:23:36.475 } 00:23:36.475 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:23:36.475 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:23:36.475 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:36.475 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:36.475 17:35:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:23:37.414 17:35:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:37.414 17:35:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:37.414 17:35:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:37.414 17:35:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:37.414 17:35:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:37.414 17:35:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:37.414 17:35:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:37.414 17:35:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:37.414 17:35:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:37.414 17:35:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:37.414 17:35:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:37.414 17:35:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:37.674 17:35:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:37.674 "name": "raid_bdev1", 00:23:37.674 "uuid": "85057e46-cc29-4581-89e4-8328f1211f67", 00:23:37.674 "strip_size_kb": 0, 00:23:37.674 "state": "online", 00:23:37.674 "raid_level": "raid1", 00:23:37.674 "superblock": true, 00:23:37.674 "num_base_bdevs": 4, 00:23:37.674 "num_base_bdevs_discovered": 2, 00:23:37.674 "num_base_bdevs_operational": 2, 00:23:37.674 "base_bdevs_list": [ 
00:23:37.674 { 00:23:37.674 "name": null, 00:23:37.674 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:37.674 "is_configured": false, 00:23:37.674 "data_offset": 2048, 00:23:37.674 "data_size": 63488 00:23:37.674 }, 00:23:37.674 { 00:23:37.674 "name": null, 00:23:37.674 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:37.674 "is_configured": false, 00:23:37.674 "data_offset": 2048, 00:23:37.674 "data_size": 63488 00:23:37.674 }, 00:23:37.674 { 00:23:37.674 "name": "BaseBdev3", 00:23:37.674 "uuid": "d0470fb9-68f1-51b6-b36a-741188d31865", 00:23:37.674 "is_configured": true, 00:23:37.674 "data_offset": 2048, 00:23:37.674 "data_size": 63488 00:23:37.674 }, 00:23:37.674 { 00:23:37.674 "name": "BaseBdev4", 00:23:37.674 "uuid": "7d4215ed-4403-5832-b132-8b9fb8fa6457", 00:23:37.674 "is_configured": true, 00:23:37.674 "data_offset": 2048, 00:23:37.674 "data_size": 63488 00:23:37.674 } 00:23:37.674 ] 00:23:37.674 }' 00:23:37.674 17:35:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:37.674 17:35:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:38.244 17:35:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:38.244 17:35:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:38.244 17:35:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:38.244 17:35:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:38.244 17:35:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:38.244 17:35:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:38.244 17:35:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:38.244 17:35:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:38.244 "name": "raid_bdev1", 00:23:38.244 "uuid": "85057e46-cc29-4581-89e4-8328f1211f67", 00:23:38.244 "strip_size_kb": 0, 00:23:38.244 "state": "online", 00:23:38.244 "raid_level": "raid1", 00:23:38.244 "superblock": true, 00:23:38.244 "num_base_bdevs": 4, 00:23:38.244 "num_base_bdevs_discovered": 2, 00:23:38.244 "num_base_bdevs_operational": 2, 00:23:38.244 "base_bdevs_list": [ 00:23:38.244 { 00:23:38.244 "name": null, 00:23:38.244 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:38.244 "is_configured": false, 00:23:38.244 "data_offset": 2048, 00:23:38.244 "data_size": 63488 00:23:38.244 }, 00:23:38.244 { 00:23:38.245 "name": null, 00:23:38.245 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:38.245 "is_configured": false, 00:23:38.245 "data_offset": 2048, 00:23:38.245 "data_size": 63488 00:23:38.245 }, 00:23:38.245 { 00:23:38.245 "name": "BaseBdev3", 00:23:38.245 "uuid": "d0470fb9-68f1-51b6-b36a-741188d31865", 00:23:38.245 "is_configured": true, 00:23:38.245 "data_offset": 2048, 00:23:38.245 "data_size": 63488 00:23:38.245 }, 00:23:38.245 { 00:23:38.245 "name": "BaseBdev4", 00:23:38.245 "uuid": "7d4215ed-4403-5832-b132-8b9fb8fa6457", 00:23:38.245 "is_configured": true, 00:23:38.245 "data_offset": 2048, 00:23:38.245 "data_size": 63488 00:23:38.245 } 00:23:38.245 ] 00:23:38.245 }' 00:23:38.245 17:35:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:23:38.506 17:35:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:38.506 17:35:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:38.506 17:35:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:38.506 17:35:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 2887284 00:23:38.506 17:35:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 2887284 ']' 00:23:38.506 17:35:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 2887284 00:23:38.506 17:35:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:23:38.506 17:35:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:38.506 17:35:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2887284 00:23:38.506 17:35:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:38.506 17:35:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:38.506 17:35:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2887284' 00:23:38.506 killing process with pid 2887284 00:23:38.506 17:35:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 2887284 00:23:38.506 Received shutdown signal, test time was about 24.782232 seconds 00:23:38.506 00:23:38.506 Latency(us) 00:23:38.506 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:38.506 =================================================================================================================== 00:23:38.506 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:23:38.506 [2024-07-15 17:35:49.665128] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:38.506 [2024-07-15 17:35:49.665200] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:38.506 [2024-07-15 17:35:49.665242] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:38.506 [2024-07-15 17:35:49.665248] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2040190 name raid_bdev1, state offline 00:23:38.506 17:35:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 2887284 00:23:38.506 [2024-07-15 17:35:49.688245] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:38.767 17:35:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:23:38.767 00:23:38.767 real 0m29.278s 00:23:38.767 user 0m46.526s 00:23:38.767 sys 0m3.500s 00:23:38.767 17:35:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:38.767 17:35:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:38.767 ************************************ 00:23:38.767 END TEST raid_rebuild_test_sb_io 00:23:38.767 ************************************ 00:23:38.767 17:35:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:38.767 17:35:49 bdev_raid -- bdev/bdev_raid.sh@884 -- # '[' n == y ']' 00:23:38.767 17:35:49 bdev_raid -- bdev/bdev_raid.sh@896 -- # base_blocklen=4096 00:23:38.767 17:35:49 bdev_raid -- bdev/bdev_raid.sh@898 -- # run_test 
raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:23:38.767 17:35:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:23:38.767 17:35:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:38.767 17:35:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:38.767 ************************************ 00:23:38.767 START TEST raid_state_function_test_sb_4k 00:23:38.767 ************************************ 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=2892428 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2892428' 00:23:38.767 Process raid pid: 2892428 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 2892428 /var/tmp/spdk-raid.sock 00:23:38.767 
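Here the raid_rebuild_test_sb_io case has finished and the suite moves on to raid_state_function_test_sb_4k: a fresh bdev_svc application is started on a private RPC socket and the whole test is driven through rpc.py. A minimal sketch of that setup, using only the binaries, flags and RPC calls that appear in the surrounding trace; the retry loop with rpc_get_methods is a stand-in for the suite's waitforlisten helper (it does not appear in this log), and the ordering below is simply the shortest working sequence, whereas the test itself deliberately calls bdev_raid_create before the base bdevs exist in order to exercise the "configuring" state:

  spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
  sock=/var/tmp/spdk-raid.sock
  # start the bare bdev service with raid debug logging, as the test harness does
  $spdk/test/app/bdev_svc/bdev_svc -r $sock -i 0 -L bdev_raid &
  raid_pid=$!
  # wait until the RPC socket answers (stand-in for waitforlisten)
  until $spdk/scripts/rpc.py -s $sock rpc_get_methods >/dev/null 2>&1; do sleep 0.1; done
  # two malloc bdevs with a 4096-byte block size serve as the raid1 base bdevs
  $spdk/scripts/rpc.py -s $sock bdev_malloc_create 32 4096 -b BaseBdev1
  $spdk/scripts/rpc.py -s $sock bdev_malloc_create 32 4096 -b BaseBdev2
  # -s asks for an on-disk superblock, matching the "_sb_" flavour of the test
  $spdk/scripts/rpc.py -s $sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid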
17:35:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 2892428 ']' 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:38.767 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:38.767 17:35:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:38.767 [2024-07-15 17:35:49.954071] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:23:38.767 [2024-07-15 17:35:49.954122] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:23:38.767 [2024-07-15 17:35:50.047464] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:39.027 [2024-07-15 17:35:50.119966] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:39.027 [2024-07-15 17:35:50.169810] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:39.027 [2024-07-15 17:35:50.169830] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:39.597 17:35:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:39.597 17:35:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:23:39.597 17:35:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:39.858 [2024-07-15 17:35:50.989311] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:39.858 [2024-07-15 17:35:50.989342] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:39.858 [2024-07-15 17:35:50.989348] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:39.858 [2024-07-15 17:35:50.989354] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:39.858 17:35:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:39.858 17:35:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:39.858 17:35:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:39.858 17:35:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:39.858 17:35:51 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:39.858 17:35:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:39.859 17:35:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:39.859 17:35:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:39.859 17:35:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:39.859 17:35:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:39.859 17:35:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.859 17:35:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:40.121 17:35:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:40.121 "name": "Existed_Raid", 00:23:40.121 "uuid": "e0bd2097-ad37-4772-b264-bb054e90907a", 00:23:40.121 "strip_size_kb": 0, 00:23:40.121 "state": "configuring", 00:23:40.121 "raid_level": "raid1", 00:23:40.121 "superblock": true, 00:23:40.121 "num_base_bdevs": 2, 00:23:40.121 "num_base_bdevs_discovered": 0, 00:23:40.121 "num_base_bdevs_operational": 2, 00:23:40.121 "base_bdevs_list": [ 00:23:40.121 { 00:23:40.121 "name": "BaseBdev1", 00:23:40.121 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:40.121 "is_configured": false, 00:23:40.121 "data_offset": 0, 00:23:40.121 "data_size": 0 00:23:40.121 }, 00:23:40.121 { 00:23:40.121 "name": "BaseBdev2", 00:23:40.121 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:40.121 "is_configured": false, 00:23:40.121 "data_offset": 0, 00:23:40.121 "data_size": 0 00:23:40.121 } 00:23:40.121 ] 00:23:40.121 }' 00:23:40.121 17:35:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:40.121 17:35:51 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:40.691 17:35:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:40.691 [2024-07-15 17:35:51.939605] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:40.691 [2024-07-15 17:35:51.939626] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x226a6b0 name Existed_Raid, state configuring 00:23:40.691 17:35:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:40.951 [2024-07-15 17:35:52.132132] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:40.951 [2024-07-15 17:35:52.132149] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:40.951 [2024-07-15 17:35:52.132154] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:40.951 [2024-07-15 17:35:52.132159] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:40.951 17:35:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:23:41.211 [2024-07-15 17:35:52.315298] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:41.211 BaseBdev1 00:23:41.211 17:35:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:23:41.211 17:35:52 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:23:41.211 17:35:52 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:41.211 17:35:52 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:23:41.211 17:35:52 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:41.211 17:35:52 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:41.211 17:35:52 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:41.211 17:35:52 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:23:41.472 [ 00:23:41.472 { 00:23:41.472 "name": "BaseBdev1", 00:23:41.472 "aliases": [ 00:23:41.472 "a08fd05b-0ae3-4e98-8051-b7b46ea53124" 00:23:41.472 ], 00:23:41.472 "product_name": "Malloc disk", 00:23:41.472 "block_size": 4096, 00:23:41.472 "num_blocks": 8192, 00:23:41.472 "uuid": "a08fd05b-0ae3-4e98-8051-b7b46ea53124", 00:23:41.472 "assigned_rate_limits": { 00:23:41.472 "rw_ios_per_sec": 0, 00:23:41.472 "rw_mbytes_per_sec": 0, 00:23:41.472 "r_mbytes_per_sec": 0, 00:23:41.472 "w_mbytes_per_sec": 0 00:23:41.472 }, 00:23:41.472 "claimed": true, 00:23:41.472 "claim_type": "exclusive_write", 00:23:41.472 "zoned": false, 00:23:41.472 "supported_io_types": { 00:23:41.472 "read": true, 00:23:41.472 "write": true, 00:23:41.472 "unmap": true, 00:23:41.472 "flush": true, 00:23:41.472 "reset": true, 00:23:41.472 "nvme_admin": false, 00:23:41.472 "nvme_io": false, 00:23:41.472 "nvme_io_md": false, 00:23:41.472 "write_zeroes": true, 00:23:41.472 "zcopy": true, 00:23:41.472 "get_zone_info": false, 00:23:41.472 "zone_management": false, 00:23:41.472 "zone_append": false, 00:23:41.472 "compare": false, 00:23:41.472 "compare_and_write": false, 00:23:41.472 "abort": true, 00:23:41.472 "seek_hole": false, 00:23:41.472 "seek_data": false, 00:23:41.472 "copy": true, 00:23:41.472 "nvme_iov_md": false 00:23:41.472 }, 00:23:41.472 "memory_domains": [ 00:23:41.472 { 00:23:41.472 "dma_device_id": "system", 00:23:41.472 "dma_device_type": 1 00:23:41.472 }, 00:23:41.472 { 00:23:41.472 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:41.472 "dma_device_type": 2 00:23:41.472 } 00:23:41.472 ], 00:23:41.472 "driver_specific": {} 00:23:41.472 } 00:23:41.472 ] 00:23:41.472 17:35:52 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:23:41.472 17:35:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:41.472 17:35:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:41.472 17:35:52 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:41.472 17:35:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:41.472 17:35:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:41.472 17:35:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:41.472 17:35:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:41.472 17:35:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:41.472 17:35:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:41.472 17:35:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:41.472 17:35:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:41.472 17:35:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:41.733 17:35:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:41.733 "name": "Existed_Raid", 00:23:41.733 "uuid": "26cf3df7-3497-4557-914c-ed8553de69dd", 00:23:41.733 "strip_size_kb": 0, 00:23:41.733 "state": "configuring", 00:23:41.733 "raid_level": "raid1", 00:23:41.733 "superblock": true, 00:23:41.733 "num_base_bdevs": 2, 00:23:41.733 "num_base_bdevs_discovered": 1, 00:23:41.733 "num_base_bdevs_operational": 2, 00:23:41.733 "base_bdevs_list": [ 00:23:41.733 { 00:23:41.733 "name": "BaseBdev1", 00:23:41.733 "uuid": "a08fd05b-0ae3-4e98-8051-b7b46ea53124", 00:23:41.733 "is_configured": true, 00:23:41.733 "data_offset": 256, 00:23:41.733 "data_size": 7936 00:23:41.733 }, 00:23:41.733 { 00:23:41.733 "name": "BaseBdev2", 00:23:41.733 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:41.733 "is_configured": false, 00:23:41.733 "data_offset": 0, 00:23:41.733 "data_size": 0 00:23:41.733 } 00:23:41.733 ] 00:23:41.733 }' 00:23:41.733 17:35:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:41.733 17:35:52 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:42.303 17:35:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:42.564 [2024-07-15 17:35:53.614580] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:42.564 [2024-07-15 17:35:53.614604] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2269fa0 name Existed_Raid, state configuring 00:23:42.564 17:35:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:42.564 [2024-07-15 17:35:53.799069] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:42.564 [2024-07-15 17:35:53.800189] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:42.564 [2024-07-15 17:35:53.800213] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:42.564 
17:35:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:23:42.564 17:35:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:42.564 17:35:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:42.564 17:35:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:42.564 17:35:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:42.564 17:35:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:42.564 17:35:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:42.564 17:35:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:42.564 17:35:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:42.564 17:35:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:42.564 17:35:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:42.564 17:35:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:42.564 17:35:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:42.564 17:35:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:42.825 17:35:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:42.825 "name": "Existed_Raid", 00:23:42.825 "uuid": "3c218756-b4ad-45d0-9a86-db38e605274a", 00:23:42.825 "strip_size_kb": 0, 00:23:42.825 "state": "configuring", 00:23:42.825 "raid_level": "raid1", 00:23:42.825 "superblock": true, 00:23:42.825 "num_base_bdevs": 2, 00:23:42.825 "num_base_bdevs_discovered": 1, 00:23:42.825 "num_base_bdevs_operational": 2, 00:23:42.825 "base_bdevs_list": [ 00:23:42.825 { 00:23:42.825 "name": "BaseBdev1", 00:23:42.825 "uuid": "a08fd05b-0ae3-4e98-8051-b7b46ea53124", 00:23:42.825 "is_configured": true, 00:23:42.825 "data_offset": 256, 00:23:42.825 "data_size": 7936 00:23:42.825 }, 00:23:42.825 { 00:23:42.825 "name": "BaseBdev2", 00:23:42.825 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:42.825 "is_configured": false, 00:23:42.825 "data_offset": 0, 00:23:42.825 "data_size": 0 00:23:42.825 } 00:23:42.825 ] 00:23:42.825 }' 00:23:42.825 17:35:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:42.825 17:35:53 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:43.395 17:35:54 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:23:43.655 [2024-07-15 17:35:54.714209] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:43.655 [2024-07-15 17:35:54.714313] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x226ad90 00:23:43.655 [2024-07-15 17:35:54.714321] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, 
blocklen 4096 00:23:43.655 [2024-07-15 17:35:54.714458] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x241e8d0 00:23:43.655 [2024-07-15 17:35:54.714549] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x226ad90 00:23:43.655 [2024-07-15 17:35:54.714555] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x226ad90 00:23:43.655 [2024-07-15 17:35:54.714621] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:43.655 BaseBdev2 00:23:43.655 17:35:54 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:23:43.655 17:35:54 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:23:43.655 17:35:54 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:43.655 17:35:54 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:23:43.655 17:35:54 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:43.655 17:35:54 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:43.655 17:35:54 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:43.655 17:35:54 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:23:43.916 [ 00:23:43.916 { 00:23:43.916 "name": "BaseBdev2", 00:23:43.916 "aliases": [ 00:23:43.916 "5aeab9cb-268a-438a-905a-be231202d169" 00:23:43.916 ], 00:23:43.916 "product_name": "Malloc disk", 00:23:43.916 "block_size": 4096, 00:23:43.916 "num_blocks": 8192, 00:23:43.916 "uuid": "5aeab9cb-268a-438a-905a-be231202d169", 00:23:43.916 "assigned_rate_limits": { 00:23:43.916 "rw_ios_per_sec": 0, 00:23:43.916 "rw_mbytes_per_sec": 0, 00:23:43.916 "r_mbytes_per_sec": 0, 00:23:43.916 "w_mbytes_per_sec": 0 00:23:43.916 }, 00:23:43.916 "claimed": true, 00:23:43.916 "claim_type": "exclusive_write", 00:23:43.916 "zoned": false, 00:23:43.916 "supported_io_types": { 00:23:43.916 "read": true, 00:23:43.916 "write": true, 00:23:43.916 "unmap": true, 00:23:43.916 "flush": true, 00:23:43.916 "reset": true, 00:23:43.916 "nvme_admin": false, 00:23:43.916 "nvme_io": false, 00:23:43.916 "nvme_io_md": false, 00:23:43.916 "write_zeroes": true, 00:23:43.916 "zcopy": true, 00:23:43.916 "get_zone_info": false, 00:23:43.916 "zone_management": false, 00:23:43.916 "zone_append": false, 00:23:43.916 "compare": false, 00:23:43.916 "compare_and_write": false, 00:23:43.916 "abort": true, 00:23:43.916 "seek_hole": false, 00:23:43.916 "seek_data": false, 00:23:43.916 "copy": true, 00:23:43.916 "nvme_iov_md": false 00:23:43.916 }, 00:23:43.916 "memory_domains": [ 00:23:43.916 { 00:23:43.916 "dma_device_id": "system", 00:23:43.916 "dma_device_type": 1 00:23:43.916 }, 00:23:43.916 { 00:23:43.916 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:43.916 "dma_device_type": 2 00:23:43.916 } 00:23:43.916 ], 00:23:43.916 "driver_specific": {} 00:23:43.916 } 00:23:43.916 ] 00:23:43.916 17:35:55 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:23:43.916 17:35:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- 
# (( i++ )) 00:23:43.916 17:35:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:43.916 17:35:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:23:43.916 17:35:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:43.916 17:35:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:43.916 17:35:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:43.916 17:35:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:43.916 17:35:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:43.916 17:35:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:43.916 17:35:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:43.916 17:35:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:43.916 17:35:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:43.916 17:35:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:43.916 17:35:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:44.187 17:35:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:44.187 "name": "Existed_Raid", 00:23:44.187 "uuid": "3c218756-b4ad-45d0-9a86-db38e605274a", 00:23:44.187 "strip_size_kb": 0, 00:23:44.187 "state": "online", 00:23:44.187 "raid_level": "raid1", 00:23:44.187 "superblock": true, 00:23:44.187 "num_base_bdevs": 2, 00:23:44.187 "num_base_bdevs_discovered": 2, 00:23:44.187 "num_base_bdevs_operational": 2, 00:23:44.187 "base_bdevs_list": [ 00:23:44.187 { 00:23:44.187 "name": "BaseBdev1", 00:23:44.187 "uuid": "a08fd05b-0ae3-4e98-8051-b7b46ea53124", 00:23:44.187 "is_configured": true, 00:23:44.187 "data_offset": 256, 00:23:44.187 "data_size": 7936 00:23:44.187 }, 00:23:44.187 { 00:23:44.187 "name": "BaseBdev2", 00:23:44.187 "uuid": "5aeab9cb-268a-438a-905a-be231202d169", 00:23:44.187 "is_configured": true, 00:23:44.187 "data_offset": 256, 00:23:44.187 "data_size": 7936 00:23:44.187 } 00:23:44.187 ] 00:23:44.187 }' 00:23:44.187 17:35:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:44.187 17:35:55 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:44.758 17:35:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:23:44.758 17:35:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:23:44.758 17:35:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:44.758 17:35:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:44.758 17:35:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:44.758 17:35:55 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@198 -- # local name 00:23:44.758 17:35:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:44.758 17:35:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:23:44.758 [2024-07-15 17:35:56.017703] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:44.758 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:44.758 "name": "Existed_Raid", 00:23:44.758 "aliases": [ 00:23:44.758 "3c218756-b4ad-45d0-9a86-db38e605274a" 00:23:44.758 ], 00:23:44.758 "product_name": "Raid Volume", 00:23:44.758 "block_size": 4096, 00:23:44.758 "num_blocks": 7936, 00:23:44.758 "uuid": "3c218756-b4ad-45d0-9a86-db38e605274a", 00:23:44.758 "assigned_rate_limits": { 00:23:44.758 "rw_ios_per_sec": 0, 00:23:44.758 "rw_mbytes_per_sec": 0, 00:23:44.758 "r_mbytes_per_sec": 0, 00:23:44.758 "w_mbytes_per_sec": 0 00:23:44.758 }, 00:23:44.758 "claimed": false, 00:23:44.758 "zoned": false, 00:23:44.758 "supported_io_types": { 00:23:44.758 "read": true, 00:23:44.758 "write": true, 00:23:44.758 "unmap": false, 00:23:44.758 "flush": false, 00:23:44.758 "reset": true, 00:23:44.758 "nvme_admin": false, 00:23:44.758 "nvme_io": false, 00:23:44.758 "nvme_io_md": false, 00:23:44.758 "write_zeroes": true, 00:23:44.758 "zcopy": false, 00:23:44.758 "get_zone_info": false, 00:23:44.758 "zone_management": false, 00:23:44.758 "zone_append": false, 00:23:44.758 "compare": false, 00:23:44.758 "compare_and_write": false, 00:23:44.758 "abort": false, 00:23:44.758 "seek_hole": false, 00:23:44.758 "seek_data": false, 00:23:44.758 "copy": false, 00:23:44.759 "nvme_iov_md": false 00:23:44.759 }, 00:23:44.759 "memory_domains": [ 00:23:44.759 { 00:23:44.759 "dma_device_id": "system", 00:23:44.759 "dma_device_type": 1 00:23:44.759 }, 00:23:44.759 { 00:23:44.759 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:44.759 "dma_device_type": 2 00:23:44.759 }, 00:23:44.759 { 00:23:44.759 "dma_device_id": "system", 00:23:44.759 "dma_device_type": 1 00:23:44.759 }, 00:23:44.759 { 00:23:44.759 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:44.759 "dma_device_type": 2 00:23:44.759 } 00:23:44.759 ], 00:23:44.759 "driver_specific": { 00:23:44.759 "raid": { 00:23:44.759 "uuid": "3c218756-b4ad-45d0-9a86-db38e605274a", 00:23:44.759 "strip_size_kb": 0, 00:23:44.759 "state": "online", 00:23:44.759 "raid_level": "raid1", 00:23:44.759 "superblock": true, 00:23:44.759 "num_base_bdevs": 2, 00:23:44.759 "num_base_bdevs_discovered": 2, 00:23:44.759 "num_base_bdevs_operational": 2, 00:23:44.759 "base_bdevs_list": [ 00:23:44.759 { 00:23:44.759 "name": "BaseBdev1", 00:23:44.759 "uuid": "a08fd05b-0ae3-4e98-8051-b7b46ea53124", 00:23:44.759 "is_configured": true, 00:23:44.759 "data_offset": 256, 00:23:44.759 "data_size": 7936 00:23:44.759 }, 00:23:44.759 { 00:23:44.759 "name": "BaseBdev2", 00:23:44.759 "uuid": "5aeab9cb-268a-438a-905a-be231202d169", 00:23:44.759 "is_configured": true, 00:23:44.759 "data_offset": 256, 00:23:44.759 "data_size": 7936 00:23:44.759 } 00:23:44.759 ] 00:23:44.759 } 00:23:44.759 } 00:23:44.759 }' 00:23:44.759 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:45.019 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='BaseBdev1 00:23:45.019 BaseBdev2' 00:23:45.019 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:45.019 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:23:45.019 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:45.019 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:45.019 "name": "BaseBdev1", 00:23:45.019 "aliases": [ 00:23:45.019 "a08fd05b-0ae3-4e98-8051-b7b46ea53124" 00:23:45.019 ], 00:23:45.019 "product_name": "Malloc disk", 00:23:45.019 "block_size": 4096, 00:23:45.019 "num_blocks": 8192, 00:23:45.019 "uuid": "a08fd05b-0ae3-4e98-8051-b7b46ea53124", 00:23:45.019 "assigned_rate_limits": { 00:23:45.019 "rw_ios_per_sec": 0, 00:23:45.019 "rw_mbytes_per_sec": 0, 00:23:45.019 "r_mbytes_per_sec": 0, 00:23:45.019 "w_mbytes_per_sec": 0 00:23:45.019 }, 00:23:45.019 "claimed": true, 00:23:45.019 "claim_type": "exclusive_write", 00:23:45.019 "zoned": false, 00:23:45.019 "supported_io_types": { 00:23:45.019 "read": true, 00:23:45.019 "write": true, 00:23:45.019 "unmap": true, 00:23:45.019 "flush": true, 00:23:45.019 "reset": true, 00:23:45.019 "nvme_admin": false, 00:23:45.019 "nvme_io": false, 00:23:45.019 "nvme_io_md": false, 00:23:45.019 "write_zeroes": true, 00:23:45.019 "zcopy": true, 00:23:45.019 "get_zone_info": false, 00:23:45.019 "zone_management": false, 00:23:45.019 "zone_append": false, 00:23:45.019 "compare": false, 00:23:45.019 "compare_and_write": false, 00:23:45.019 "abort": true, 00:23:45.019 "seek_hole": false, 00:23:45.020 "seek_data": false, 00:23:45.020 "copy": true, 00:23:45.020 "nvme_iov_md": false 00:23:45.020 }, 00:23:45.020 "memory_domains": [ 00:23:45.020 { 00:23:45.020 "dma_device_id": "system", 00:23:45.020 "dma_device_type": 1 00:23:45.020 }, 00:23:45.020 { 00:23:45.020 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:45.020 "dma_device_type": 2 00:23:45.020 } 00:23:45.020 ], 00:23:45.020 "driver_specific": {} 00:23:45.020 }' 00:23:45.020 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:45.280 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:45.280 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:23:45.280 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:45.280 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:45.280 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:45.280 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:45.280 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:45.280 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:45.280 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:45.280 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:45.540 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:45.540 17:35:56 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:45.540 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:23:45.540 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:45.540 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:45.540 "name": "BaseBdev2", 00:23:45.540 "aliases": [ 00:23:45.540 "5aeab9cb-268a-438a-905a-be231202d169" 00:23:45.540 ], 00:23:45.540 "product_name": "Malloc disk", 00:23:45.540 "block_size": 4096, 00:23:45.540 "num_blocks": 8192, 00:23:45.540 "uuid": "5aeab9cb-268a-438a-905a-be231202d169", 00:23:45.540 "assigned_rate_limits": { 00:23:45.540 "rw_ios_per_sec": 0, 00:23:45.540 "rw_mbytes_per_sec": 0, 00:23:45.540 "r_mbytes_per_sec": 0, 00:23:45.540 "w_mbytes_per_sec": 0 00:23:45.540 }, 00:23:45.540 "claimed": true, 00:23:45.540 "claim_type": "exclusive_write", 00:23:45.540 "zoned": false, 00:23:45.540 "supported_io_types": { 00:23:45.540 "read": true, 00:23:45.540 "write": true, 00:23:45.540 "unmap": true, 00:23:45.540 "flush": true, 00:23:45.540 "reset": true, 00:23:45.540 "nvme_admin": false, 00:23:45.540 "nvme_io": false, 00:23:45.540 "nvme_io_md": false, 00:23:45.540 "write_zeroes": true, 00:23:45.540 "zcopy": true, 00:23:45.540 "get_zone_info": false, 00:23:45.540 "zone_management": false, 00:23:45.540 "zone_append": false, 00:23:45.540 "compare": false, 00:23:45.540 "compare_and_write": false, 00:23:45.540 "abort": true, 00:23:45.540 "seek_hole": false, 00:23:45.540 "seek_data": false, 00:23:45.540 "copy": true, 00:23:45.540 "nvme_iov_md": false 00:23:45.540 }, 00:23:45.540 "memory_domains": [ 00:23:45.540 { 00:23:45.540 "dma_device_id": "system", 00:23:45.540 "dma_device_type": 1 00:23:45.540 }, 00:23:45.540 { 00:23:45.540 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:45.540 "dma_device_type": 2 00:23:45.540 } 00:23:45.540 ], 00:23:45.540 "driver_specific": {} 00:23:45.540 }' 00:23:45.540 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:45.800 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:45.800 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:23:45.800 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:45.800 17:35:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:45.800 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:45.800 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:45.800 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:45.800 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:45.800 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:46.060 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:46.060 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:46.060 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:23:46.319 [2024-07-15 17:35:57.360936] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:46.320 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:23:46.320 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:23:46.320 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:46.320 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:23:46.320 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:23:46.320 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:23:46.320 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:46.320 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:46.320 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:46.320 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:46.320 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:46.320 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:46.320 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:46.320 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:46.320 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:46.320 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:46.320 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:46.320 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:46.320 "name": "Existed_Raid", 00:23:46.320 "uuid": "3c218756-b4ad-45d0-9a86-db38e605274a", 00:23:46.320 "strip_size_kb": 0, 00:23:46.320 "state": "online", 00:23:46.320 "raid_level": "raid1", 00:23:46.320 "superblock": true, 00:23:46.320 "num_base_bdevs": 2, 00:23:46.320 "num_base_bdevs_discovered": 1, 00:23:46.320 "num_base_bdevs_operational": 1, 00:23:46.320 "base_bdevs_list": [ 00:23:46.320 { 00:23:46.320 "name": null, 00:23:46.320 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:46.320 "is_configured": false, 00:23:46.320 "data_offset": 256, 00:23:46.320 "data_size": 7936 00:23:46.320 }, 00:23:46.320 { 00:23:46.320 "name": "BaseBdev2", 00:23:46.320 "uuid": "5aeab9cb-268a-438a-905a-be231202d169", 00:23:46.320 "is_configured": true, 00:23:46.320 "data_offset": 256, 00:23:46.320 "data_size": 7936 00:23:46.320 } 00:23:46.320 ] 00:23:46.320 }' 00:23:46.320 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:46.320 17:35:57 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:46.889 17:35:58 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:23:46.889 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:46.889 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:46.889 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:23:47.148 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:23:47.148 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:23:47.148 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:23:47.407 [2024-07-15 17:35:58.459832] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:47.407 [2024-07-15 17:35:58.459892] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:47.407 [2024-07-15 17:35:58.465936] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:47.408 [2024-07-15 17:35:58.465960] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:47.408 [2024-07-15 17:35:58.465966] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x226ad90 name Existed_Raid, state offline 00:23:47.408 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:23:47.408 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:47.408 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:47.408 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:23:47.408 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:23:47.408 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:23:47.408 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:23:47.408 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 2892428 00:23:47.408 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 2892428 ']' 00:23:47.408 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 2892428 00:23:47.408 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:23:47.408 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:47.408 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2892428 00:23:47.668 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:47.668 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:47.668 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 2892428' 00:23:47.668 killing process with pid 2892428 00:23:47.668 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@967 -- # kill 2892428 00:23:47.668 [2024-07-15 17:35:58.724507] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:47.668 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@972 -- # wait 2892428 00:23:47.668 [2024-07-15 17:35:58.725105] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:47.668 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:23:47.668 00:23:47.668 real 0m8.950s 00:23:47.668 user 0m16.201s 00:23:47.668 sys 0m1.413s 00:23:47.668 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:47.668 17:35:58 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:47.668 ************************************ 00:23:47.668 END TEST raid_state_function_test_sb_4k 00:23:47.668 ************************************ 00:23:47.668 17:35:58 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:47.668 17:35:58 bdev_raid -- bdev/bdev_raid.sh@899 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:23:47.668 17:35:58 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:23:47.668 17:35:58 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:47.668 17:35:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:47.668 ************************************ 00:23:47.668 START TEST raid_superblock_test_4k 00:23:47.668 ************************************ 00:23:47.668 17:35:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:23:47.668 17:35:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:23:47.668 17:35:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:23:47.668 17:35:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:23:47.668 17:35:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:23:47.668 17:35:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:23:47.668 17:35:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:23:47.668 17:35:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:23:47.668 17:35:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:23:47.668 17:35:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:23:47.668 17:35:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local strip_size 00:23:47.668 17:35:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:23:47.668 17:35:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:23:47.668 17:35:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:23:47.668 17:35:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:23:47.668 17:35:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:23:47.668 17:35:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 
-- # raid_pid=2894135 00:23:47.668 17:35:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # waitforlisten 2894135 /var/tmp/spdk-raid.sock 00:23:47.668 17:35:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@829 -- # '[' -z 2894135 ']' 00:23:47.668 17:35:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:23:47.668 17:35:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:47.668 17:35:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:47.668 17:35:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:47.668 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:47.668 17:35:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:47.668 17:35:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:47.928 [2024-07-15 17:35:58.972830] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:23:47.928 [2024-07-15 17:35:58.972880] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2894135 ] 00:23:47.928 [2024-07-15 17:35:59.062485] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:47.928 [2024-07-15 17:35:59.131282] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:47.928 [2024-07-15 17:35:59.182994] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:47.928 [2024-07-15 17:35:59.183019] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:48.868 17:35:59 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:48.868 17:35:59 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@862 -- # return 0 00:23:48.868 17:35:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:23:48.868 17:35:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:48.868 17:35:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:23:48.868 17:35:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:23:48.868 17:35:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:23:48.868 17:35:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:48.868 17:35:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:23:48.868 17:35:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:48.868 17:35:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:23:48.868 malloc1 00:23:48.868 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:49.127 [2024-07-15 17:36:00.174128] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:49.127 [2024-07-15 17:36:00.174174] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:49.127 [2024-07-15 17:36:00.174188] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d82a20 00:23:49.127 [2024-07-15 17:36:00.174194] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:49.127 [2024-07-15 17:36:00.175571] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:49.127 [2024-07-15 17:36:00.175591] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:49.127 pt1 00:23:49.127 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:23:49.127 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:49.127 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:23:49.127 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:23:49.127 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:23:49.127 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:49.127 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:23:49.127 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:49.127 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:23:49.127 malloc2 00:23:49.127 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:49.386 [2024-07-15 17:36:00.561182] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:49.386 [2024-07-15 17:36:00.561209] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:49.386 [2024-07-15 17:36:00.561220] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d83040 00:23:49.386 [2024-07-15 17:36:00.561227] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:49.386 [2024-07-15 17:36:00.562426] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:49.386 [2024-07-15 17:36:00.562444] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:49.386 pt2 00:23:49.386 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:23:49.386 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:49.386 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:23:49.646 [2024-07-15 17:36:00.753676] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:49.646 [2024-07-15 17:36:00.754687] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:49.646 [2024-07-15 17:36:00.754803] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f2f3d0 00:23:49.646 [2024-07-15 17:36:00.754811] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:49.646 [2024-07-15 17:36:00.754959] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d99910 00:23:49.646 [2024-07-15 17:36:00.755067] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f2f3d0 00:23:49.646 [2024-07-15 17:36:00.755073] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f2f3d0 00:23:49.646 [2024-07-15 17:36:00.755148] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:49.646 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:49.646 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:49.646 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:49.646 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:49.646 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:49.646 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:49.646 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:49.646 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:49.646 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:49.646 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:49.646 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:49.646 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:49.906 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:49.906 "name": "raid_bdev1", 00:23:49.906 "uuid": "c374a549-d394-47a6-9381-e6df452ce657", 00:23:49.906 "strip_size_kb": 0, 00:23:49.906 "state": "online", 00:23:49.906 "raid_level": "raid1", 00:23:49.906 "superblock": true, 00:23:49.906 "num_base_bdevs": 2, 00:23:49.906 "num_base_bdevs_discovered": 2, 00:23:49.906 "num_base_bdevs_operational": 2, 00:23:49.906 "base_bdevs_list": [ 00:23:49.906 { 00:23:49.906 "name": "pt1", 00:23:49.906 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:49.906 "is_configured": true, 00:23:49.906 "data_offset": 256, 00:23:49.906 "data_size": 7936 00:23:49.906 }, 00:23:49.906 { 00:23:49.906 "name": "pt2", 00:23:49.906 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:49.906 "is_configured": true, 00:23:49.906 "data_offset": 256, 00:23:49.906 "data_size": 7936 00:23:49.906 } 00:23:49.906 ] 00:23:49.906 }' 00:23:49.906 17:36:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:49.906 17:36:00 bdev_raid.raid_superblock_test_4k -- 
common/autotest_common.sh@10 -- # set +x 00:23:50.476 17:36:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:23:50.476 17:36:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:23:50.476 17:36:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:50.476 17:36:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:50.476 17:36:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:50.476 17:36:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:23:50.476 17:36:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:50.476 17:36:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:50.476 [2024-07-15 17:36:01.680209] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:50.476 17:36:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:50.476 "name": "raid_bdev1", 00:23:50.476 "aliases": [ 00:23:50.476 "c374a549-d394-47a6-9381-e6df452ce657" 00:23:50.476 ], 00:23:50.476 "product_name": "Raid Volume", 00:23:50.476 "block_size": 4096, 00:23:50.476 "num_blocks": 7936, 00:23:50.476 "uuid": "c374a549-d394-47a6-9381-e6df452ce657", 00:23:50.476 "assigned_rate_limits": { 00:23:50.476 "rw_ios_per_sec": 0, 00:23:50.476 "rw_mbytes_per_sec": 0, 00:23:50.476 "r_mbytes_per_sec": 0, 00:23:50.476 "w_mbytes_per_sec": 0 00:23:50.476 }, 00:23:50.476 "claimed": false, 00:23:50.476 "zoned": false, 00:23:50.476 "supported_io_types": { 00:23:50.476 "read": true, 00:23:50.476 "write": true, 00:23:50.476 "unmap": false, 00:23:50.476 "flush": false, 00:23:50.476 "reset": true, 00:23:50.476 "nvme_admin": false, 00:23:50.476 "nvme_io": false, 00:23:50.476 "nvme_io_md": false, 00:23:50.476 "write_zeroes": true, 00:23:50.476 "zcopy": false, 00:23:50.476 "get_zone_info": false, 00:23:50.476 "zone_management": false, 00:23:50.476 "zone_append": false, 00:23:50.476 "compare": false, 00:23:50.476 "compare_and_write": false, 00:23:50.476 "abort": false, 00:23:50.476 "seek_hole": false, 00:23:50.476 "seek_data": false, 00:23:50.476 "copy": false, 00:23:50.476 "nvme_iov_md": false 00:23:50.476 }, 00:23:50.476 "memory_domains": [ 00:23:50.476 { 00:23:50.476 "dma_device_id": "system", 00:23:50.476 "dma_device_type": 1 00:23:50.476 }, 00:23:50.476 { 00:23:50.476 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:50.476 "dma_device_type": 2 00:23:50.476 }, 00:23:50.476 { 00:23:50.476 "dma_device_id": "system", 00:23:50.476 "dma_device_type": 1 00:23:50.476 }, 00:23:50.476 { 00:23:50.476 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:50.476 "dma_device_type": 2 00:23:50.476 } 00:23:50.476 ], 00:23:50.476 "driver_specific": { 00:23:50.476 "raid": { 00:23:50.476 "uuid": "c374a549-d394-47a6-9381-e6df452ce657", 00:23:50.476 "strip_size_kb": 0, 00:23:50.476 "state": "online", 00:23:50.476 "raid_level": "raid1", 00:23:50.476 "superblock": true, 00:23:50.476 "num_base_bdevs": 2, 00:23:50.476 "num_base_bdevs_discovered": 2, 00:23:50.476 "num_base_bdevs_operational": 2, 00:23:50.476 "base_bdevs_list": [ 00:23:50.476 { 00:23:50.476 "name": "pt1", 00:23:50.476 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:50.476 "is_configured": true, 00:23:50.476 
"data_offset": 256, 00:23:50.476 "data_size": 7936 00:23:50.476 }, 00:23:50.476 { 00:23:50.476 "name": "pt2", 00:23:50.476 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:50.476 "is_configured": true, 00:23:50.476 "data_offset": 256, 00:23:50.476 "data_size": 7936 00:23:50.476 } 00:23:50.476 ] 00:23:50.476 } 00:23:50.476 } 00:23:50.476 }' 00:23:50.476 17:36:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:50.477 17:36:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:23:50.477 pt2' 00:23:50.477 17:36:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:50.477 17:36:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:50.477 17:36:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:50.736 17:36:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:50.736 "name": "pt1", 00:23:50.736 "aliases": [ 00:23:50.736 "00000000-0000-0000-0000-000000000001" 00:23:50.736 ], 00:23:50.736 "product_name": "passthru", 00:23:50.736 "block_size": 4096, 00:23:50.737 "num_blocks": 8192, 00:23:50.737 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:50.737 "assigned_rate_limits": { 00:23:50.737 "rw_ios_per_sec": 0, 00:23:50.737 "rw_mbytes_per_sec": 0, 00:23:50.737 "r_mbytes_per_sec": 0, 00:23:50.737 "w_mbytes_per_sec": 0 00:23:50.737 }, 00:23:50.737 "claimed": true, 00:23:50.737 "claim_type": "exclusive_write", 00:23:50.737 "zoned": false, 00:23:50.737 "supported_io_types": { 00:23:50.737 "read": true, 00:23:50.737 "write": true, 00:23:50.737 "unmap": true, 00:23:50.737 "flush": true, 00:23:50.737 "reset": true, 00:23:50.737 "nvme_admin": false, 00:23:50.737 "nvme_io": false, 00:23:50.737 "nvme_io_md": false, 00:23:50.737 "write_zeroes": true, 00:23:50.737 "zcopy": true, 00:23:50.737 "get_zone_info": false, 00:23:50.737 "zone_management": false, 00:23:50.737 "zone_append": false, 00:23:50.737 "compare": false, 00:23:50.737 "compare_and_write": false, 00:23:50.737 "abort": true, 00:23:50.737 "seek_hole": false, 00:23:50.737 "seek_data": false, 00:23:50.737 "copy": true, 00:23:50.737 "nvme_iov_md": false 00:23:50.737 }, 00:23:50.737 "memory_domains": [ 00:23:50.737 { 00:23:50.737 "dma_device_id": "system", 00:23:50.737 "dma_device_type": 1 00:23:50.737 }, 00:23:50.737 { 00:23:50.737 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:50.737 "dma_device_type": 2 00:23:50.737 } 00:23:50.737 ], 00:23:50.737 "driver_specific": { 00:23:50.737 "passthru": { 00:23:50.737 "name": "pt1", 00:23:50.737 "base_bdev_name": "malloc1" 00:23:50.737 } 00:23:50.737 } 00:23:50.737 }' 00:23:50.737 17:36:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:50.737 17:36:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:50.737 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:23:50.737 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:50.997 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:50.997 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:50.997 17:36:02 bdev_raid.raid_superblock_test_4k 
-- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:50.997 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:50.997 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:50.997 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:50.997 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:50.997 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:50.997 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:50.997 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:50.997 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:51.257 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:51.257 "name": "pt2", 00:23:51.257 "aliases": [ 00:23:51.257 "00000000-0000-0000-0000-000000000002" 00:23:51.257 ], 00:23:51.257 "product_name": "passthru", 00:23:51.257 "block_size": 4096, 00:23:51.257 "num_blocks": 8192, 00:23:51.257 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:51.257 "assigned_rate_limits": { 00:23:51.257 "rw_ios_per_sec": 0, 00:23:51.257 "rw_mbytes_per_sec": 0, 00:23:51.257 "r_mbytes_per_sec": 0, 00:23:51.257 "w_mbytes_per_sec": 0 00:23:51.257 }, 00:23:51.257 "claimed": true, 00:23:51.257 "claim_type": "exclusive_write", 00:23:51.257 "zoned": false, 00:23:51.257 "supported_io_types": { 00:23:51.257 "read": true, 00:23:51.257 "write": true, 00:23:51.257 "unmap": true, 00:23:51.257 "flush": true, 00:23:51.257 "reset": true, 00:23:51.257 "nvme_admin": false, 00:23:51.257 "nvme_io": false, 00:23:51.257 "nvme_io_md": false, 00:23:51.257 "write_zeroes": true, 00:23:51.257 "zcopy": true, 00:23:51.257 "get_zone_info": false, 00:23:51.257 "zone_management": false, 00:23:51.257 "zone_append": false, 00:23:51.257 "compare": false, 00:23:51.257 "compare_and_write": false, 00:23:51.257 "abort": true, 00:23:51.257 "seek_hole": false, 00:23:51.258 "seek_data": false, 00:23:51.258 "copy": true, 00:23:51.258 "nvme_iov_md": false 00:23:51.258 }, 00:23:51.258 "memory_domains": [ 00:23:51.258 { 00:23:51.258 "dma_device_id": "system", 00:23:51.258 "dma_device_type": 1 00:23:51.258 }, 00:23:51.258 { 00:23:51.258 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:51.258 "dma_device_type": 2 00:23:51.258 } 00:23:51.258 ], 00:23:51.258 "driver_specific": { 00:23:51.258 "passthru": { 00:23:51.258 "name": "pt2", 00:23:51.258 "base_bdev_name": "malloc2" 00:23:51.258 } 00:23:51.258 } 00:23:51.258 }' 00:23:51.258 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:51.258 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:51.258 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:23:51.258 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:51.518 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:51.518 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:51.518 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:51.518 17:36:02 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:51.518 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:51.518 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:51.518 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:51.518 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:51.518 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:51.518 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:23:51.779 [2024-07-15 17:36:02.963452] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:51.779 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=c374a549-d394-47a6-9381-e6df452ce657 00:23:51.779 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # '[' -z c374a549-d394-47a6-9381-e6df452ce657 ']' 00:23:51.779 17:36:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:52.039 [2024-07-15 17:36:03.159748] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:52.039 [2024-07-15 17:36:03.159759] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:52.039 [2024-07-15 17:36:03.159794] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:52.039 [2024-07-15 17:36:03.159836] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:52.039 [2024-07-15 17:36:03.159842] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f2f3d0 name raid_bdev1, state offline 00:23:52.039 17:36:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:52.039 17:36:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:23:52.300 17:36:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:23:52.300 17:36:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:23:52.300 17:36:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:23:52.300 17:36:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:52.300 17:36:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:23:52.300 17:36:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:52.561 17:36:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:23:52.561 17:36:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:23:52.822 17:36:03 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:23:52.822 17:36:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:52.822 17:36:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@648 -- # local es=0 00:23:52.822 17:36:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:52.822 17:36:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:52.822 17:36:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:52.822 17:36:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:52.822 17:36:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:52.822 17:36:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:52.822 17:36:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:52.823 17:36:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:52.823 17:36:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:52.823 17:36:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:52.823 [2024-07-15 17:36:04.114129] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:23:52.823 [2024-07-15 17:36:04.115188] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:23:52.823 [2024-07-15 17:36:04.115228] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:23:52.823 [2024-07-15 17:36:04.115255] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:23:52.823 [2024-07-15 17:36:04.115270] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:52.823 [2024-07-15 17:36:04.115275] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f2d960 name raid_bdev1, state configuring 00:23:52.823 request: 00:23:52.823 { 00:23:52.823 "name": "raid_bdev1", 00:23:52.823 "raid_level": "raid1", 00:23:52.823 "base_bdevs": [ 00:23:52.823 "malloc1", 00:23:52.823 "malloc2" 00:23:52.823 ], 00:23:52.823 "superblock": false, 00:23:52.823 "method": "bdev_raid_create", 00:23:52.823 "req_id": 1 00:23:52.823 } 00:23:52.823 Got JSON-RPC error response 00:23:52.823 response: 00:23:52.823 { 00:23:52.823 "code": -17, 00:23:52.823 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:23:52.823 } 00:23:53.083 17:36:04 bdev_raid.raid_superblock_test_4k -- 
common/autotest_common.sh@651 -- # es=1 00:23:53.083 17:36:04 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:23:53.083 17:36:04 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:53.083 17:36:04 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:53.083 17:36:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:53.083 17:36:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:23:53.083 17:36:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:23:53.083 17:36:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:23:53.083 17:36:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:53.343 [2024-07-15 17:36:04.483021] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:53.343 [2024-07-15 17:36:04.483043] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:53.343 [2024-07-15 17:36:04.483054] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f2eba0 00:23:53.343 [2024-07-15 17:36:04.483060] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:53.343 [2024-07-15 17:36:04.484296] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:53.343 [2024-07-15 17:36:04.484314] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:53.344 [2024-07-15 17:36:04.484355] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:23:53.344 [2024-07-15 17:36:04.484372] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:53.344 pt1 00:23:53.344 17:36:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:23:53.344 17:36:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:53.344 17:36:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:53.344 17:36:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:53.344 17:36:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:53.344 17:36:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:53.344 17:36:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:53.344 17:36:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:53.344 17:36:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:53.344 17:36:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:53.344 17:36:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:53.344 17:36:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:23:53.604 17:36:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:53.604 "name": "raid_bdev1", 00:23:53.604 "uuid": "c374a549-d394-47a6-9381-e6df452ce657", 00:23:53.604 "strip_size_kb": 0, 00:23:53.604 "state": "configuring", 00:23:53.604 "raid_level": "raid1", 00:23:53.604 "superblock": true, 00:23:53.604 "num_base_bdevs": 2, 00:23:53.604 "num_base_bdevs_discovered": 1, 00:23:53.604 "num_base_bdevs_operational": 2, 00:23:53.604 "base_bdevs_list": [ 00:23:53.604 { 00:23:53.604 "name": "pt1", 00:23:53.604 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:53.604 "is_configured": true, 00:23:53.604 "data_offset": 256, 00:23:53.604 "data_size": 7936 00:23:53.604 }, 00:23:53.604 { 00:23:53.604 "name": null, 00:23:53.604 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:53.604 "is_configured": false, 00:23:53.604 "data_offset": 256, 00:23:53.604 "data_size": 7936 00:23:53.604 } 00:23:53.604 ] 00:23:53.604 }' 00:23:53.604 17:36:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:53.604 17:36:04 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:54.175 17:36:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:23:54.175 17:36:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:23:54.175 17:36:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:54.175 17:36:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:54.175 [2024-07-15 17:36:05.397346] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:54.175 [2024-07-15 17:36:05.397373] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:54.175 [2024-07-15 17:36:05.397382] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d83e00 00:23:54.175 [2024-07-15 17:36:05.397388] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:54.175 [2024-07-15 17:36:05.397635] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:54.175 [2024-07-15 17:36:05.397646] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:54.175 [2024-07-15 17:36:05.397684] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:54.175 [2024-07-15 17:36:05.397696] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:54.175 [2024-07-15 17:36:05.397777] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d81820 00:23:54.175 [2024-07-15 17:36:05.397783] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:54.175 [2024-07-15 17:36:05.397917] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f30e90 00:23:54.175 [2024-07-15 17:36:05.398015] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d81820 00:23:54.175 [2024-07-15 17:36:05.398020] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d81820 00:23:54.175 [2024-07-15 17:36:05.398094] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:54.175 pt2 00:23:54.175 17:36:05 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@477 -- # (( i++ )) 00:23:54.175 17:36:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:54.175 17:36:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:54.175 17:36:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:54.175 17:36:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:54.175 17:36:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:54.175 17:36:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:54.175 17:36:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:54.175 17:36:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:54.175 17:36:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:54.175 17:36:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:54.175 17:36:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:54.175 17:36:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:54.175 17:36:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:54.435 17:36:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:54.435 "name": "raid_bdev1", 00:23:54.435 "uuid": "c374a549-d394-47a6-9381-e6df452ce657", 00:23:54.435 "strip_size_kb": 0, 00:23:54.435 "state": "online", 00:23:54.435 "raid_level": "raid1", 00:23:54.435 "superblock": true, 00:23:54.435 "num_base_bdevs": 2, 00:23:54.435 "num_base_bdevs_discovered": 2, 00:23:54.435 "num_base_bdevs_operational": 2, 00:23:54.435 "base_bdevs_list": [ 00:23:54.435 { 00:23:54.435 "name": "pt1", 00:23:54.435 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:54.435 "is_configured": true, 00:23:54.435 "data_offset": 256, 00:23:54.435 "data_size": 7936 00:23:54.435 }, 00:23:54.435 { 00:23:54.435 "name": "pt2", 00:23:54.435 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:54.435 "is_configured": true, 00:23:54.435 "data_offset": 256, 00:23:54.435 "data_size": 7936 00:23:54.435 } 00:23:54.435 ] 00:23:54.435 }' 00:23:54.435 17:36:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:54.435 17:36:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:55.004 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:23:55.004 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:23:55.005 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:55.005 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:55.005 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:55.005 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:23:55.005 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:55.005 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:55.265 [2024-07-15 17:36:06.311923] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:55.265 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:55.265 "name": "raid_bdev1", 00:23:55.265 "aliases": [ 00:23:55.265 "c374a549-d394-47a6-9381-e6df452ce657" 00:23:55.265 ], 00:23:55.265 "product_name": "Raid Volume", 00:23:55.265 "block_size": 4096, 00:23:55.265 "num_blocks": 7936, 00:23:55.265 "uuid": "c374a549-d394-47a6-9381-e6df452ce657", 00:23:55.265 "assigned_rate_limits": { 00:23:55.265 "rw_ios_per_sec": 0, 00:23:55.265 "rw_mbytes_per_sec": 0, 00:23:55.265 "r_mbytes_per_sec": 0, 00:23:55.265 "w_mbytes_per_sec": 0 00:23:55.265 }, 00:23:55.265 "claimed": false, 00:23:55.265 "zoned": false, 00:23:55.265 "supported_io_types": { 00:23:55.265 "read": true, 00:23:55.265 "write": true, 00:23:55.265 "unmap": false, 00:23:55.265 "flush": false, 00:23:55.265 "reset": true, 00:23:55.265 "nvme_admin": false, 00:23:55.265 "nvme_io": false, 00:23:55.265 "nvme_io_md": false, 00:23:55.265 "write_zeroes": true, 00:23:55.265 "zcopy": false, 00:23:55.265 "get_zone_info": false, 00:23:55.265 "zone_management": false, 00:23:55.265 "zone_append": false, 00:23:55.265 "compare": false, 00:23:55.265 "compare_and_write": false, 00:23:55.265 "abort": false, 00:23:55.265 "seek_hole": false, 00:23:55.265 "seek_data": false, 00:23:55.265 "copy": false, 00:23:55.265 "nvme_iov_md": false 00:23:55.265 }, 00:23:55.265 "memory_domains": [ 00:23:55.265 { 00:23:55.265 "dma_device_id": "system", 00:23:55.265 "dma_device_type": 1 00:23:55.265 }, 00:23:55.265 { 00:23:55.265 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:55.265 "dma_device_type": 2 00:23:55.265 }, 00:23:55.265 { 00:23:55.265 "dma_device_id": "system", 00:23:55.265 "dma_device_type": 1 00:23:55.265 }, 00:23:55.265 { 00:23:55.265 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:55.265 "dma_device_type": 2 00:23:55.265 } 00:23:55.265 ], 00:23:55.265 "driver_specific": { 00:23:55.265 "raid": { 00:23:55.265 "uuid": "c374a549-d394-47a6-9381-e6df452ce657", 00:23:55.265 "strip_size_kb": 0, 00:23:55.265 "state": "online", 00:23:55.265 "raid_level": "raid1", 00:23:55.265 "superblock": true, 00:23:55.265 "num_base_bdevs": 2, 00:23:55.265 "num_base_bdevs_discovered": 2, 00:23:55.265 "num_base_bdevs_operational": 2, 00:23:55.265 "base_bdevs_list": [ 00:23:55.265 { 00:23:55.265 "name": "pt1", 00:23:55.265 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:55.265 "is_configured": true, 00:23:55.265 "data_offset": 256, 00:23:55.265 "data_size": 7936 00:23:55.265 }, 00:23:55.265 { 00:23:55.265 "name": "pt2", 00:23:55.265 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:55.265 "is_configured": true, 00:23:55.265 "data_offset": 256, 00:23:55.265 "data_size": 7936 00:23:55.265 } 00:23:55.265 ] 00:23:55.265 } 00:23:55.265 } 00:23:55.265 }' 00:23:55.265 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:55.265 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:23:55.265 pt2' 00:23:55.265 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:55.266 17:36:06 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:55.266 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:55.266 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:55.266 "name": "pt1", 00:23:55.266 "aliases": [ 00:23:55.266 "00000000-0000-0000-0000-000000000001" 00:23:55.266 ], 00:23:55.266 "product_name": "passthru", 00:23:55.266 "block_size": 4096, 00:23:55.266 "num_blocks": 8192, 00:23:55.266 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:55.266 "assigned_rate_limits": { 00:23:55.266 "rw_ios_per_sec": 0, 00:23:55.266 "rw_mbytes_per_sec": 0, 00:23:55.266 "r_mbytes_per_sec": 0, 00:23:55.266 "w_mbytes_per_sec": 0 00:23:55.266 }, 00:23:55.266 "claimed": true, 00:23:55.266 "claim_type": "exclusive_write", 00:23:55.266 "zoned": false, 00:23:55.266 "supported_io_types": { 00:23:55.266 "read": true, 00:23:55.266 "write": true, 00:23:55.266 "unmap": true, 00:23:55.266 "flush": true, 00:23:55.266 "reset": true, 00:23:55.266 "nvme_admin": false, 00:23:55.266 "nvme_io": false, 00:23:55.266 "nvme_io_md": false, 00:23:55.266 "write_zeroes": true, 00:23:55.266 "zcopy": true, 00:23:55.266 "get_zone_info": false, 00:23:55.266 "zone_management": false, 00:23:55.266 "zone_append": false, 00:23:55.266 "compare": false, 00:23:55.266 "compare_and_write": false, 00:23:55.266 "abort": true, 00:23:55.266 "seek_hole": false, 00:23:55.266 "seek_data": false, 00:23:55.266 "copy": true, 00:23:55.266 "nvme_iov_md": false 00:23:55.266 }, 00:23:55.266 "memory_domains": [ 00:23:55.266 { 00:23:55.266 "dma_device_id": "system", 00:23:55.266 "dma_device_type": 1 00:23:55.266 }, 00:23:55.266 { 00:23:55.266 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:55.266 "dma_device_type": 2 00:23:55.266 } 00:23:55.266 ], 00:23:55.266 "driver_specific": { 00:23:55.266 "passthru": { 00:23:55.266 "name": "pt1", 00:23:55.266 "base_bdev_name": "malloc1" 00:23:55.266 } 00:23:55.266 } 00:23:55.266 }' 00:23:55.266 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:55.525 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:55.525 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:23:55.525 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:55.525 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:55.525 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:55.525 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:55.525 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:55.525 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:55.525 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:55.785 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:55.785 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:55.785 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:55.785 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:55.785 17:36:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:56.045 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:56.045 "name": "pt2", 00:23:56.045 "aliases": [ 00:23:56.045 "00000000-0000-0000-0000-000000000002" 00:23:56.045 ], 00:23:56.045 "product_name": "passthru", 00:23:56.045 "block_size": 4096, 00:23:56.045 "num_blocks": 8192, 00:23:56.045 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:56.045 "assigned_rate_limits": { 00:23:56.045 "rw_ios_per_sec": 0, 00:23:56.045 "rw_mbytes_per_sec": 0, 00:23:56.045 "r_mbytes_per_sec": 0, 00:23:56.045 "w_mbytes_per_sec": 0 00:23:56.045 }, 00:23:56.045 "claimed": true, 00:23:56.045 "claim_type": "exclusive_write", 00:23:56.045 "zoned": false, 00:23:56.045 "supported_io_types": { 00:23:56.045 "read": true, 00:23:56.045 "write": true, 00:23:56.045 "unmap": true, 00:23:56.045 "flush": true, 00:23:56.045 "reset": true, 00:23:56.045 "nvme_admin": false, 00:23:56.045 "nvme_io": false, 00:23:56.045 "nvme_io_md": false, 00:23:56.045 "write_zeroes": true, 00:23:56.045 "zcopy": true, 00:23:56.045 "get_zone_info": false, 00:23:56.045 "zone_management": false, 00:23:56.045 "zone_append": false, 00:23:56.045 "compare": false, 00:23:56.045 "compare_and_write": false, 00:23:56.045 "abort": true, 00:23:56.045 "seek_hole": false, 00:23:56.045 "seek_data": false, 00:23:56.045 "copy": true, 00:23:56.045 "nvme_iov_md": false 00:23:56.045 }, 00:23:56.045 "memory_domains": [ 00:23:56.045 { 00:23:56.045 "dma_device_id": "system", 00:23:56.045 "dma_device_type": 1 00:23:56.045 }, 00:23:56.045 { 00:23:56.045 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:56.045 "dma_device_type": 2 00:23:56.045 } 00:23:56.045 ], 00:23:56.045 "driver_specific": { 00:23:56.045 "passthru": { 00:23:56.045 "name": "pt2", 00:23:56.045 "base_bdev_name": "malloc2" 00:23:56.045 } 00:23:56.045 } 00:23:56.045 }' 00:23:56.045 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:56.045 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:56.045 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:23:56.045 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:56.045 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:56.045 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:56.045 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:56.045 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:56.045 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:56.045 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:56.305 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:56.305 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:56.305 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:56.305 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 
-- # jq -r '.[] | .uuid' 00:23:56.564 [2024-07-15 17:36:07.607198] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:56.564 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # '[' c374a549-d394-47a6-9381-e6df452ce657 '!=' c374a549-d394-47a6-9381-e6df452ce657 ']' 00:23:56.564 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:23:56.564 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:56.564 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:23:56.564 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:56.564 [2024-07-15 17:36:07.783460] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:23:56.564 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:56.564 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:56.564 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:56.564 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:56.564 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:56.565 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:56.565 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:56.565 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:56.565 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:56.565 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:56.565 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:56.565 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:56.824 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:56.824 "name": "raid_bdev1", 00:23:56.824 "uuid": "c374a549-d394-47a6-9381-e6df452ce657", 00:23:56.824 "strip_size_kb": 0, 00:23:56.824 "state": "online", 00:23:56.824 "raid_level": "raid1", 00:23:56.824 "superblock": true, 00:23:56.824 "num_base_bdevs": 2, 00:23:56.824 "num_base_bdevs_discovered": 1, 00:23:56.824 "num_base_bdevs_operational": 1, 00:23:56.824 "base_bdevs_list": [ 00:23:56.824 { 00:23:56.824 "name": null, 00:23:56.824 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:56.824 "is_configured": false, 00:23:56.824 "data_offset": 256, 00:23:56.824 "data_size": 7936 00:23:56.824 }, 00:23:56.824 { 00:23:56.824 "name": "pt2", 00:23:56.824 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:56.825 "is_configured": true, 00:23:56.825 "data_offset": 256, 00:23:56.825 "data_size": 7936 00:23:56.825 } 00:23:56.825 ] 00:23:56.825 }' 00:23:56.825 17:36:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:56.825 17:36:07 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 
00:23:57.394 17:36:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:57.654 [2024-07-15 17:36:08.697757] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:57.654 [2024-07-15 17:36:08.697774] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:57.654 [2024-07-15 17:36:08.697807] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:57.654 [2024-07-15 17:36:08.697833] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:57.654 [2024-07-15 17:36:08.697839] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d81820 name raid_bdev1, state offline 00:23:57.654 17:36:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:57.654 17:36:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:23:57.654 17:36:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:23:57.654 17:36:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:23:57.654 17:36:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:23:57.654 17:36:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:23:57.654 17:36:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:57.914 17:36:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:23:57.914 17:36:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:23:57.914 17:36:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:23:57.914 17:36:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:23:57.914 17:36:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@518 -- # i=1 00:23:57.914 17:36:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:58.195 [2024-07-15 17:36:09.279207] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:58.195 [2024-07-15 17:36:09.279234] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:58.195 [2024-07-15 17:36:09.279243] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f2bd70 00:23:58.195 [2024-07-15 17:36:09.279249] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:58.195 [2024-07-15 17:36:09.280509] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:58.195 [2024-07-15 17:36:09.280528] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:58.195 [2024-07-15 17:36:09.280570] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:58.195 [2024-07-15 17:36:09.280588] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:58.195 [2024-07-15 17:36:09.280651] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f30ae0 00:23:58.195 [2024-07-15 17:36:09.280662] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:58.196 [2024-07-15 17:36:09.280808] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f2e160 00:23:58.196 [2024-07-15 17:36:09.280905] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f30ae0 00:23:58.196 [2024-07-15 17:36:09.280910] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f30ae0 00:23:58.196 [2024-07-15 17:36:09.280981] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:58.196 pt2 00:23:58.196 17:36:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:58.196 17:36:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:58.196 17:36:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:58.196 17:36:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:58.196 17:36:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:58.196 17:36:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:58.196 17:36:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:58.196 17:36:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:58.196 17:36:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:58.196 17:36:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:58.196 17:36:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:58.196 17:36:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:58.528 17:36:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:58.528 "name": "raid_bdev1", 00:23:58.528 "uuid": "c374a549-d394-47a6-9381-e6df452ce657", 00:23:58.528 "strip_size_kb": 0, 00:23:58.528 "state": "online", 00:23:58.528 "raid_level": "raid1", 00:23:58.528 "superblock": true, 00:23:58.528 "num_base_bdevs": 2, 00:23:58.528 "num_base_bdevs_discovered": 1, 00:23:58.528 "num_base_bdevs_operational": 1, 00:23:58.528 "base_bdevs_list": [ 00:23:58.528 { 00:23:58.528 "name": null, 00:23:58.528 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:58.528 "is_configured": false, 00:23:58.528 "data_offset": 256, 00:23:58.528 "data_size": 7936 00:23:58.528 }, 00:23:58.528 { 00:23:58.528 "name": "pt2", 00:23:58.528 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:58.528 "is_configured": true, 00:23:58.528 "data_offset": 256, 00:23:58.528 "data_size": 7936 00:23:58.528 } 00:23:58.528 ] 00:23:58.528 }' 00:23:58.528 17:36:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:58.528 17:36:09 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:58.789 17:36:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete 
raid_bdev1 00:23:59.049 [2024-07-15 17:36:10.197515] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:59.049 [2024-07-15 17:36:10.197534] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:59.049 [2024-07-15 17:36:10.197572] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:59.049 [2024-07-15 17:36:10.197601] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:59.049 [2024-07-15 17:36:10.197608] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f30ae0 name raid_bdev1, state offline 00:23:59.049 17:36:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:59.049 17:36:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:23:59.328 17:36:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:23:59.328 17:36:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:23:59.328 17:36:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:23:59.328 17:36:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:59.328 [2024-07-15 17:36:10.590493] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:59.328 [2024-07-15 17:36:10.590522] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:59.328 [2024-07-15 17:36:10.590532] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d833e0 00:23:59.328 [2024-07-15 17:36:10.590539] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:59.328 [2024-07-15 17:36:10.591807] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:59.328 [2024-07-15 17:36:10.591826] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:59.328 [2024-07-15 17:36:10.591873] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:23:59.328 [2024-07-15 17:36:10.591890] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:59.328 [2024-07-15 17:36:10.591966] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:23:59.328 [2024-07-15 17:36:10.591973] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:59.328 [2024-07-15 17:36:10.591981] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f30fa0 name raid_bdev1, state configuring 00:23:59.328 [2024-07-15 17:36:10.591994] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:59.328 [2024-07-15 17:36:10.592034] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f30fa0 00:23:59.328 [2024-07-15 17:36:10.592039] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:59.328 [2024-07-15 17:36:10.592176] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f2d850 00:23:59.328 [2024-07-15 17:36:10.592271] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f30fa0 
00:23:59.328 [2024-07-15 17:36:10.592276] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f30fa0 00:23:59.328 [2024-07-15 17:36:10.592351] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:59.328 pt1 00:23:59.328 17:36:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:23:59.328 17:36:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:59.328 17:36:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:59.328 17:36:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:59.328 17:36:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:59.328 17:36:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:59.328 17:36:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:59.328 17:36:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:59.328 17:36:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:59.328 17:36:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:59.328 17:36:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:59.328 17:36:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:59.329 17:36:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:59.589 17:36:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:59.589 "name": "raid_bdev1", 00:23:59.589 "uuid": "c374a549-d394-47a6-9381-e6df452ce657", 00:23:59.589 "strip_size_kb": 0, 00:23:59.589 "state": "online", 00:23:59.589 "raid_level": "raid1", 00:23:59.589 "superblock": true, 00:23:59.589 "num_base_bdevs": 2, 00:23:59.589 "num_base_bdevs_discovered": 1, 00:23:59.589 "num_base_bdevs_operational": 1, 00:23:59.589 "base_bdevs_list": [ 00:23:59.589 { 00:23:59.589 "name": null, 00:23:59.589 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:59.589 "is_configured": false, 00:23:59.589 "data_offset": 256, 00:23:59.589 "data_size": 7936 00:23:59.589 }, 00:23:59.589 { 00:23:59.589 "name": "pt2", 00:23:59.589 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:59.589 "is_configured": true, 00:23:59.589 "data_offset": 256, 00:23:59.589 "data_size": 7936 00:23:59.589 } 00:23:59.589 ] 00:23:59.589 }' 00:23:59.589 17:36:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:59.589 17:36:10 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:24:00.161 17:36:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:24:00.161 17:36:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:24:00.421 17:36:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:24:00.421 17:36:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:00.421 17:36:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:24:00.681 [2024-07-15 17:36:11.721521] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:00.681 17:36:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' c374a549-d394-47a6-9381-e6df452ce657 '!=' c374a549-d394-47a6-9381-e6df452ce657 ']' 00:24:00.681 17:36:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@562 -- # killprocess 2894135 00:24:00.681 17:36:11 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@948 -- # '[' -z 2894135 ']' 00:24:00.681 17:36:11 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@952 -- # kill -0 2894135 00:24:00.681 17:36:11 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # uname 00:24:00.681 17:36:11 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:00.681 17:36:11 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2894135 00:24:00.681 17:36:11 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:00.681 17:36:11 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:00.681 17:36:11 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2894135' 00:24:00.681 killing process with pid 2894135 00:24:00.681 17:36:11 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@967 -- # kill 2894135 00:24:00.681 [2024-07-15 17:36:11.788425] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:00.681 [2024-07-15 17:36:11.788460] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:00.681 [2024-07-15 17:36:11.788489] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:00.681 [2024-07-15 17:36:11.788494] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f30fa0 name raid_bdev1, state offline 00:24:00.681 17:36:11 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@972 -- # wait 2894135 00:24:00.681 [2024-07-15 17:36:11.797627] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:00.681 17:36:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@564 -- # return 0 00:24:00.681 00:24:00.681 real 0m12.998s 00:24:00.681 user 0m24.028s 00:24:00.681 sys 0m2.020s 00:24:00.681 17:36:11 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:00.681 17:36:11 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:24:00.681 ************************************ 00:24:00.681 END TEST raid_superblock_test_4k 00:24:00.681 ************************************ 00:24:00.681 17:36:11 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:00.681 17:36:11 bdev_raid -- bdev/bdev_raid.sh@900 -- # '[' true = true ']' 00:24:00.681 17:36:11 bdev_raid -- bdev/bdev_raid.sh@901 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:24:00.681 17:36:11 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:00.681 17:36:11 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:00.681 17:36:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:00.942 
************************************ 00:24:00.942 START TEST raid_rebuild_test_sb_4k 00:24:00.942 ************************************ 00:24:00.942 17:36:11 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:24:00.942 17:36:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:00.942 17:36:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:24:00.942 17:36:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:24:00.942 17:36:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:24:00.942 17:36:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:00.942 17:36:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:00.942 17:36:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:00.942 17:36:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:00.942 17:36:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:00.942 17:36:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:00.942 17:36:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:00.942 17:36:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:00.942 17:36:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:00.942 17:36:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:00.942 17:36:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:00.942 17:36:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:00.942 17:36:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:00.942 17:36:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:00.942 17:36:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:00.942 17:36:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:00.942 17:36:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:00.942 17:36:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:00.942 17:36:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:24:00.942 17:36:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:24:00.942 17:36:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # raid_pid=2897167 00:24:00.942 17:36:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # waitforlisten 2897167 /var/tmp/spdk-raid.sock 00:24:00.942 17:36:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 2897167 ']' 00:24:00.942 17:36:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:00.942 17:36:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:00.942 17:36:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@595 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:00.942 17:36:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:00.942 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:00.942 17:36:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:00.942 17:36:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:00.942 [2024-07-15 17:36:12.057800] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:24:00.942 [2024-07-15 17:36:12.057855] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2897167 ] 00:24:00.942 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:00.942 Zero copy mechanism will not be used. 00:24:00.942 [2024-07-15 17:36:12.147353] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:00.942 [2024-07-15 17:36:12.214283] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:01.203 [2024-07-15 17:36:12.256658] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:01.203 [2024-07-15 17:36:12.256681] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:01.772 17:36:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:01.772 17:36:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:24:01.772 17:36:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:01.772 17:36:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:24:02.031 BaseBdev1_malloc 00:24:02.031 17:36:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:02.031 [2024-07-15 17:36:13.255300] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:02.031 [2024-07-15 17:36:13.255333] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:02.031 [2024-07-15 17:36:13.255345] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bb8d30 00:24:02.031 [2024-07-15 17:36:13.255351] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:02.031 [2024-07-15 17:36:13.256637] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:02.031 [2024-07-15 17:36:13.256655] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:02.031 BaseBdev1 00:24:02.031 17:36:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:02.031 17:36:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b 
BaseBdev2_malloc 00:24:02.291 BaseBdev2_malloc 00:24:02.291 17:36:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:02.551 [2024-07-15 17:36:13.622258] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:02.551 [2024-07-15 17:36:13.622288] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:02.551 [2024-07-15 17:36:13.622299] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d6bc60 00:24:02.551 [2024-07-15 17:36:13.622305] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:02.551 [2024-07-15 17:36:13.623500] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:02.551 [2024-07-15 17:36:13.623518] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:02.551 BaseBdev2 00:24:02.551 17:36:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:24:02.551 spare_malloc 00:24:02.551 17:36:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:02.810 spare_delay 00:24:02.810 17:36:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:03.071 [2024-07-15 17:36:14.213610] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:03.071 [2024-07-15 17:36:14.213639] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:03.071 [2024-07-15 17:36:14.213656] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d5bec0 00:24:03.071 [2024-07-15 17:36:14.213661] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:03.071 [2024-07-15 17:36:14.214852] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:03.071 [2024-07-15 17:36:14.214871] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:03.071 spare 00:24:03.071 17:36:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:03.333 [2024-07-15 17:36:14.406117] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:03.333 [2024-07-15 17:36:14.407112] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:03.333 [2024-07-15 17:36:14.407226] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d531c0 00:24:03.333 [2024-07-15 17:36:14.407233] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:03.333 [2024-07-15 17:36:14.407376] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bb1190 00:24:03.333 [2024-07-15 17:36:14.407486] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d531c0 00:24:03.333 [2024-07-15 17:36:14.407491] 
bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d531c0 00:24:03.333 [2024-07-15 17:36:14.407559] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:03.333 17:36:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:03.333 17:36:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:03.333 17:36:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:03.333 17:36:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:03.333 17:36:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:03.333 17:36:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:03.333 17:36:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:03.333 17:36:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:03.333 17:36:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:03.333 17:36:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:03.333 17:36:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:03.333 17:36:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:03.333 17:36:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:03.333 "name": "raid_bdev1", 00:24:03.333 "uuid": "f84cbcd8-5ed3-40de-8917-9119d4885709", 00:24:03.333 "strip_size_kb": 0, 00:24:03.333 "state": "online", 00:24:03.333 "raid_level": "raid1", 00:24:03.333 "superblock": true, 00:24:03.333 "num_base_bdevs": 2, 00:24:03.333 "num_base_bdevs_discovered": 2, 00:24:03.333 "num_base_bdevs_operational": 2, 00:24:03.333 "base_bdevs_list": [ 00:24:03.333 { 00:24:03.333 "name": "BaseBdev1", 00:24:03.333 "uuid": "4e6865a3-9029-5959-b169-c14cce8723bd", 00:24:03.333 "is_configured": true, 00:24:03.333 "data_offset": 256, 00:24:03.333 "data_size": 7936 00:24:03.333 }, 00:24:03.333 { 00:24:03.333 "name": "BaseBdev2", 00:24:03.333 "uuid": "0f340a4f-09de-5cc4-9cb4-2cdc9fc8bcf5", 00:24:03.333 "is_configured": true, 00:24:03.333 "data_offset": 256, 00:24:03.333 "data_size": 7936 00:24:03.333 } 00:24:03.333 ] 00:24:03.333 }' 00:24:03.333 17:36:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:03.333 17:36:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:03.904 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:03.904 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:04.164 [2024-07-15 17:36:15.328617] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:04.164 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:24:04.164 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:04.164 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:04.424 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:24:04.424 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:24:04.424 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:24:04.424 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:24:04.424 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:24:04.424 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:04.424 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:24:04.424 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:04.424 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:04.424 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:04.424 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:24:04.424 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:04.424 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:04.424 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:24:04.424 [2024-07-15 17:36:15.721449] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bb1190 00:24:04.683 /dev/nbd0 00:24:04.683 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:04.683 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:04.683 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:04.683 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:24:04.683 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:04.683 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:04.683 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:04.683 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:24:04.683 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:04.683 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:04.683 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:04.683 1+0 records in 00:24:04.683 1+0 records out 00:24:04.683 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000184862 s, 22.2 MB/s 00:24:04.683 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:04.683 17:36:15 
bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:24:04.683 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:04.683 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:04.683 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:24:04.683 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:04.683 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:04.683 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:24:04.683 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:24:04.683 17:36:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:24:05.251 7936+0 records in 00:24:05.251 7936+0 records out 00:24:05.251 32505856 bytes (33 MB, 31 MiB) copied, 0.602159 s, 54.0 MB/s 00:24:05.251 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:05.251 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:05.251 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:05.251 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:05.251 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:24:05.251 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:05.251 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:05.511 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:05.511 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:05.511 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:05.511 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:05.511 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:05.511 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:05.511 [2024-07-15 17:36:16.570640] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:05.511 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:24:05.511 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:24:05.511 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:05.511 [2024-07-15 17:36:16.748457] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:05.511 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:05.511 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
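For reference, the nbd write/remove sequence traced above reduces to the following minimal sketch (a paraphrase of the trace, not additional test output: workspace paths are shortened, and the RPC socket, bdev and nbd names are the ones this test already uses):

    # export the raid bdev over NBD and fill it with random data (7936 blocks of 4096 bytes)
    rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0
    dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct
    rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0

    # degrade the array by dropping one base bdev, then re-read its state
    rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1
    rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "raid_bdev1")'

The trace that follows is the test asserting, from that JSON, that raid_bdev1 is still online as raid1 with one of two base bdevs discovered and operational.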
00:24:05.512 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:05.512 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:05.512 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:05.512 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:05.512 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:05.512 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:05.512 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:05.512 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:05.512 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:05.512 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:05.771 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:05.771 "name": "raid_bdev1", 00:24:05.771 "uuid": "f84cbcd8-5ed3-40de-8917-9119d4885709", 00:24:05.771 "strip_size_kb": 0, 00:24:05.771 "state": "online", 00:24:05.771 "raid_level": "raid1", 00:24:05.771 "superblock": true, 00:24:05.771 "num_base_bdevs": 2, 00:24:05.771 "num_base_bdevs_discovered": 1, 00:24:05.771 "num_base_bdevs_operational": 1, 00:24:05.771 "base_bdevs_list": [ 00:24:05.771 { 00:24:05.771 "name": null, 00:24:05.771 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:05.771 "is_configured": false, 00:24:05.771 "data_offset": 256, 00:24:05.771 "data_size": 7936 00:24:05.771 }, 00:24:05.771 { 00:24:05.771 "name": "BaseBdev2", 00:24:05.771 "uuid": "0f340a4f-09de-5cc4-9cb4-2cdc9fc8bcf5", 00:24:05.771 "is_configured": true, 00:24:05.771 "data_offset": 256, 00:24:05.771 "data_size": 7936 00:24:05.771 } 00:24:05.771 ] 00:24:05.771 }' 00:24:05.771 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:05.771 17:36:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:06.341 17:36:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:06.600 [2024-07-15 17:36:17.678804] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:06.600 [2024-07-15 17:36:17.682243] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bba3b0 00:24:06.600 [2024-07-15 17:36:17.683821] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:06.600 17:36:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:07.538 17:36:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:07.538 17:36:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:07.538 17:36:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:07.538 17:36:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:07.538 
17:36:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:07.538 17:36:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:07.538 17:36:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:07.797 17:36:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:07.797 "name": "raid_bdev1", 00:24:07.797 "uuid": "f84cbcd8-5ed3-40de-8917-9119d4885709", 00:24:07.797 "strip_size_kb": 0, 00:24:07.797 "state": "online", 00:24:07.797 "raid_level": "raid1", 00:24:07.797 "superblock": true, 00:24:07.797 "num_base_bdevs": 2, 00:24:07.797 "num_base_bdevs_discovered": 2, 00:24:07.797 "num_base_bdevs_operational": 2, 00:24:07.797 "process": { 00:24:07.797 "type": "rebuild", 00:24:07.797 "target": "spare", 00:24:07.797 "progress": { 00:24:07.797 "blocks": 2816, 00:24:07.797 "percent": 35 00:24:07.797 } 00:24:07.797 }, 00:24:07.797 "base_bdevs_list": [ 00:24:07.797 { 00:24:07.797 "name": "spare", 00:24:07.797 "uuid": "5b1ce5a4-c2fb-53bc-80c9-7e41e93463d0", 00:24:07.797 "is_configured": true, 00:24:07.797 "data_offset": 256, 00:24:07.797 "data_size": 7936 00:24:07.797 }, 00:24:07.797 { 00:24:07.797 "name": "BaseBdev2", 00:24:07.797 "uuid": "0f340a4f-09de-5cc4-9cb4-2cdc9fc8bcf5", 00:24:07.797 "is_configured": true, 00:24:07.797 "data_offset": 256, 00:24:07.797 "data_size": 7936 00:24:07.797 } 00:24:07.797 ] 00:24:07.797 }' 00:24:07.797 17:36:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:07.797 17:36:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:07.797 17:36:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:07.797 17:36:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:07.797 17:36:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:08.056 [2024-07-15 17:36:19.164289] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:08.056 [2024-07-15 17:36:19.192671] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:08.056 [2024-07-15 17:36:19.192703] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:08.056 [2024-07-15 17:36:19.192718] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:08.056 [2024-07-15 17:36:19.192723] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:08.056 17:36:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:08.056 17:36:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:08.056 17:36:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:08.056 17:36:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:08.056 17:36:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:08.056 17:36:19 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:08.056 17:36:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:08.056 17:36:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:08.056 17:36:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:08.056 17:36:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:08.056 17:36:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:08.056 17:36:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:08.315 17:36:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:08.315 "name": "raid_bdev1", 00:24:08.315 "uuid": "f84cbcd8-5ed3-40de-8917-9119d4885709", 00:24:08.315 "strip_size_kb": 0, 00:24:08.315 "state": "online", 00:24:08.315 "raid_level": "raid1", 00:24:08.315 "superblock": true, 00:24:08.315 "num_base_bdevs": 2, 00:24:08.315 "num_base_bdevs_discovered": 1, 00:24:08.315 "num_base_bdevs_operational": 1, 00:24:08.315 "base_bdevs_list": [ 00:24:08.315 { 00:24:08.315 "name": null, 00:24:08.315 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:08.315 "is_configured": false, 00:24:08.315 "data_offset": 256, 00:24:08.315 "data_size": 7936 00:24:08.315 }, 00:24:08.315 { 00:24:08.315 "name": "BaseBdev2", 00:24:08.315 "uuid": "0f340a4f-09de-5cc4-9cb4-2cdc9fc8bcf5", 00:24:08.315 "is_configured": true, 00:24:08.315 "data_offset": 256, 00:24:08.315 "data_size": 7936 00:24:08.315 } 00:24:08.315 ] 00:24:08.315 }' 00:24:08.315 17:36:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:08.315 17:36:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:08.884 17:36:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:08.884 17:36:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:08.884 17:36:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:08.884 17:36:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:08.884 17:36:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:08.884 17:36:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:08.884 17:36:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:08.884 17:36:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:08.884 "name": "raid_bdev1", 00:24:08.884 "uuid": "f84cbcd8-5ed3-40de-8917-9119d4885709", 00:24:08.884 "strip_size_kb": 0, 00:24:08.884 "state": "online", 00:24:08.884 "raid_level": "raid1", 00:24:08.884 "superblock": true, 00:24:08.884 "num_base_bdevs": 2, 00:24:08.884 "num_base_bdevs_discovered": 1, 00:24:08.884 "num_base_bdevs_operational": 1, 00:24:08.884 "base_bdevs_list": [ 00:24:08.884 { 00:24:08.884 "name": null, 00:24:08.884 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:08.884 "is_configured": false, 00:24:08.884 "data_offset": 
256, 00:24:08.884 "data_size": 7936 00:24:08.884 }, 00:24:08.884 { 00:24:08.884 "name": "BaseBdev2", 00:24:08.884 "uuid": "0f340a4f-09de-5cc4-9cb4-2cdc9fc8bcf5", 00:24:08.884 "is_configured": true, 00:24:08.884 "data_offset": 256, 00:24:08.884 "data_size": 7936 00:24:08.884 } 00:24:08.884 ] 00:24:08.884 }' 00:24:08.884 17:36:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:09.143 17:36:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:09.143 17:36:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:09.143 17:36:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:09.143 17:36:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:09.143 [2024-07-15 17:36:20.411753] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:09.143 [2024-07-15 17:36:20.415078] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bba3b0 00:24:09.143 [2024-07-15 17:36:20.416208] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:09.143 17:36:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:10.521 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:10.521 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:10.521 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:10.521 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:10.521 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:10.522 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:10.522 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:10.522 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:10.522 "name": "raid_bdev1", 00:24:10.522 "uuid": "f84cbcd8-5ed3-40de-8917-9119d4885709", 00:24:10.522 "strip_size_kb": 0, 00:24:10.522 "state": "online", 00:24:10.522 "raid_level": "raid1", 00:24:10.522 "superblock": true, 00:24:10.522 "num_base_bdevs": 2, 00:24:10.522 "num_base_bdevs_discovered": 2, 00:24:10.522 "num_base_bdevs_operational": 2, 00:24:10.522 "process": { 00:24:10.522 "type": "rebuild", 00:24:10.522 "target": "spare", 00:24:10.522 "progress": { 00:24:10.522 "blocks": 2816, 00:24:10.522 "percent": 35 00:24:10.522 } 00:24:10.522 }, 00:24:10.522 "base_bdevs_list": [ 00:24:10.522 { 00:24:10.522 "name": "spare", 00:24:10.522 "uuid": "5b1ce5a4-c2fb-53bc-80c9-7e41e93463d0", 00:24:10.522 "is_configured": true, 00:24:10.522 "data_offset": 256, 00:24:10.522 "data_size": 7936 00:24:10.522 }, 00:24:10.522 { 00:24:10.522 "name": "BaseBdev2", 00:24:10.522 "uuid": "0f340a4f-09de-5cc4-9cb4-2cdc9fc8bcf5", 00:24:10.522 "is_configured": true, 00:24:10.522 "data_offset": 256, 00:24:10.522 "data_size": 7936 00:24:10.522 } 00:24:10.522 ] 00:24:10.522 }' 00:24:10.522 
17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:10.522 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:10.522 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:10.522 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:10.522 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:24:10.522 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:24:10.522 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:24:10.522 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:24:10.522 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:10.522 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:24:10.522 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@705 -- # local timeout=875 00:24:10.522 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:10.522 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:10.522 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:10.522 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:10.522 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:10.522 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:10.522 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:10.522 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:10.781 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:10.781 "name": "raid_bdev1", 00:24:10.781 "uuid": "f84cbcd8-5ed3-40de-8917-9119d4885709", 00:24:10.781 "strip_size_kb": 0, 00:24:10.781 "state": "online", 00:24:10.781 "raid_level": "raid1", 00:24:10.781 "superblock": true, 00:24:10.781 "num_base_bdevs": 2, 00:24:10.781 "num_base_bdevs_discovered": 2, 00:24:10.781 "num_base_bdevs_operational": 2, 00:24:10.781 "process": { 00:24:10.781 "type": "rebuild", 00:24:10.781 "target": "spare", 00:24:10.781 "progress": { 00:24:10.781 "blocks": 3584, 00:24:10.781 "percent": 45 00:24:10.781 } 00:24:10.781 }, 00:24:10.781 "base_bdevs_list": [ 00:24:10.781 { 00:24:10.781 "name": "spare", 00:24:10.781 "uuid": "5b1ce5a4-c2fb-53bc-80c9-7e41e93463d0", 00:24:10.781 "is_configured": true, 00:24:10.781 "data_offset": 256, 00:24:10.781 "data_size": 7936 00:24:10.781 }, 00:24:10.781 { 00:24:10.781 "name": "BaseBdev2", 00:24:10.781 "uuid": "0f340a4f-09de-5cc4-9cb4-2cdc9fc8bcf5", 00:24:10.781 "is_configured": true, 00:24:10.781 "data_offset": 256, 00:24:10.781 "data_size": 7936 00:24:10.781 } 00:24:10.781 ] 00:24:10.781 }' 00:24:10.781 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:10.781 
17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:10.781 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:10.781 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:10.781 17:36:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:11.720 17:36:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:11.720 17:36:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:11.720 17:36:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:11.720 17:36:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:11.720 17:36:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:11.720 17:36:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:11.720 17:36:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.720 17:36:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:11.980 17:36:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:11.980 "name": "raid_bdev1", 00:24:11.980 "uuid": "f84cbcd8-5ed3-40de-8917-9119d4885709", 00:24:11.980 "strip_size_kb": 0, 00:24:11.980 "state": "online", 00:24:11.980 "raid_level": "raid1", 00:24:11.980 "superblock": true, 00:24:11.980 "num_base_bdevs": 2, 00:24:11.980 "num_base_bdevs_discovered": 2, 00:24:11.980 "num_base_bdevs_operational": 2, 00:24:11.980 "process": { 00:24:11.980 "type": "rebuild", 00:24:11.980 "target": "spare", 00:24:11.980 "progress": { 00:24:11.980 "blocks": 6912, 00:24:11.980 "percent": 87 00:24:11.980 } 00:24:11.980 }, 00:24:11.980 "base_bdevs_list": [ 00:24:11.980 { 00:24:11.980 "name": "spare", 00:24:11.980 "uuid": "5b1ce5a4-c2fb-53bc-80c9-7e41e93463d0", 00:24:11.980 "is_configured": true, 00:24:11.980 "data_offset": 256, 00:24:11.980 "data_size": 7936 00:24:11.980 }, 00:24:11.980 { 00:24:11.980 "name": "BaseBdev2", 00:24:11.980 "uuid": "0f340a4f-09de-5cc4-9cb4-2cdc9fc8bcf5", 00:24:11.980 "is_configured": true, 00:24:11.980 "data_offset": 256, 00:24:11.980 "data_size": 7936 00:24:11.980 } 00:24:11.980 ] 00:24:11.980 }' 00:24:11.980 17:36:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:11.980 17:36:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:11.980 17:36:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:12.239 17:36:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:12.239 17:36:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:12.239 [2024-07-15 17:36:23.534400] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:12.239 [2024-07-15 17:36:23.534443] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:12.239 [2024-07-15 17:36:23.534509] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:24:13.177 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:13.177 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:13.177 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:13.177 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:13.177 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:13.177 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:13.177 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:13.177 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:13.455 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:13.455 "name": "raid_bdev1", 00:24:13.455 "uuid": "f84cbcd8-5ed3-40de-8917-9119d4885709", 00:24:13.455 "strip_size_kb": 0, 00:24:13.455 "state": "online", 00:24:13.455 "raid_level": "raid1", 00:24:13.455 "superblock": true, 00:24:13.455 "num_base_bdevs": 2, 00:24:13.455 "num_base_bdevs_discovered": 2, 00:24:13.455 "num_base_bdevs_operational": 2, 00:24:13.455 "base_bdevs_list": [ 00:24:13.455 { 00:24:13.455 "name": "spare", 00:24:13.455 "uuid": "5b1ce5a4-c2fb-53bc-80c9-7e41e93463d0", 00:24:13.455 "is_configured": true, 00:24:13.455 "data_offset": 256, 00:24:13.455 "data_size": 7936 00:24:13.455 }, 00:24:13.455 { 00:24:13.455 "name": "BaseBdev2", 00:24:13.455 "uuid": "0f340a4f-09de-5cc4-9cb4-2cdc9fc8bcf5", 00:24:13.455 "is_configured": true, 00:24:13.455 "data_offset": 256, 00:24:13.455 "data_size": 7936 00:24:13.455 } 00:24:13.455 ] 00:24:13.455 }' 00:24:13.455 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:13.455 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:13.455 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:13.455 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:13.455 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # break 00:24:13.455 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:13.455 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:13.455 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:13.455 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:13.455 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:13.455 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:13.455 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:13.725 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # 
raid_bdev_info='{ 00:24:13.725 "name": "raid_bdev1", 00:24:13.725 "uuid": "f84cbcd8-5ed3-40de-8917-9119d4885709", 00:24:13.725 "strip_size_kb": 0, 00:24:13.725 "state": "online", 00:24:13.725 "raid_level": "raid1", 00:24:13.725 "superblock": true, 00:24:13.725 "num_base_bdevs": 2, 00:24:13.725 "num_base_bdevs_discovered": 2, 00:24:13.725 "num_base_bdevs_operational": 2, 00:24:13.725 "base_bdevs_list": [ 00:24:13.725 { 00:24:13.725 "name": "spare", 00:24:13.725 "uuid": "5b1ce5a4-c2fb-53bc-80c9-7e41e93463d0", 00:24:13.725 "is_configured": true, 00:24:13.725 "data_offset": 256, 00:24:13.725 "data_size": 7936 00:24:13.725 }, 00:24:13.725 { 00:24:13.725 "name": "BaseBdev2", 00:24:13.725 "uuid": "0f340a4f-09de-5cc4-9cb4-2cdc9fc8bcf5", 00:24:13.725 "is_configured": true, 00:24:13.725 "data_offset": 256, 00:24:13.725 "data_size": 7936 00:24:13.725 } 00:24:13.725 ] 00:24:13.725 }' 00:24:13.725 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:13.725 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:13.725 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:13.725 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:13.725 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:13.725 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:13.725 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:13.725 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:13.725 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:13.725 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:13.725 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:13.725 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:13.725 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:13.725 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:13.725 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:13.725 17:36:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:13.986 17:36:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:13.986 "name": "raid_bdev1", 00:24:13.986 "uuid": "f84cbcd8-5ed3-40de-8917-9119d4885709", 00:24:13.986 "strip_size_kb": 0, 00:24:13.986 "state": "online", 00:24:13.986 "raid_level": "raid1", 00:24:13.986 "superblock": true, 00:24:13.986 "num_base_bdevs": 2, 00:24:13.986 "num_base_bdevs_discovered": 2, 00:24:13.986 "num_base_bdevs_operational": 2, 00:24:13.986 "base_bdevs_list": [ 00:24:13.986 { 00:24:13.986 "name": "spare", 00:24:13.986 "uuid": "5b1ce5a4-c2fb-53bc-80c9-7e41e93463d0", 00:24:13.986 "is_configured": true, 00:24:13.986 "data_offset": 256, 00:24:13.986 "data_size": 7936 00:24:13.986 }, 00:24:13.986 { 00:24:13.986 "name": 
"BaseBdev2", 00:24:13.986 "uuid": "0f340a4f-09de-5cc4-9cb4-2cdc9fc8bcf5", 00:24:13.986 "is_configured": true, 00:24:13.986 "data_offset": 256, 00:24:13.986 "data_size": 7936 00:24:13.986 } 00:24:13.986 ] 00:24:13.986 }' 00:24:13.986 17:36:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:13.986 17:36:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:14.558 17:36:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:14.558 [2024-07-15 17:36:25.764052] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:14.558 [2024-07-15 17:36:25.764073] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:14.558 [2024-07-15 17:36:25.764116] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:14.558 [2024-07-15 17:36:25.764156] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:14.558 [2024-07-15 17:36:25.764162] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d531c0 name raid_bdev1, state offline 00:24:14.558 17:36:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:14.558 17:36:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # jq length 00:24:14.818 17:36:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:14.818 17:36:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:14.818 17:36:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:24:14.818 17:36:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:24:14.818 17:36:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:14.818 17:36:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:24:14.818 17:36:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:14.818 17:36:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:14.818 17:36:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:14.819 17:36:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:24:14.819 17:36:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:14.819 17:36:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:14.819 17:36:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:24:15.079 /dev/nbd0 00:24:15.079 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:15.079 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:15.079 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:15.079 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@867 -- # local i 00:24:15.079 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:15.079 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:15.079 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:15.079 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:24:15.079 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:15.079 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:15.079 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:15.079 1+0 records in 00:24:15.079 1+0 records out 00:24:15.079 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000232753 s, 17.6 MB/s 00:24:15.079 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:15.079 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:24:15.079 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:15.079 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:15.079 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:24:15.079 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:15.079 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:15.079 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:24:15.340 /dev/nbd1 00:24:15.340 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:15.340 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:15.340 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:15.340 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:24:15.340 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:15.340 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:15.340 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:15.340 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:24:15.340 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:15.340 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:15.340 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:15.340 1+0 records in 00:24:15.340 1+0 records out 00:24:15.340 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000181502 s, 22.6 MB/s 00:24:15.340 
17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:15.340 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:24:15.340 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:15.340 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:15.340 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:24:15.340 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:15.340 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:15.340 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:24:15.340 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:24:15.340 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:15.340 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:15.340 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:15.340 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:24:15.340 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:15.340 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:15.600 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:15.600 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:15.600 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:15.600 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:15.600 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:15.600 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:15.600 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:24:15.600 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:24:15.600 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:15.600 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:15.600 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:15.861 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:15.861 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:15.861 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:15.861 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:15.861 17:36:26 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:15.861 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:24:15.861 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:24:15.861 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:24:15.861 17:36:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:15.861 17:36:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:16.121 [2024-07-15 17:36:27.271956] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:16.121 [2024-07-15 17:36:27.271990] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:16.121 [2024-07-15 17:36:27.272007] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bb82c0 00:24:16.121 [2024-07-15 17:36:27.272014] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:16.121 [2024-07-15 17:36:27.273389] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:16.121 [2024-07-15 17:36:27.273412] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:16.121 [2024-07-15 17:36:27.273470] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:16.121 [2024-07-15 17:36:27.273490] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:16.121 [2024-07-15 17:36:27.273567] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:16.121 spare 00:24:16.121 17:36:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:16.121 17:36:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:16.122 17:36:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:16.122 17:36:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:16.122 17:36:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:16.122 17:36:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:16.122 17:36:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:16.122 17:36:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:16.122 17:36:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:16.122 17:36:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:16.122 17:36:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:16.122 17:36:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:16.122 [2024-07-15 17:36:27.373855] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d51330 00:24:16.122 [2024-07-15 
17:36:27.373863] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:16.122 [2024-07-15 17:36:27.374015] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d50e40 00:24:16.122 [2024-07-15 17:36:27.374129] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d51330 00:24:16.122 [2024-07-15 17:36:27.374135] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d51330 00:24:16.122 [2024-07-15 17:36:27.374213] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:16.382 17:36:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:16.382 "name": "raid_bdev1", 00:24:16.382 "uuid": "f84cbcd8-5ed3-40de-8917-9119d4885709", 00:24:16.382 "strip_size_kb": 0, 00:24:16.382 "state": "online", 00:24:16.382 "raid_level": "raid1", 00:24:16.382 "superblock": true, 00:24:16.382 "num_base_bdevs": 2, 00:24:16.382 "num_base_bdevs_discovered": 2, 00:24:16.382 "num_base_bdevs_operational": 2, 00:24:16.382 "base_bdevs_list": [ 00:24:16.382 { 00:24:16.382 "name": "spare", 00:24:16.382 "uuid": "5b1ce5a4-c2fb-53bc-80c9-7e41e93463d0", 00:24:16.382 "is_configured": true, 00:24:16.382 "data_offset": 256, 00:24:16.382 "data_size": 7936 00:24:16.382 }, 00:24:16.382 { 00:24:16.382 "name": "BaseBdev2", 00:24:16.382 "uuid": "0f340a4f-09de-5cc4-9cb4-2cdc9fc8bcf5", 00:24:16.382 "is_configured": true, 00:24:16.382 "data_offset": 256, 00:24:16.382 "data_size": 7936 00:24:16.382 } 00:24:16.382 ] 00:24:16.382 }' 00:24:16.382 17:36:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:16.382 17:36:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:16.953 17:36:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:16.953 17:36:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:16.953 17:36:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:16.953 17:36:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:16.953 17:36:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:16.953 17:36:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:16.953 17:36:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:16.953 17:36:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:16.953 "name": "raid_bdev1", 00:24:16.953 "uuid": "f84cbcd8-5ed3-40de-8917-9119d4885709", 00:24:16.953 "strip_size_kb": 0, 00:24:16.953 "state": "online", 00:24:16.953 "raid_level": "raid1", 00:24:16.953 "superblock": true, 00:24:16.953 "num_base_bdevs": 2, 00:24:16.953 "num_base_bdevs_discovered": 2, 00:24:16.953 "num_base_bdevs_operational": 2, 00:24:16.953 "base_bdevs_list": [ 00:24:16.953 { 00:24:16.953 "name": "spare", 00:24:16.953 "uuid": "5b1ce5a4-c2fb-53bc-80c9-7e41e93463d0", 00:24:16.953 "is_configured": true, 00:24:16.953 "data_offset": 256, 00:24:16.953 "data_size": 7936 00:24:16.953 }, 00:24:16.953 { 00:24:16.953 "name": "BaseBdev2", 00:24:16.953 "uuid": "0f340a4f-09de-5cc4-9cb4-2cdc9fc8bcf5", 00:24:16.953 "is_configured": true, 00:24:16.953 
"data_offset": 256, 00:24:16.953 "data_size": 7936 00:24:16.953 } 00:24:16.953 ] 00:24:16.953 }' 00:24:16.953 17:36:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:16.953 17:36:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:16.953 17:36:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:17.213 17:36:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:17.213 17:36:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:17.213 17:36:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:24:17.213 17:36:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:24:17.213 17:36:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:17.474 [2024-07-15 17:36:28.627437] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:17.474 17:36:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:17.474 17:36:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:17.474 17:36:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:17.474 17:36:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:17.474 17:36:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:17.474 17:36:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:17.474 17:36:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:17.474 17:36:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:17.474 17:36:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:17.474 17:36:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:17.474 17:36:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:17.474 17:36:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:17.734 17:36:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:17.734 "name": "raid_bdev1", 00:24:17.734 "uuid": "f84cbcd8-5ed3-40de-8917-9119d4885709", 00:24:17.734 "strip_size_kb": 0, 00:24:17.734 "state": "online", 00:24:17.734 "raid_level": "raid1", 00:24:17.734 "superblock": true, 00:24:17.734 "num_base_bdevs": 2, 00:24:17.734 "num_base_bdevs_discovered": 1, 00:24:17.734 "num_base_bdevs_operational": 1, 00:24:17.734 "base_bdevs_list": [ 00:24:17.734 { 00:24:17.734 "name": null, 00:24:17.734 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:17.734 "is_configured": false, 00:24:17.735 "data_offset": 256, 00:24:17.735 "data_size": 7936 00:24:17.735 }, 00:24:17.735 { 00:24:17.735 "name": "BaseBdev2", 00:24:17.735 "uuid": 
"0f340a4f-09de-5cc4-9cb4-2cdc9fc8bcf5", 00:24:17.735 "is_configured": true, 00:24:17.735 "data_offset": 256, 00:24:17.735 "data_size": 7936 00:24:17.735 } 00:24:17.735 ] 00:24:17.735 }' 00:24:17.735 17:36:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:17.735 17:36:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:18.305 17:36:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:18.305 [2024-07-15 17:36:29.545773] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:18.305 [2024-07-15 17:36:29.545871] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:18.305 [2024-07-15 17:36:29.545880] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:18.305 [2024-07-15 17:36:29.545900] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:18.305 [2024-07-15 17:36:29.549212] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bb77d0 00:24:18.305 [2024-07-15 17:36:29.550791] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:18.305 17:36:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # sleep 1 00:24:19.686 17:36:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:19.686 17:36:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:19.686 17:36:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:19.686 17:36:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:19.686 17:36:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:19.686 17:36:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:19.686 17:36:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:19.686 17:36:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:19.686 "name": "raid_bdev1", 00:24:19.686 "uuid": "f84cbcd8-5ed3-40de-8917-9119d4885709", 00:24:19.686 "strip_size_kb": 0, 00:24:19.686 "state": "online", 00:24:19.686 "raid_level": "raid1", 00:24:19.686 "superblock": true, 00:24:19.686 "num_base_bdevs": 2, 00:24:19.686 "num_base_bdevs_discovered": 2, 00:24:19.686 "num_base_bdevs_operational": 2, 00:24:19.686 "process": { 00:24:19.686 "type": "rebuild", 00:24:19.686 "target": "spare", 00:24:19.686 "progress": { 00:24:19.686 "blocks": 2816, 00:24:19.686 "percent": 35 00:24:19.686 } 00:24:19.686 }, 00:24:19.686 "base_bdevs_list": [ 00:24:19.686 { 00:24:19.686 "name": "spare", 00:24:19.686 "uuid": "5b1ce5a4-c2fb-53bc-80c9-7e41e93463d0", 00:24:19.686 "is_configured": true, 00:24:19.686 "data_offset": 256, 00:24:19.686 "data_size": 7936 00:24:19.686 }, 00:24:19.686 { 00:24:19.686 "name": "BaseBdev2", 00:24:19.686 "uuid": "0f340a4f-09de-5cc4-9cb4-2cdc9fc8bcf5", 00:24:19.686 "is_configured": true, 00:24:19.686 "data_offset": 256, 00:24:19.686 "data_size": 
7936 00:24:19.686 } 00:24:19.686 ] 00:24:19.686 }' 00:24:19.686 17:36:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:19.686 17:36:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:19.686 17:36:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:19.686 17:36:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:19.686 17:36:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:19.946 [2024-07-15 17:36:31.039572] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:19.946 [2024-07-15 17:36:31.059580] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:19.946 [2024-07-15 17:36:31.059610] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:19.946 [2024-07-15 17:36:31.059620] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:19.946 [2024-07-15 17:36:31.059629] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:19.946 17:36:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:19.946 17:36:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:19.946 17:36:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:19.946 17:36:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:19.946 17:36:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:19.946 17:36:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:19.946 17:36:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:19.946 17:36:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:19.946 17:36:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:19.946 17:36:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:19.946 17:36:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:19.946 17:36:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:20.205 17:36:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:20.205 "name": "raid_bdev1", 00:24:20.205 "uuid": "f84cbcd8-5ed3-40de-8917-9119d4885709", 00:24:20.205 "strip_size_kb": 0, 00:24:20.205 "state": "online", 00:24:20.205 "raid_level": "raid1", 00:24:20.205 "superblock": true, 00:24:20.205 "num_base_bdevs": 2, 00:24:20.205 "num_base_bdevs_discovered": 1, 00:24:20.205 "num_base_bdevs_operational": 1, 00:24:20.205 "base_bdevs_list": [ 00:24:20.205 { 00:24:20.205 "name": null, 00:24:20.205 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:20.205 "is_configured": false, 00:24:20.205 "data_offset": 256, 00:24:20.205 "data_size": 7936 00:24:20.205 }, 00:24:20.205 { 00:24:20.205 
"name": "BaseBdev2", 00:24:20.205 "uuid": "0f340a4f-09de-5cc4-9cb4-2cdc9fc8bcf5", 00:24:20.205 "is_configured": true, 00:24:20.205 "data_offset": 256, 00:24:20.205 "data_size": 7936 00:24:20.205 } 00:24:20.205 ] 00:24:20.205 }' 00:24:20.205 17:36:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:20.205 17:36:31 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:20.775 17:36:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:20.775 [2024-07-15 17:36:31.969902] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:20.775 [2024-07-15 17:36:31.969936] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:20.775 [2024-07-15 17:36:31.969951] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bb16a0 00:24:20.775 [2024-07-15 17:36:31.969957] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:20.775 [2024-07-15 17:36:31.970247] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:20.775 [2024-07-15 17:36:31.970258] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:20.775 [2024-07-15 17:36:31.970315] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:20.775 [2024-07-15 17:36:31.970323] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:20.775 [2024-07-15 17:36:31.970328] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:24:20.775 [2024-07-15 17:36:31.970339] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:20.775 [2024-07-15 17:36:31.973550] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d5c150 00:24:20.775 [2024-07-15 17:36:31.974687] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:20.775 spare 00:24:20.775 17:36:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # sleep 1 00:24:21.714 17:36:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:21.714 17:36:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:21.714 17:36:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:21.714 17:36:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:21.714 17:36:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:21.714 17:36:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:21.714 17:36:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:21.975 17:36:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:21.975 "name": "raid_bdev1", 00:24:21.975 "uuid": "f84cbcd8-5ed3-40de-8917-9119d4885709", 00:24:21.975 "strip_size_kb": 0, 00:24:21.975 "state": "online", 00:24:21.975 "raid_level": "raid1", 00:24:21.975 "superblock": true, 00:24:21.975 "num_base_bdevs": 2, 00:24:21.975 "num_base_bdevs_discovered": 2, 00:24:21.975 "num_base_bdevs_operational": 2, 00:24:21.975 "process": { 00:24:21.975 "type": "rebuild", 00:24:21.975 "target": "spare", 00:24:21.975 "progress": { 00:24:21.975 "blocks": 2816, 00:24:21.975 "percent": 35 00:24:21.975 } 00:24:21.975 }, 00:24:21.975 "base_bdevs_list": [ 00:24:21.975 { 00:24:21.975 "name": "spare", 00:24:21.975 "uuid": "5b1ce5a4-c2fb-53bc-80c9-7e41e93463d0", 00:24:21.975 "is_configured": true, 00:24:21.975 "data_offset": 256, 00:24:21.975 "data_size": 7936 00:24:21.975 }, 00:24:21.975 { 00:24:21.975 "name": "BaseBdev2", 00:24:21.975 "uuid": "0f340a4f-09de-5cc4-9cb4-2cdc9fc8bcf5", 00:24:21.975 "is_configured": true, 00:24:21.975 "data_offset": 256, 00:24:21.975 "data_size": 7936 00:24:21.975 } 00:24:21.975 ] 00:24:21.975 }' 00:24:21.975 17:36:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:21.975 17:36:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:21.975 17:36:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:21.975 17:36:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:21.975 17:36:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:22.235 [2024-07-15 17:36:33.410998] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:22.235 [2024-07-15 17:36:33.483447] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:22.235 [2024-07-15 17:36:33.483479] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:22.235 [2024-07-15 17:36:33.483488] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:22.235 [2024-07-15 17:36:33.483493] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:22.235 17:36:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:22.235 17:36:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:22.235 17:36:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:22.235 17:36:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:22.235 17:36:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:22.235 17:36:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:22.235 17:36:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:22.235 17:36:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:22.235 17:36:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:22.235 17:36:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:22.235 17:36:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:22.235 17:36:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:22.495 17:36:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:22.495 "name": "raid_bdev1", 00:24:22.495 "uuid": "f84cbcd8-5ed3-40de-8917-9119d4885709", 00:24:22.495 "strip_size_kb": 0, 00:24:22.495 "state": "online", 00:24:22.495 "raid_level": "raid1", 00:24:22.495 "superblock": true, 00:24:22.495 "num_base_bdevs": 2, 00:24:22.495 "num_base_bdevs_discovered": 1, 00:24:22.495 "num_base_bdevs_operational": 1, 00:24:22.495 "base_bdevs_list": [ 00:24:22.495 { 00:24:22.495 "name": null, 00:24:22.495 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:22.495 "is_configured": false, 00:24:22.495 "data_offset": 256, 00:24:22.495 "data_size": 7936 00:24:22.495 }, 00:24:22.495 { 00:24:22.495 "name": "BaseBdev2", 00:24:22.495 "uuid": "0f340a4f-09de-5cc4-9cb4-2cdc9fc8bcf5", 00:24:22.495 "is_configured": true, 00:24:22.495 "data_offset": 256, 00:24:22.495 "data_size": 7936 00:24:22.495 } 00:24:22.495 ] 00:24:22.495 }' 00:24:22.495 17:36:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:22.495 17:36:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:23.066 17:36:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:23.066 17:36:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:23.066 17:36:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:23.066 17:36:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:23.066 17:36:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:23.066 17:36:34 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:23.066 17:36:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:23.326 17:36:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:23.326 "name": "raid_bdev1", 00:24:23.326 "uuid": "f84cbcd8-5ed3-40de-8917-9119d4885709", 00:24:23.326 "strip_size_kb": 0, 00:24:23.326 "state": "online", 00:24:23.326 "raid_level": "raid1", 00:24:23.326 "superblock": true, 00:24:23.326 "num_base_bdevs": 2, 00:24:23.326 "num_base_bdevs_discovered": 1, 00:24:23.326 "num_base_bdevs_operational": 1, 00:24:23.326 "base_bdevs_list": [ 00:24:23.326 { 00:24:23.326 "name": null, 00:24:23.326 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:23.326 "is_configured": false, 00:24:23.326 "data_offset": 256, 00:24:23.326 "data_size": 7936 00:24:23.326 }, 00:24:23.326 { 00:24:23.326 "name": "BaseBdev2", 00:24:23.326 "uuid": "0f340a4f-09de-5cc4-9cb4-2cdc9fc8bcf5", 00:24:23.326 "is_configured": true, 00:24:23.326 "data_offset": 256, 00:24:23.326 "data_size": 7936 00:24:23.326 } 00:24:23.326 ] 00:24:23.327 }' 00:24:23.327 17:36:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:23.327 17:36:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:23.327 17:36:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:23.327 17:36:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:23.327 17:36:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:23.592 17:36:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:23.852 [2024-07-15 17:36:34.910920] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:23.852 [2024-07-15 17:36:34.910949] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:23.852 [2024-07-15 17:36:34.910960] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bafcf0 00:24:23.852 [2024-07-15 17:36:34.910967] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:23.852 [2024-07-15 17:36:34.911234] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:23.852 [2024-07-15 17:36:34.911246] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:23.852 [2024-07-15 17:36:34.911289] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:24:23.852 [2024-07-15 17:36:34.911295] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:23.852 [2024-07-15 17:36:34.911305] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:23.852 BaseBdev1 00:24:23.852 17:36:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # sleep 1 00:24:24.794 17:36:35 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:24.794 17:36:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:24.794 17:36:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:24.794 17:36:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:24.794 17:36:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:24.794 17:36:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:24.794 17:36:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:24.794 17:36:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:24.794 17:36:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:24.794 17:36:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:24.794 17:36:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:24.794 17:36:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:25.055 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:25.055 "name": "raid_bdev1", 00:24:25.055 "uuid": "f84cbcd8-5ed3-40de-8917-9119d4885709", 00:24:25.055 "strip_size_kb": 0, 00:24:25.055 "state": "online", 00:24:25.055 "raid_level": "raid1", 00:24:25.055 "superblock": true, 00:24:25.055 "num_base_bdevs": 2, 00:24:25.055 "num_base_bdevs_discovered": 1, 00:24:25.055 "num_base_bdevs_operational": 1, 00:24:25.055 "base_bdevs_list": [ 00:24:25.055 { 00:24:25.055 "name": null, 00:24:25.055 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:25.055 "is_configured": false, 00:24:25.055 "data_offset": 256, 00:24:25.055 "data_size": 7936 00:24:25.055 }, 00:24:25.055 { 00:24:25.055 "name": "BaseBdev2", 00:24:25.055 "uuid": "0f340a4f-09de-5cc4-9cb4-2cdc9fc8bcf5", 00:24:25.055 "is_configured": true, 00:24:25.055 "data_offset": 256, 00:24:25.055 "data_size": 7936 00:24:25.055 } 00:24:25.055 ] 00:24:25.055 }' 00:24:25.055 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:25.055 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:25.626 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:25.626 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:25.626 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:25.626 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:25.626 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:25.626 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:25.626 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:25.626 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 
-- # raid_bdev_info='{ 00:24:25.626 "name": "raid_bdev1", 00:24:25.626 "uuid": "f84cbcd8-5ed3-40de-8917-9119d4885709", 00:24:25.626 "strip_size_kb": 0, 00:24:25.626 "state": "online", 00:24:25.626 "raid_level": "raid1", 00:24:25.626 "superblock": true, 00:24:25.626 "num_base_bdevs": 2, 00:24:25.626 "num_base_bdevs_discovered": 1, 00:24:25.626 "num_base_bdevs_operational": 1, 00:24:25.626 "base_bdevs_list": [ 00:24:25.626 { 00:24:25.626 "name": null, 00:24:25.626 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:25.626 "is_configured": false, 00:24:25.626 "data_offset": 256, 00:24:25.626 "data_size": 7936 00:24:25.626 }, 00:24:25.626 { 00:24:25.626 "name": "BaseBdev2", 00:24:25.626 "uuid": "0f340a4f-09de-5cc4-9cb4-2cdc9fc8bcf5", 00:24:25.626 "is_configured": true, 00:24:25.626 "data_offset": 256, 00:24:25.626 "data_size": 7936 00:24:25.626 } 00:24:25.626 ] 00:24:25.626 }' 00:24:25.626 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:25.626 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:25.626 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:25.886 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:25.886 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:25.886 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@648 -- # local es=0 00:24:25.886 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:25.887 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:25.887 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:25.887 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:25.887 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:25.887 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:25.887 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:25.887 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:25.887 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:25.887 17:36:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:25.887 [2024-07-15 17:36:37.112492] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:25.887 [2024-07-15 17:36:37.112580] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: 
raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:25.887 [2024-07-15 17:36:37.112588] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:25.887 request: 00:24:25.887 { 00:24:25.887 "base_bdev": "BaseBdev1", 00:24:25.887 "raid_bdev": "raid_bdev1", 00:24:25.887 "method": "bdev_raid_add_base_bdev", 00:24:25.887 "req_id": 1 00:24:25.887 } 00:24:25.887 Got JSON-RPC error response 00:24:25.887 response: 00:24:25.887 { 00:24:25.887 "code": -22, 00:24:25.887 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:24:25.887 } 00:24:25.887 17:36:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # es=1 00:24:25.887 17:36:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:24:25.887 17:36:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:24:25.887 17:36:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:24:25.887 17:36:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # sleep 1 00:24:27.270 17:36:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:27.270 17:36:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:27.270 17:36:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:27.270 17:36:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:27.270 17:36:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:27.270 17:36:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:27.270 17:36:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:27.270 17:36:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:27.270 17:36:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:27.270 17:36:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:27.270 17:36:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:27.270 17:36:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:27.270 17:36:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:27.270 "name": "raid_bdev1", 00:24:27.270 "uuid": "f84cbcd8-5ed3-40de-8917-9119d4885709", 00:24:27.270 "strip_size_kb": 0, 00:24:27.270 "state": "online", 00:24:27.270 "raid_level": "raid1", 00:24:27.270 "superblock": true, 00:24:27.270 "num_base_bdevs": 2, 00:24:27.270 "num_base_bdevs_discovered": 1, 00:24:27.270 "num_base_bdevs_operational": 1, 00:24:27.270 "base_bdevs_list": [ 00:24:27.270 { 00:24:27.270 "name": null, 00:24:27.270 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:27.270 "is_configured": false, 00:24:27.270 "data_offset": 256, 00:24:27.270 "data_size": 7936 00:24:27.270 }, 00:24:27.270 { 00:24:27.270 "name": "BaseBdev2", 00:24:27.270 "uuid": "0f340a4f-09de-5cc4-9cb4-2cdc9fc8bcf5", 00:24:27.270 "is_configured": true, 00:24:27.270 "data_offset": 256, 00:24:27.270 "data_size": 7936 
00:24:27.270 } 00:24:27.270 ] 00:24:27.270 }' 00:24:27.270 17:36:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:27.270 17:36:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:27.841 17:36:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:27.841 17:36:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:27.841 17:36:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:27.841 17:36:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:27.841 17:36:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:27.841 17:36:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:27.841 17:36:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:27.841 17:36:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:27.841 "name": "raid_bdev1", 00:24:27.841 "uuid": "f84cbcd8-5ed3-40de-8917-9119d4885709", 00:24:27.841 "strip_size_kb": 0, 00:24:27.841 "state": "online", 00:24:27.841 "raid_level": "raid1", 00:24:27.841 "superblock": true, 00:24:27.841 "num_base_bdevs": 2, 00:24:27.841 "num_base_bdevs_discovered": 1, 00:24:27.841 "num_base_bdevs_operational": 1, 00:24:27.841 "base_bdevs_list": [ 00:24:27.841 { 00:24:27.841 "name": null, 00:24:27.841 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:27.841 "is_configured": false, 00:24:27.841 "data_offset": 256, 00:24:27.841 "data_size": 7936 00:24:27.841 }, 00:24:27.841 { 00:24:27.841 "name": "BaseBdev2", 00:24:27.841 "uuid": "0f340a4f-09de-5cc4-9cb4-2cdc9fc8bcf5", 00:24:27.841 "is_configured": true, 00:24:27.841 "data_offset": 256, 00:24:27.841 "data_size": 7936 00:24:27.841 } 00:24:27.841 ] 00:24:27.841 }' 00:24:27.841 17:36:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:27.841 17:36:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:27.841 17:36:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:27.841 17:36:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:27.841 17:36:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # killprocess 2897167 00:24:27.841 17:36:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 2897167 ']' 00:24:27.841 17:36:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 2897167 00:24:27.841 17:36:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:24:27.841 17:36:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:27.841 17:36:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2897167 00:24:28.101 17:36:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:28.101 17:36:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:28.101 17:36:39 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 2897167' 00:24:28.101 killing process with pid 2897167 00:24:28.101 17:36:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@967 -- # kill 2897167 00:24:28.101 Received shutdown signal, test time was about 60.000000 seconds 00:24:28.101 00:24:28.101 Latency(us) 00:24:28.101 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:28.101 =================================================================================================================== 00:24:28.101 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:28.102 [2024-07-15 17:36:39.171436] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:28.102 [2024-07-15 17:36:39.171499] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:28.102 [2024-07-15 17:36:39.171528] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:28.102 [2024-07-15 17:36:39.171536] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d51330 name raid_bdev1, state offline 00:24:28.102 17:36:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@972 -- # wait 2897167 00:24:28.102 [2024-07-15 17:36:39.186578] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:28.102 17:36:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # return 0 00:24:28.102 00:24:28.102 real 0m27.311s 00:24:28.102 user 0m42.869s 00:24:28.102 sys 0m3.368s 00:24:28.102 17:36:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:28.102 17:36:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:28.102 ************************************ 00:24:28.102 END TEST raid_rebuild_test_sb_4k 00:24:28.102 ************************************ 00:24:28.102 17:36:39 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:28.102 17:36:39 bdev_raid -- bdev/bdev_raid.sh@904 -- # base_malloc_params='-m 32' 00:24:28.102 17:36:39 bdev_raid -- bdev/bdev_raid.sh@905 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:24:28.102 17:36:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:24:28.102 17:36:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:28.102 17:36:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:28.102 ************************************ 00:24:28.102 START TEST raid_state_function_test_sb_md_separate 00:24:28.102 ************************************ 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:28.102 17:36:39 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=2902199 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2902199' 00:24:28.102 Process raid pid: 2902199 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 2902199 /var/tmp/spdk-raid.sock 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 2902199 ']' 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:28.102 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
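[Editor's annotation - not part of the captured trace] The entries above show the md-separate state-function test launching the bdev_svc app on its own RPC socket (/var/tmp/spdk-raid.sock) and waiting for it to come up before any bdev_raid_* RPCs are issued. The sketch below reconstructs that start-up handshake from the command line visible in the trace; the polling loop is only an assumption about what the waitforlisten helper does (its implementation is not shown here), and rpc_get_methods is used merely as a cheap "is the RPC server answering" probe.

  # Minimal sketch, assuming waitforlisten amounts to polling the RPC socket.
  app=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc
  rpc_sock=/var/tmp/spdk-raid.sock

  # Start the bdev service with bdev_raid debug logging, as in the trace above.
  $app -r "$rpc_sock" -i 0 -L bdev_raid &
  raid_pid=$!

  # Block until the app answers on its UNIX-domain RPC socket.
  while ! /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s "$rpc_sock" rpc_get_methods &> /dev/null; do
      kill -0 "$raid_pid" || exit 1   # give up if bdev_svc died during start-up
      sleep 0.1
  done
  # From here on the test drives the target entirely through rpc.py -s "$rpc_sock" ...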
00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:28.102 17:36:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:28.363 [2024-07-15 17:36:39.451576] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:24:28.363 [2024-07-15 17:36:39.451646] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:24:28.363 [2024-07-15 17:36:39.543934] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:28.363 [2024-07-15 17:36:39.612348] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:28.363 [2024-07-15 17:36:39.659821] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:28.363 [2024-07-15 17:36:39.659843] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:29.000 17:36:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:29.000 17:36:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:24:29.000 17:36:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:24:29.260 [2024-07-15 17:36:40.459188] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:24:29.260 [2024-07-15 17:36:40.459219] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:24:29.260 [2024-07-15 17:36:40.459228] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:29.260 [2024-07-15 17:36:40.459235] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:29.260 17:36:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:24:29.260 17:36:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:29.260 17:36:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:29.260 17:36:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:29.260 17:36:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:29.260 17:36:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:29.260 17:36:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:29.260 17:36:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:29.260 17:36:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:29.260 17:36:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:29.260 17:36:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:29.260 17:36:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:29.520 17:36:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:29.520 "name": "Existed_Raid", 00:24:29.520 "uuid": "edf877df-3b82-4f6c-becf-ce617385bee1", 00:24:29.521 "strip_size_kb": 0, 00:24:29.521 "state": "configuring", 00:24:29.521 "raid_level": "raid1", 00:24:29.521 "superblock": true, 00:24:29.521 "num_base_bdevs": 2, 00:24:29.521 "num_base_bdevs_discovered": 0, 00:24:29.521 "num_base_bdevs_operational": 2, 00:24:29.521 "base_bdevs_list": [ 00:24:29.521 { 00:24:29.521 "name": "BaseBdev1", 00:24:29.521 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:29.521 "is_configured": false, 00:24:29.521 "data_offset": 0, 00:24:29.521 "data_size": 0 00:24:29.521 }, 00:24:29.521 { 00:24:29.521 "name": "BaseBdev2", 00:24:29.521 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:29.521 "is_configured": false, 00:24:29.521 "data_offset": 0, 00:24:29.521 "data_size": 0 00:24:29.521 } 00:24:29.521 ] 00:24:29.521 }' 00:24:29.521 17:36:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:29.521 17:36:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:30.092 17:36:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:30.092 [2024-07-15 17:36:41.373372] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:30.092 [2024-07-15 17:36:41.373389] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xaf26b0 name Existed_Raid, state configuring 00:24:30.092 17:36:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:24:30.353 [2024-07-15 17:36:41.549838] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:24:30.353 [2024-07-15 17:36:41.549854] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:24:30.353 [2024-07-15 17:36:41.549859] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:30.353 [2024-07-15 17:36:41.549864] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:30.353 17:36:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:24:30.613 [2024-07-15 17:36:41.733303] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:30.613 BaseBdev1 00:24:30.613 17:36:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:24:30.613 17:36:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:24:30.613 17:36:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:30.613 
17:36:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:24:30.613 17:36:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:30.613 17:36:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:30.613 17:36:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:30.874 17:36:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:24:30.874 [ 00:24:30.874 { 00:24:30.874 "name": "BaseBdev1", 00:24:30.874 "aliases": [ 00:24:30.874 "adb378b5-601c-4a04-be07-f0837ad72caf" 00:24:30.874 ], 00:24:30.874 "product_name": "Malloc disk", 00:24:30.874 "block_size": 4096, 00:24:30.874 "num_blocks": 8192, 00:24:30.874 "uuid": "adb378b5-601c-4a04-be07-f0837ad72caf", 00:24:30.874 "md_size": 32, 00:24:30.874 "md_interleave": false, 00:24:30.874 "dif_type": 0, 00:24:30.874 "assigned_rate_limits": { 00:24:30.874 "rw_ios_per_sec": 0, 00:24:30.874 "rw_mbytes_per_sec": 0, 00:24:30.874 "r_mbytes_per_sec": 0, 00:24:30.874 "w_mbytes_per_sec": 0 00:24:30.874 }, 00:24:30.874 "claimed": true, 00:24:30.874 "claim_type": "exclusive_write", 00:24:30.874 "zoned": false, 00:24:30.874 "supported_io_types": { 00:24:30.874 "read": true, 00:24:30.874 "write": true, 00:24:30.874 "unmap": true, 00:24:30.874 "flush": true, 00:24:30.874 "reset": true, 00:24:30.874 "nvme_admin": false, 00:24:30.874 "nvme_io": false, 00:24:30.874 "nvme_io_md": false, 00:24:30.874 "write_zeroes": true, 00:24:30.874 "zcopy": true, 00:24:30.874 "get_zone_info": false, 00:24:30.874 "zone_management": false, 00:24:30.874 "zone_append": false, 00:24:30.874 "compare": false, 00:24:30.874 "compare_and_write": false, 00:24:30.874 "abort": true, 00:24:30.874 "seek_hole": false, 00:24:30.874 "seek_data": false, 00:24:30.874 "copy": true, 00:24:30.874 "nvme_iov_md": false 00:24:30.874 }, 00:24:30.874 "memory_domains": [ 00:24:30.874 { 00:24:30.874 "dma_device_id": "system", 00:24:30.874 "dma_device_type": 1 00:24:30.874 }, 00:24:30.874 { 00:24:30.874 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:30.874 "dma_device_type": 2 00:24:30.874 } 00:24:30.874 ], 00:24:30.874 "driver_specific": {} 00:24:30.874 } 00:24:30.874 ] 00:24:30.874 17:36:42 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:24:30.874 17:36:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:24:30.874 17:36:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:30.874 17:36:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:30.874 17:36:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:30.874 17:36:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:30.874 17:36:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:30.874 17:36:42 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:30.874 17:36:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:30.874 17:36:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:30.874 17:36:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:30.874 17:36:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:30.874 17:36:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:31.135 17:36:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:31.135 "name": "Existed_Raid", 00:24:31.135 "uuid": "e024a2d5-fbe0-44b6-b6d0-781b1a429aeb", 00:24:31.135 "strip_size_kb": 0, 00:24:31.135 "state": "configuring", 00:24:31.135 "raid_level": "raid1", 00:24:31.135 "superblock": true, 00:24:31.135 "num_base_bdevs": 2, 00:24:31.135 "num_base_bdevs_discovered": 1, 00:24:31.135 "num_base_bdevs_operational": 2, 00:24:31.135 "base_bdevs_list": [ 00:24:31.135 { 00:24:31.135 "name": "BaseBdev1", 00:24:31.135 "uuid": "adb378b5-601c-4a04-be07-f0837ad72caf", 00:24:31.135 "is_configured": true, 00:24:31.135 "data_offset": 256, 00:24:31.135 "data_size": 7936 00:24:31.135 }, 00:24:31.135 { 00:24:31.135 "name": "BaseBdev2", 00:24:31.135 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:31.135 "is_configured": false, 00:24:31.135 "data_offset": 0, 00:24:31.135 "data_size": 0 00:24:31.135 } 00:24:31.135 ] 00:24:31.135 }' 00:24:31.135 17:36:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:31.135 17:36:42 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:31.705 17:36:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:31.966 [2024-07-15 17:36:43.020582] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:31.966 [2024-07-15 17:36:43.020607] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xaf1fa0 name Existed_Raid, state configuring 00:24:31.966 17:36:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:24:31.966 [2024-07-15 17:36:43.205072] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:31.966 [2024-07-15 17:36:43.206159] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:31.966 [2024-07-15 17:36:43.206181] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:31.966 17:36:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:24:31.966 17:36:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:24:31.966 17:36:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:24:31.966 17:36:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:31.966 17:36:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:31.966 17:36:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:31.966 17:36:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:31.966 17:36:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:31.966 17:36:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:31.966 17:36:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:31.966 17:36:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:31.966 17:36:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:31.966 17:36:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:31.966 17:36:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:32.226 17:36:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:32.226 "name": "Existed_Raid", 00:24:32.226 "uuid": "1988cb47-5d9b-4153-bcc5-29315a88ef40", 00:24:32.226 "strip_size_kb": 0, 00:24:32.226 "state": "configuring", 00:24:32.226 "raid_level": "raid1", 00:24:32.226 "superblock": true, 00:24:32.227 "num_base_bdevs": 2, 00:24:32.227 "num_base_bdevs_discovered": 1, 00:24:32.227 "num_base_bdevs_operational": 2, 00:24:32.227 "base_bdevs_list": [ 00:24:32.227 { 00:24:32.227 "name": "BaseBdev1", 00:24:32.227 "uuid": "adb378b5-601c-4a04-be07-f0837ad72caf", 00:24:32.227 "is_configured": true, 00:24:32.227 "data_offset": 256, 00:24:32.227 "data_size": 7936 00:24:32.227 }, 00:24:32.227 { 00:24:32.227 "name": "BaseBdev2", 00:24:32.227 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:32.227 "is_configured": false, 00:24:32.227 "data_offset": 0, 00:24:32.227 "data_size": 0 00:24:32.227 } 00:24:32.227 ] 00:24:32.227 }' 00:24:32.227 17:36:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:32.227 17:36:43 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:32.796 17:36:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:24:33.056 [2024-07-15 17:36:44.156951] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:33.056 [2024-07-15 17:36:44.157057] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xaf3ed0 00:24:33.056 [2024-07-15 17:36:44.157065] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:33.056 [2024-07-15 17:36:44.157108] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xaf2540 00:24:33.056 [2024-07-15 
17:36:44.157183] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xaf3ed0 00:24:33.056 [2024-07-15 17:36:44.157189] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xaf3ed0 00:24:33.056 [2024-07-15 17:36:44.157235] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:33.056 BaseBdev2 00:24:33.056 17:36:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:24:33.056 17:36:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:24:33.056 17:36:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:33.056 17:36:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:24:33.056 17:36:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:33.056 17:36:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:33.056 17:36:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:33.316 17:36:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:24:33.316 [ 00:24:33.316 { 00:24:33.316 "name": "BaseBdev2", 00:24:33.316 "aliases": [ 00:24:33.316 "fc5195bd-2cb0-436a-93ae-dff72463ac78" 00:24:33.316 ], 00:24:33.316 "product_name": "Malloc disk", 00:24:33.316 "block_size": 4096, 00:24:33.316 "num_blocks": 8192, 00:24:33.316 "uuid": "fc5195bd-2cb0-436a-93ae-dff72463ac78", 00:24:33.316 "md_size": 32, 00:24:33.316 "md_interleave": false, 00:24:33.316 "dif_type": 0, 00:24:33.316 "assigned_rate_limits": { 00:24:33.316 "rw_ios_per_sec": 0, 00:24:33.316 "rw_mbytes_per_sec": 0, 00:24:33.316 "r_mbytes_per_sec": 0, 00:24:33.316 "w_mbytes_per_sec": 0 00:24:33.316 }, 00:24:33.316 "claimed": true, 00:24:33.316 "claim_type": "exclusive_write", 00:24:33.316 "zoned": false, 00:24:33.316 "supported_io_types": { 00:24:33.316 "read": true, 00:24:33.316 "write": true, 00:24:33.317 "unmap": true, 00:24:33.317 "flush": true, 00:24:33.317 "reset": true, 00:24:33.317 "nvme_admin": false, 00:24:33.317 "nvme_io": false, 00:24:33.317 "nvme_io_md": false, 00:24:33.317 "write_zeroes": true, 00:24:33.317 "zcopy": true, 00:24:33.317 "get_zone_info": false, 00:24:33.317 "zone_management": false, 00:24:33.317 "zone_append": false, 00:24:33.317 "compare": false, 00:24:33.317 "compare_and_write": false, 00:24:33.317 "abort": true, 00:24:33.317 "seek_hole": false, 00:24:33.317 "seek_data": false, 00:24:33.317 "copy": true, 00:24:33.317 "nvme_iov_md": false 00:24:33.317 }, 00:24:33.317 "memory_domains": [ 00:24:33.317 { 00:24:33.317 "dma_device_id": "system", 00:24:33.317 "dma_device_type": 1 00:24:33.317 }, 00:24:33.317 { 00:24:33.317 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:33.317 "dma_device_type": 2 00:24:33.317 } 00:24:33.317 ], 00:24:33.317 "driver_specific": {} 00:24:33.317 } 00:24:33.317 ] 00:24:33.317 17:36:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:24:33.317 17:36:44 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:24:33.317 17:36:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:24:33.317 17:36:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:24:33.317 17:36:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:33.317 17:36:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:33.317 17:36:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:33.317 17:36:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:33.317 17:36:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:33.317 17:36:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:33.317 17:36:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:33.317 17:36:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:33.317 17:36:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:33.317 17:36:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.317 17:36:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:33.578 17:36:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:33.578 "name": "Existed_Raid", 00:24:33.578 "uuid": "1988cb47-5d9b-4153-bcc5-29315a88ef40", 00:24:33.578 "strip_size_kb": 0, 00:24:33.578 "state": "online", 00:24:33.578 "raid_level": "raid1", 00:24:33.578 "superblock": true, 00:24:33.578 "num_base_bdevs": 2, 00:24:33.578 "num_base_bdevs_discovered": 2, 00:24:33.578 "num_base_bdevs_operational": 2, 00:24:33.578 "base_bdevs_list": [ 00:24:33.578 { 00:24:33.578 "name": "BaseBdev1", 00:24:33.578 "uuid": "adb378b5-601c-4a04-be07-f0837ad72caf", 00:24:33.578 "is_configured": true, 00:24:33.578 "data_offset": 256, 00:24:33.578 "data_size": 7936 00:24:33.578 }, 00:24:33.578 { 00:24:33.578 "name": "BaseBdev2", 00:24:33.578 "uuid": "fc5195bd-2cb0-436a-93ae-dff72463ac78", 00:24:33.578 "is_configured": true, 00:24:33.578 "data_offset": 256, 00:24:33.578 "data_size": 7936 00:24:33.578 } 00:24:33.578 ] 00:24:33.578 }' 00:24:33.578 17:36:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:33.578 17:36:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:34.148 17:36:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:24:34.148 17:36:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:24:34.148 17:36:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:24:34.148 17:36:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # 
local base_bdev_info 00:24:34.148 17:36:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:24:34.148 17:36:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:24:34.148 17:36:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:24:34.148 17:36:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:24:34.409 [2024-07-15 17:36:45.456467] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:34.409 17:36:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:24:34.409 "name": "Existed_Raid", 00:24:34.409 "aliases": [ 00:24:34.409 "1988cb47-5d9b-4153-bcc5-29315a88ef40" 00:24:34.409 ], 00:24:34.409 "product_name": "Raid Volume", 00:24:34.409 "block_size": 4096, 00:24:34.409 "num_blocks": 7936, 00:24:34.409 "uuid": "1988cb47-5d9b-4153-bcc5-29315a88ef40", 00:24:34.409 "md_size": 32, 00:24:34.409 "md_interleave": false, 00:24:34.409 "dif_type": 0, 00:24:34.409 "assigned_rate_limits": { 00:24:34.409 "rw_ios_per_sec": 0, 00:24:34.409 "rw_mbytes_per_sec": 0, 00:24:34.409 "r_mbytes_per_sec": 0, 00:24:34.409 "w_mbytes_per_sec": 0 00:24:34.409 }, 00:24:34.409 "claimed": false, 00:24:34.409 "zoned": false, 00:24:34.409 "supported_io_types": { 00:24:34.409 "read": true, 00:24:34.409 "write": true, 00:24:34.409 "unmap": false, 00:24:34.409 "flush": false, 00:24:34.409 "reset": true, 00:24:34.409 "nvme_admin": false, 00:24:34.409 "nvme_io": false, 00:24:34.409 "nvme_io_md": false, 00:24:34.409 "write_zeroes": true, 00:24:34.409 "zcopy": false, 00:24:34.409 "get_zone_info": false, 00:24:34.409 "zone_management": false, 00:24:34.409 "zone_append": false, 00:24:34.409 "compare": false, 00:24:34.409 "compare_and_write": false, 00:24:34.409 "abort": false, 00:24:34.409 "seek_hole": false, 00:24:34.409 "seek_data": false, 00:24:34.409 "copy": false, 00:24:34.409 "nvme_iov_md": false 00:24:34.409 }, 00:24:34.409 "memory_domains": [ 00:24:34.409 { 00:24:34.409 "dma_device_id": "system", 00:24:34.409 "dma_device_type": 1 00:24:34.409 }, 00:24:34.409 { 00:24:34.409 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:34.409 "dma_device_type": 2 00:24:34.409 }, 00:24:34.409 { 00:24:34.409 "dma_device_id": "system", 00:24:34.409 "dma_device_type": 1 00:24:34.409 }, 00:24:34.409 { 00:24:34.409 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:34.409 "dma_device_type": 2 00:24:34.409 } 00:24:34.409 ], 00:24:34.409 "driver_specific": { 00:24:34.409 "raid": { 00:24:34.409 "uuid": "1988cb47-5d9b-4153-bcc5-29315a88ef40", 00:24:34.409 "strip_size_kb": 0, 00:24:34.409 "state": "online", 00:24:34.409 "raid_level": "raid1", 00:24:34.409 "superblock": true, 00:24:34.409 "num_base_bdevs": 2, 00:24:34.409 "num_base_bdevs_discovered": 2, 00:24:34.409 "num_base_bdevs_operational": 2, 00:24:34.409 "base_bdevs_list": [ 00:24:34.409 { 00:24:34.409 "name": "BaseBdev1", 00:24:34.409 "uuid": "adb378b5-601c-4a04-be07-f0837ad72caf", 00:24:34.409 "is_configured": true, 00:24:34.409 "data_offset": 256, 00:24:34.409 "data_size": 7936 00:24:34.409 }, 00:24:34.409 { 00:24:34.409 "name": "BaseBdev2", 00:24:34.409 "uuid": "fc5195bd-2cb0-436a-93ae-dff72463ac78", 00:24:34.409 "is_configured": true, 00:24:34.409 "data_offset": 256, 00:24:34.409 "data_size": 7936 00:24:34.409 } 00:24:34.409 
] 00:24:34.409 } 00:24:34.409 } 00:24:34.409 }' 00:24:34.409 17:36:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:34.409 17:36:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:24:34.409 BaseBdev2' 00:24:34.409 17:36:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:34.409 17:36:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:24:34.409 17:36:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:34.409 17:36:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:34.409 "name": "BaseBdev1", 00:24:34.409 "aliases": [ 00:24:34.409 "adb378b5-601c-4a04-be07-f0837ad72caf" 00:24:34.409 ], 00:24:34.409 "product_name": "Malloc disk", 00:24:34.409 "block_size": 4096, 00:24:34.409 "num_blocks": 8192, 00:24:34.409 "uuid": "adb378b5-601c-4a04-be07-f0837ad72caf", 00:24:34.409 "md_size": 32, 00:24:34.409 "md_interleave": false, 00:24:34.409 "dif_type": 0, 00:24:34.409 "assigned_rate_limits": { 00:24:34.409 "rw_ios_per_sec": 0, 00:24:34.409 "rw_mbytes_per_sec": 0, 00:24:34.409 "r_mbytes_per_sec": 0, 00:24:34.409 "w_mbytes_per_sec": 0 00:24:34.409 }, 00:24:34.409 "claimed": true, 00:24:34.409 "claim_type": "exclusive_write", 00:24:34.409 "zoned": false, 00:24:34.409 "supported_io_types": { 00:24:34.409 "read": true, 00:24:34.409 "write": true, 00:24:34.409 "unmap": true, 00:24:34.409 "flush": true, 00:24:34.409 "reset": true, 00:24:34.409 "nvme_admin": false, 00:24:34.409 "nvme_io": false, 00:24:34.409 "nvme_io_md": false, 00:24:34.409 "write_zeroes": true, 00:24:34.409 "zcopy": true, 00:24:34.409 "get_zone_info": false, 00:24:34.409 "zone_management": false, 00:24:34.409 "zone_append": false, 00:24:34.409 "compare": false, 00:24:34.409 "compare_and_write": false, 00:24:34.409 "abort": true, 00:24:34.409 "seek_hole": false, 00:24:34.409 "seek_data": false, 00:24:34.409 "copy": true, 00:24:34.409 "nvme_iov_md": false 00:24:34.409 }, 00:24:34.409 "memory_domains": [ 00:24:34.409 { 00:24:34.409 "dma_device_id": "system", 00:24:34.409 "dma_device_type": 1 00:24:34.409 }, 00:24:34.409 { 00:24:34.409 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:34.409 "dma_device_type": 2 00:24:34.409 } 00:24:34.409 ], 00:24:34.409 "driver_specific": {} 00:24:34.409 }' 00:24:34.409 17:36:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:34.670 17:36:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:34.670 17:36:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:24:34.670 17:36:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:34.670 17:36:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:34.670 17:36:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:24:34.670 17:36:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:34.670 17:36:45 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:34.930 17:36:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:24:34.930 17:36:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:34.930 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:34.930 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:24:34.930 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:34.930 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:24:34.930 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:35.190 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:35.190 "name": "BaseBdev2", 00:24:35.190 "aliases": [ 00:24:35.190 "fc5195bd-2cb0-436a-93ae-dff72463ac78" 00:24:35.190 ], 00:24:35.190 "product_name": "Malloc disk", 00:24:35.190 "block_size": 4096, 00:24:35.190 "num_blocks": 8192, 00:24:35.190 "uuid": "fc5195bd-2cb0-436a-93ae-dff72463ac78", 00:24:35.190 "md_size": 32, 00:24:35.190 "md_interleave": false, 00:24:35.190 "dif_type": 0, 00:24:35.190 "assigned_rate_limits": { 00:24:35.190 "rw_ios_per_sec": 0, 00:24:35.190 "rw_mbytes_per_sec": 0, 00:24:35.190 "r_mbytes_per_sec": 0, 00:24:35.190 "w_mbytes_per_sec": 0 00:24:35.190 }, 00:24:35.190 "claimed": true, 00:24:35.190 "claim_type": "exclusive_write", 00:24:35.190 "zoned": false, 00:24:35.190 "supported_io_types": { 00:24:35.190 "read": true, 00:24:35.190 "write": true, 00:24:35.190 "unmap": true, 00:24:35.190 "flush": true, 00:24:35.190 "reset": true, 00:24:35.190 "nvme_admin": false, 00:24:35.190 "nvme_io": false, 00:24:35.190 "nvme_io_md": false, 00:24:35.190 "write_zeroes": true, 00:24:35.190 "zcopy": true, 00:24:35.190 "get_zone_info": false, 00:24:35.190 "zone_management": false, 00:24:35.190 "zone_append": false, 00:24:35.190 "compare": false, 00:24:35.190 "compare_and_write": false, 00:24:35.190 "abort": true, 00:24:35.190 "seek_hole": false, 00:24:35.190 "seek_data": false, 00:24:35.190 "copy": true, 00:24:35.190 "nvme_iov_md": false 00:24:35.190 }, 00:24:35.190 "memory_domains": [ 00:24:35.190 { 00:24:35.190 "dma_device_id": "system", 00:24:35.190 "dma_device_type": 1 00:24:35.190 }, 00:24:35.190 { 00:24:35.190 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:35.190 "dma_device_type": 2 00:24:35.190 } 00:24:35.190 ], 00:24:35.190 "driver_specific": {} 00:24:35.190 }' 00:24:35.190 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:35.190 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:35.190 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:24:35.190 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:35.190 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:35.190 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 
]] 00:24:35.190 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:35.190 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:35.451 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:24:35.451 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:35.451 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:35.451 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:24:35.451 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:24:35.711 [2024-07-15 17:36:46.783641] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:35.711 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:24:35.711 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:24:35.711 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:24:35.711 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:24:35.711 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:24:35.711 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:24:35.711 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:35.711 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:35.711 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:35.711 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:35.711 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:35.711 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:35.711 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:35.711 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:35.711 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:35.711 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:35.711 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:35.711 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:35.711 "name": "Existed_Raid", 00:24:35.711 "uuid": "1988cb47-5d9b-4153-bcc5-29315a88ef40", 00:24:35.711 "strip_size_kb": 0, 
00:24:35.711 "state": "online", 00:24:35.711 "raid_level": "raid1", 00:24:35.711 "superblock": true, 00:24:35.711 "num_base_bdevs": 2, 00:24:35.711 "num_base_bdevs_discovered": 1, 00:24:35.711 "num_base_bdevs_operational": 1, 00:24:35.711 "base_bdevs_list": [ 00:24:35.711 { 00:24:35.711 "name": null, 00:24:35.711 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:35.711 "is_configured": false, 00:24:35.711 "data_offset": 256, 00:24:35.711 "data_size": 7936 00:24:35.711 }, 00:24:35.711 { 00:24:35.711 "name": "BaseBdev2", 00:24:35.711 "uuid": "fc5195bd-2cb0-436a-93ae-dff72463ac78", 00:24:35.711 "is_configured": true, 00:24:35.711 "data_offset": 256, 00:24:35.711 "data_size": 7936 00:24:35.711 } 00:24:35.711 ] 00:24:35.711 }' 00:24:35.711 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:35.711 17:36:46 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:36.282 17:36:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:24:36.282 17:36:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:24:36.282 17:36:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:36.282 17:36:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:24:36.541 17:36:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:24:36.541 17:36:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:24:36.541 17:36:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:24:36.801 [2024-07-15 17:36:47.908377] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:24:36.801 [2024-07-15 17:36:47.908436] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:36.801 [2024-07-15 17:36:47.914842] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:36.801 [2024-07-15 17:36:47.914866] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:36.801 [2024-07-15 17:36:47.914872] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xaf3ed0 name Existed_Raid, state offline 00:24:36.801 17:36:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:24:36.801 17:36:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:24:36.801 17:36:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:36.801 17:36:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:24:37.061 17:36:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:24:37.061 17:36:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:24:37.061 17:36:48 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:24:37.061 17:36:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 2902199 00:24:37.061 17:36:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 2902199 ']' 00:24:37.061 17:36:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 2902199 00:24:37.062 17:36:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:24:37.062 17:36:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:37.062 17:36:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2902199 00:24:37.062 17:36:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:37.062 17:36:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:37.062 17:36:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2902199' 00:24:37.062 killing process with pid 2902199 00:24:37.062 17:36:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 2902199 00:24:37.062 [2024-07-15 17:36:48.179994] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:37.062 17:36:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 2902199 00:24:37.062 [2024-07-15 17:36:48.180554] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:37.062 17:36:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:24:37.062 00:24:37.062 real 0m8.918s 00:24:37.062 user 0m16.225s 00:24:37.062 sys 0m1.343s 00:24:37.062 17:36:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:37.062 17:36:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:37.062 ************************************ 00:24:37.062 END TEST raid_state_function_test_sb_md_separate 00:24:37.062 ************************************ 00:24:37.062 17:36:48 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:37.062 17:36:48 bdev_raid -- bdev/bdev_raid.sh@906 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:24:37.062 17:36:48 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:24:37.062 17:36:48 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:37.062 17:36:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:37.322 ************************************ 00:24:37.322 START TEST raid_superblock_test_md_separate 00:24:37.322 ************************************ 00:24:37.322 17:36:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:24:37.322 17:36:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:24:37.322 17:36:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:24:37.322 17:36:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:24:37.322 17:36:48 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:24:37.322 17:36:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:24:37.322 17:36:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:24:37.322 17:36:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:24:37.322 17:36:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:24:37.322 17:36:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:24:37.322 17:36:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local strip_size 00:24:37.322 17:36:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:24:37.322 17:36:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:24:37.322 17:36:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:24:37.322 17:36:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:24:37.322 17:36:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:24:37.322 17:36:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # raid_pid=2903949 00:24:37.322 17:36:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # waitforlisten 2903949 /var/tmp/spdk-raid.sock 00:24:37.322 17:36:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@829 -- # '[' -z 2903949 ']' 00:24:37.322 17:36:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:24:37.322 17:36:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:37.322 17:36:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:37.322 17:36:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:37.322 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:37.322 17:36:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:37.322 17:36:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:37.322 [2024-07-15 17:36:48.445867] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:24:37.322 [2024-07-15 17:36:48.445933] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2903949 ] 00:24:37.322 [2024-07-15 17:36:48.539422] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:37.322 [2024-07-15 17:36:48.613489] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:37.581 [2024-07-15 17:36:48.652817] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:37.581 [2024-07-15 17:36:48.652843] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:38.150 17:36:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:38.150 17:36:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@862 -- # return 0 00:24:38.150 17:36:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:24:38.150 17:36:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:24:38.150 17:36:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:24:38.150 17:36:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:24:38.150 17:36:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:24:38.150 17:36:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:24:38.150 17:36:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:24:38.150 17:36:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:24:38.150 17:36:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:24:38.409 malloc1 00:24:38.409 17:36:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:24:38.409 [2024-07-15 17:36:49.644443] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:24:38.409 [2024-07-15 17:36:49.644481] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:38.409 [2024-07-15 17:36:49.644493] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x91e3c0 00:24:38.409 [2024-07-15 17:36:49.644500] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:38.409 [2024-07-15 17:36:49.645674] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:38.409 [2024-07-15 17:36:49.645693] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:24:38.409 pt1 00:24:38.409 17:36:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:24:38.409 17:36:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:24:38.409 17:36:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local 
bdev_malloc=malloc2 00:24:38.409 17:36:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:24:38.409 17:36:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:24:38.409 17:36:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:24:38.409 17:36:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:24:38.409 17:36:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:24:38.409 17:36:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:24:38.668 malloc2 00:24:38.668 17:36:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:38.929 [2024-07-15 17:36:50.027581] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:38.929 [2024-07-15 17:36:50.027614] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:38.929 [2024-07-15 17:36:50.027624] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xaaaaf0 00:24:38.929 [2024-07-15 17:36:50.027630] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:38.929 [2024-07-15 17:36:50.028786] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:38.929 [2024-07-15 17:36:50.028804] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:24:38.929 pt2 00:24:38.929 17:36:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:24:38.929 17:36:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:24:38.929 17:36:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:24:38.929 [2024-07-15 17:36:50.224086] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:24:38.929 [2024-07-15 17:36:50.225069] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:38.929 [2024-07-15 17:36:50.225180] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa9eee0 00:24:38.929 [2024-07-15 17:36:50.225188] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:38.929 [2024-07-15 17:36:50.225233] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa9ede0 00:24:38.929 [2024-07-15 17:36:50.225318] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa9eee0 00:24:38.929 [2024-07-15 17:36:50.225324] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa9eee0 00:24:38.929 [2024-07-15 17:36:50.225372] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:39.189 17:36:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:39.189 17:36:50 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:39.189 17:36:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:39.189 17:36:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:39.189 17:36:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:39.189 17:36:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:39.189 17:36:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:39.189 17:36:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:39.189 17:36:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:39.189 17:36:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:39.189 17:36:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:39.189 17:36:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:39.189 17:36:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:39.189 "name": "raid_bdev1", 00:24:39.189 "uuid": "9ec03c56-25db-4ebc-8a04-871ba5ca31b0", 00:24:39.189 "strip_size_kb": 0, 00:24:39.189 "state": "online", 00:24:39.189 "raid_level": "raid1", 00:24:39.189 "superblock": true, 00:24:39.189 "num_base_bdevs": 2, 00:24:39.189 "num_base_bdevs_discovered": 2, 00:24:39.189 "num_base_bdevs_operational": 2, 00:24:39.189 "base_bdevs_list": [ 00:24:39.189 { 00:24:39.189 "name": "pt1", 00:24:39.189 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:39.189 "is_configured": true, 00:24:39.189 "data_offset": 256, 00:24:39.189 "data_size": 7936 00:24:39.189 }, 00:24:39.189 { 00:24:39.189 "name": "pt2", 00:24:39.189 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:39.189 "is_configured": true, 00:24:39.189 "data_offset": 256, 00:24:39.189 "data_size": 7936 00:24:39.189 } 00:24:39.189 ] 00:24:39.189 }' 00:24:39.189 17:36:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:39.189 17:36:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:39.759 17:36:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:24:39.759 17:36:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:24:39.759 17:36:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:24:39.759 17:36:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:24:39.759 17:36:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:24:39.759 17:36:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:24:39.759 17:36:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:39.759 17:36:50 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:24:40.019 [2024-07-15 17:36:51.142604] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:40.019 17:36:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:24:40.019 "name": "raid_bdev1", 00:24:40.019 "aliases": [ 00:24:40.019 "9ec03c56-25db-4ebc-8a04-871ba5ca31b0" 00:24:40.019 ], 00:24:40.019 "product_name": "Raid Volume", 00:24:40.019 "block_size": 4096, 00:24:40.019 "num_blocks": 7936, 00:24:40.019 "uuid": "9ec03c56-25db-4ebc-8a04-871ba5ca31b0", 00:24:40.019 "md_size": 32, 00:24:40.019 "md_interleave": false, 00:24:40.019 "dif_type": 0, 00:24:40.019 "assigned_rate_limits": { 00:24:40.019 "rw_ios_per_sec": 0, 00:24:40.019 "rw_mbytes_per_sec": 0, 00:24:40.019 "r_mbytes_per_sec": 0, 00:24:40.019 "w_mbytes_per_sec": 0 00:24:40.019 }, 00:24:40.019 "claimed": false, 00:24:40.019 "zoned": false, 00:24:40.019 "supported_io_types": { 00:24:40.019 "read": true, 00:24:40.019 "write": true, 00:24:40.019 "unmap": false, 00:24:40.019 "flush": false, 00:24:40.019 "reset": true, 00:24:40.019 "nvme_admin": false, 00:24:40.019 "nvme_io": false, 00:24:40.019 "nvme_io_md": false, 00:24:40.019 "write_zeroes": true, 00:24:40.019 "zcopy": false, 00:24:40.019 "get_zone_info": false, 00:24:40.019 "zone_management": false, 00:24:40.019 "zone_append": false, 00:24:40.019 "compare": false, 00:24:40.019 "compare_and_write": false, 00:24:40.019 "abort": false, 00:24:40.019 "seek_hole": false, 00:24:40.019 "seek_data": false, 00:24:40.019 "copy": false, 00:24:40.019 "nvme_iov_md": false 00:24:40.019 }, 00:24:40.019 "memory_domains": [ 00:24:40.019 { 00:24:40.019 "dma_device_id": "system", 00:24:40.019 "dma_device_type": 1 00:24:40.019 }, 00:24:40.019 { 00:24:40.019 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:40.019 "dma_device_type": 2 00:24:40.019 }, 00:24:40.019 { 00:24:40.019 "dma_device_id": "system", 00:24:40.019 "dma_device_type": 1 00:24:40.019 }, 00:24:40.019 { 00:24:40.019 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:40.019 "dma_device_type": 2 00:24:40.019 } 00:24:40.019 ], 00:24:40.019 "driver_specific": { 00:24:40.019 "raid": { 00:24:40.019 "uuid": "9ec03c56-25db-4ebc-8a04-871ba5ca31b0", 00:24:40.019 "strip_size_kb": 0, 00:24:40.019 "state": "online", 00:24:40.019 "raid_level": "raid1", 00:24:40.019 "superblock": true, 00:24:40.019 "num_base_bdevs": 2, 00:24:40.019 "num_base_bdevs_discovered": 2, 00:24:40.019 "num_base_bdevs_operational": 2, 00:24:40.019 "base_bdevs_list": [ 00:24:40.019 { 00:24:40.019 "name": "pt1", 00:24:40.019 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:40.019 "is_configured": true, 00:24:40.019 "data_offset": 256, 00:24:40.019 "data_size": 7936 00:24:40.019 }, 00:24:40.019 { 00:24:40.019 "name": "pt2", 00:24:40.019 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:40.019 "is_configured": true, 00:24:40.019 "data_offset": 256, 00:24:40.019 "data_size": 7936 00:24:40.019 } 00:24:40.019 ] 00:24:40.019 } 00:24:40.019 } 00:24:40.019 }' 00:24:40.019 17:36:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:40.019 17:36:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:24:40.019 pt2' 00:24:40.019 17:36:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:40.019 17:36:51 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:24:40.019 17:36:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:40.279 17:36:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:40.279 "name": "pt1", 00:24:40.279 "aliases": [ 00:24:40.279 "00000000-0000-0000-0000-000000000001" 00:24:40.279 ], 00:24:40.279 "product_name": "passthru", 00:24:40.279 "block_size": 4096, 00:24:40.279 "num_blocks": 8192, 00:24:40.279 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:40.279 "md_size": 32, 00:24:40.279 "md_interleave": false, 00:24:40.279 "dif_type": 0, 00:24:40.279 "assigned_rate_limits": { 00:24:40.279 "rw_ios_per_sec": 0, 00:24:40.279 "rw_mbytes_per_sec": 0, 00:24:40.279 "r_mbytes_per_sec": 0, 00:24:40.279 "w_mbytes_per_sec": 0 00:24:40.279 }, 00:24:40.279 "claimed": true, 00:24:40.279 "claim_type": "exclusive_write", 00:24:40.279 "zoned": false, 00:24:40.279 "supported_io_types": { 00:24:40.279 "read": true, 00:24:40.279 "write": true, 00:24:40.279 "unmap": true, 00:24:40.279 "flush": true, 00:24:40.279 "reset": true, 00:24:40.279 "nvme_admin": false, 00:24:40.279 "nvme_io": false, 00:24:40.279 "nvme_io_md": false, 00:24:40.279 "write_zeroes": true, 00:24:40.279 "zcopy": true, 00:24:40.279 "get_zone_info": false, 00:24:40.279 "zone_management": false, 00:24:40.279 "zone_append": false, 00:24:40.279 "compare": false, 00:24:40.279 "compare_and_write": false, 00:24:40.279 "abort": true, 00:24:40.279 "seek_hole": false, 00:24:40.279 "seek_data": false, 00:24:40.279 "copy": true, 00:24:40.279 "nvme_iov_md": false 00:24:40.279 }, 00:24:40.279 "memory_domains": [ 00:24:40.279 { 00:24:40.279 "dma_device_id": "system", 00:24:40.279 "dma_device_type": 1 00:24:40.279 }, 00:24:40.279 { 00:24:40.279 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:40.279 "dma_device_type": 2 00:24:40.279 } 00:24:40.279 ], 00:24:40.279 "driver_specific": { 00:24:40.279 "passthru": { 00:24:40.279 "name": "pt1", 00:24:40.279 "base_bdev_name": "malloc1" 00:24:40.279 } 00:24:40.279 } 00:24:40.279 }' 00:24:40.279 17:36:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:40.279 17:36:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:40.279 17:36:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:24:40.279 17:36:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:40.279 17:36:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:40.539 17:36:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:24:40.539 17:36:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:40.539 17:36:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:40.539 17:36:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:24:40.539 17:36:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:40.539 17:36:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:40.539 17:36:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 
00:24:40.539 17:36:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:40.539 17:36:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:24:40.539 17:36:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:40.799 17:36:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:40.799 "name": "pt2", 00:24:40.799 "aliases": [ 00:24:40.799 "00000000-0000-0000-0000-000000000002" 00:24:40.799 ], 00:24:40.799 "product_name": "passthru", 00:24:40.799 "block_size": 4096, 00:24:40.799 "num_blocks": 8192, 00:24:40.799 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:40.799 "md_size": 32, 00:24:40.799 "md_interleave": false, 00:24:40.799 "dif_type": 0, 00:24:40.799 "assigned_rate_limits": { 00:24:40.799 "rw_ios_per_sec": 0, 00:24:40.799 "rw_mbytes_per_sec": 0, 00:24:40.799 "r_mbytes_per_sec": 0, 00:24:40.799 "w_mbytes_per_sec": 0 00:24:40.799 }, 00:24:40.799 "claimed": true, 00:24:40.799 "claim_type": "exclusive_write", 00:24:40.799 "zoned": false, 00:24:40.799 "supported_io_types": { 00:24:40.799 "read": true, 00:24:40.799 "write": true, 00:24:40.799 "unmap": true, 00:24:40.799 "flush": true, 00:24:40.799 "reset": true, 00:24:40.799 "nvme_admin": false, 00:24:40.799 "nvme_io": false, 00:24:40.799 "nvme_io_md": false, 00:24:40.799 "write_zeroes": true, 00:24:40.799 "zcopy": true, 00:24:40.799 "get_zone_info": false, 00:24:40.799 "zone_management": false, 00:24:40.799 "zone_append": false, 00:24:40.799 "compare": false, 00:24:40.799 "compare_and_write": false, 00:24:40.799 "abort": true, 00:24:40.799 "seek_hole": false, 00:24:40.799 "seek_data": false, 00:24:40.799 "copy": true, 00:24:40.799 "nvme_iov_md": false 00:24:40.799 }, 00:24:40.799 "memory_domains": [ 00:24:40.799 { 00:24:40.799 "dma_device_id": "system", 00:24:40.799 "dma_device_type": 1 00:24:40.799 }, 00:24:40.799 { 00:24:40.799 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:40.799 "dma_device_type": 2 00:24:40.799 } 00:24:40.799 ], 00:24:40.799 "driver_specific": { 00:24:40.799 "passthru": { 00:24:40.799 "name": "pt2", 00:24:40.799 "base_bdev_name": "malloc2" 00:24:40.799 } 00:24:40.799 } 00:24:40.799 }' 00:24:40.799 17:36:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:40.799 17:36:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:40.799 17:36:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:24:40.799 17:36:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:40.799 17:36:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:40.799 17:36:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:24:40.799 17:36:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:41.059 17:36:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:41.059 17:36:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:24:41.059 17:36:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:41.059 17:36:52 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:41.059 17:36:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:24:41.059 17:36:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:41.059 17:36:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:24:41.318 [2024-07-15 17:36:52.453927] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:41.318 17:36:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=9ec03c56-25db-4ebc-8a04-871ba5ca31b0 00:24:41.318 17:36:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # '[' -z 9ec03c56-25db-4ebc-8a04-871ba5ca31b0 ']' 00:24:41.318 17:36:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:41.579 [2024-07-15 17:36:52.642199] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:41.579 [2024-07-15 17:36:52.642211] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:41.579 [2024-07-15 17:36:52.642249] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:41.579 [2024-07-15 17:36:52.642287] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:41.579 [2024-07-15 17:36:52.642293] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa9eee0 name raid_bdev1, state offline 00:24:41.579 17:36:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:41.579 17:36:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:24:41.579 17:36:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:24:41.579 17:36:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:24:41.579 17:36:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:24:41.579 17:36:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:24:41.839 17:36:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:24:41.839 17:36:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:24:42.099 17:36:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:24:42.099 17:36:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:24:42.358 17:36:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:24:42.358 17:36:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # NOT 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:24:42.358 17:36:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:24:42.358 17:36:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:24:42.358 17:36:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:42.358 17:36:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:42.358 17:36:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:42.358 17:36:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:42.358 17:36:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:42.358 17:36:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:42.358 17:36:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:42.358 17:36:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:42.358 17:36:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:24:42.358 [2024-07-15 17:36:53.596588] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:24:42.358 [2024-07-15 17:36:53.597651] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:24:42.358 [2024-07-15 17:36:53.597695] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:24:42.358 [2024-07-15 17:36:53.597728] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:24:42.359 [2024-07-15 17:36:53.597739] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:42.359 [2024-07-15 17:36:53.597745] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x91dd60 name raid_bdev1, state configuring 00:24:42.359 request: 00:24:42.359 { 00:24:42.359 "name": "raid_bdev1", 00:24:42.359 "raid_level": "raid1", 00:24:42.359 "base_bdevs": [ 00:24:42.359 "malloc1", 00:24:42.359 "malloc2" 00:24:42.359 ], 00:24:42.359 "superblock": false, 00:24:42.359 "method": "bdev_raid_create", 00:24:42.359 "req_id": 1 00:24:42.359 } 00:24:42.359 Got JSON-RPC error response 00:24:42.359 response: 00:24:42.359 { 00:24:42.359 "code": -17, 00:24:42.359 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:24:42.359 } 00:24:42.359 17:36:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # es=1 00:24:42.359 17:36:53 
bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:24:42.359 17:36:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:24:42.359 17:36:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:24:42.359 17:36:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:42.359 17:36:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:24:42.638 17:36:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:24:42.638 17:36:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:24:42.638 17:36:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:24:42.931 [2024-07-15 17:36:53.981523] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:24:42.931 [2024-07-15 17:36:53.981548] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:42.931 [2024-07-15 17:36:53.981558] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xaa1150 00:24:42.931 [2024-07-15 17:36:53.981568] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:42.931 [2024-07-15 17:36:53.982700] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:42.931 [2024-07-15 17:36:53.982725] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:24:42.931 [2024-07-15 17:36:53.982753] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:24:42.931 [2024-07-15 17:36:53.982772] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:24:42.931 pt1 00:24:42.931 17:36:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:24:42.931 17:36:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:42.931 17:36:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:42.931 17:36:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:42.931 17:36:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:42.931 17:36:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:42.931 17:36:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:42.931 17:36:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:42.931 17:36:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:42.931 17:36:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:42.931 17:36:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:24:42.931 17:36:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:42.931 17:36:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:42.931 "name": "raid_bdev1", 00:24:42.931 "uuid": "9ec03c56-25db-4ebc-8a04-871ba5ca31b0", 00:24:42.931 "strip_size_kb": 0, 00:24:42.931 "state": "configuring", 00:24:42.931 "raid_level": "raid1", 00:24:42.931 "superblock": true, 00:24:42.931 "num_base_bdevs": 2, 00:24:42.931 "num_base_bdevs_discovered": 1, 00:24:42.931 "num_base_bdevs_operational": 2, 00:24:42.931 "base_bdevs_list": [ 00:24:42.931 { 00:24:42.931 "name": "pt1", 00:24:42.931 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:42.931 "is_configured": true, 00:24:42.931 "data_offset": 256, 00:24:42.931 "data_size": 7936 00:24:42.931 }, 00:24:42.931 { 00:24:42.931 "name": null, 00:24:42.931 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:42.931 "is_configured": false, 00:24:42.931 "data_offset": 256, 00:24:42.931 "data_size": 7936 00:24:42.931 } 00:24:42.931 ] 00:24:42.931 }' 00:24:42.931 17:36:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:42.931 17:36:54 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:43.501 17:36:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:24:43.501 17:36:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:24:43.501 17:36:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:24:43.501 17:36:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:43.762 [2024-07-15 17:36:54.915893] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:43.762 [2024-07-15 17:36:54.915924] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:43.762 [2024-07-15 17:36:54.915936] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xaa17a0 00:24:43.762 [2024-07-15 17:36:54.915943] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:43.762 [2024-07-15 17:36:54.916082] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:43.762 [2024-07-15 17:36:54.916090] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:24:43.762 [2024-07-15 17:36:54.916120] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:24:43.762 [2024-07-15 17:36:54.916132] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:43.762 [2024-07-15 17:36:54.916202] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xaa1df0 00:24:43.762 [2024-07-15 17:36:54.916208] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:43.762 [2024-07-15 17:36:54.916249] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xaa3be0 00:24:43.762 [2024-07-15 17:36:54.916327] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xaa1df0 00:24:43.762 [2024-07-15 17:36:54.916332] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xaa1df0 00:24:43.762 
[2024-07-15 17:36:54.916382] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:43.762 pt2 00:24:43.762 17:36:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:24:43.762 17:36:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:24:43.762 17:36:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:43.762 17:36:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:43.762 17:36:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:43.762 17:36:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:43.762 17:36:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:43.762 17:36:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:43.762 17:36:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:43.762 17:36:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:43.762 17:36:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:43.762 17:36:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:43.762 17:36:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:43.762 17:36:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:44.022 17:36:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:44.022 "name": "raid_bdev1", 00:24:44.022 "uuid": "9ec03c56-25db-4ebc-8a04-871ba5ca31b0", 00:24:44.022 "strip_size_kb": 0, 00:24:44.022 "state": "online", 00:24:44.022 "raid_level": "raid1", 00:24:44.022 "superblock": true, 00:24:44.022 "num_base_bdevs": 2, 00:24:44.022 "num_base_bdevs_discovered": 2, 00:24:44.022 "num_base_bdevs_operational": 2, 00:24:44.022 "base_bdevs_list": [ 00:24:44.022 { 00:24:44.022 "name": "pt1", 00:24:44.022 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:44.022 "is_configured": true, 00:24:44.022 "data_offset": 256, 00:24:44.022 "data_size": 7936 00:24:44.022 }, 00:24:44.022 { 00:24:44.022 "name": "pt2", 00:24:44.022 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:44.022 "is_configured": true, 00:24:44.022 "data_offset": 256, 00:24:44.022 "data_size": 7936 00:24:44.022 } 00:24:44.022 ] 00:24:44.022 }' 00:24:44.022 17:36:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:44.022 17:36:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:44.591 17:36:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:24:44.591 17:36:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:24:44.591 17:36:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:24:44.591 17:36:55 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:24:44.591 17:36:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:24:44.591 17:36:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:24:44.591 17:36:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:44.591 17:36:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:24:44.591 [2024-07-15 17:36:55.862500] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:44.591 17:36:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:24:44.591 "name": "raid_bdev1", 00:24:44.591 "aliases": [ 00:24:44.591 "9ec03c56-25db-4ebc-8a04-871ba5ca31b0" 00:24:44.591 ], 00:24:44.591 "product_name": "Raid Volume", 00:24:44.591 "block_size": 4096, 00:24:44.591 "num_blocks": 7936, 00:24:44.591 "uuid": "9ec03c56-25db-4ebc-8a04-871ba5ca31b0", 00:24:44.591 "md_size": 32, 00:24:44.591 "md_interleave": false, 00:24:44.591 "dif_type": 0, 00:24:44.591 "assigned_rate_limits": { 00:24:44.591 "rw_ios_per_sec": 0, 00:24:44.591 "rw_mbytes_per_sec": 0, 00:24:44.591 "r_mbytes_per_sec": 0, 00:24:44.591 "w_mbytes_per_sec": 0 00:24:44.591 }, 00:24:44.591 "claimed": false, 00:24:44.591 "zoned": false, 00:24:44.591 "supported_io_types": { 00:24:44.591 "read": true, 00:24:44.591 "write": true, 00:24:44.591 "unmap": false, 00:24:44.591 "flush": false, 00:24:44.591 "reset": true, 00:24:44.591 "nvme_admin": false, 00:24:44.591 "nvme_io": false, 00:24:44.591 "nvme_io_md": false, 00:24:44.591 "write_zeroes": true, 00:24:44.591 "zcopy": false, 00:24:44.592 "get_zone_info": false, 00:24:44.592 "zone_management": false, 00:24:44.592 "zone_append": false, 00:24:44.592 "compare": false, 00:24:44.592 "compare_and_write": false, 00:24:44.592 "abort": false, 00:24:44.592 "seek_hole": false, 00:24:44.592 "seek_data": false, 00:24:44.592 "copy": false, 00:24:44.592 "nvme_iov_md": false 00:24:44.592 }, 00:24:44.592 "memory_domains": [ 00:24:44.592 { 00:24:44.592 "dma_device_id": "system", 00:24:44.592 "dma_device_type": 1 00:24:44.592 }, 00:24:44.592 { 00:24:44.592 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:44.592 "dma_device_type": 2 00:24:44.592 }, 00:24:44.592 { 00:24:44.592 "dma_device_id": "system", 00:24:44.592 "dma_device_type": 1 00:24:44.592 }, 00:24:44.592 { 00:24:44.592 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:44.592 "dma_device_type": 2 00:24:44.592 } 00:24:44.592 ], 00:24:44.592 "driver_specific": { 00:24:44.592 "raid": { 00:24:44.592 "uuid": "9ec03c56-25db-4ebc-8a04-871ba5ca31b0", 00:24:44.592 "strip_size_kb": 0, 00:24:44.592 "state": "online", 00:24:44.592 "raid_level": "raid1", 00:24:44.592 "superblock": true, 00:24:44.592 "num_base_bdevs": 2, 00:24:44.592 "num_base_bdevs_discovered": 2, 00:24:44.592 "num_base_bdevs_operational": 2, 00:24:44.592 "base_bdevs_list": [ 00:24:44.592 { 00:24:44.592 "name": "pt1", 00:24:44.592 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:44.592 "is_configured": true, 00:24:44.592 "data_offset": 256, 00:24:44.592 "data_size": 7936 00:24:44.592 }, 00:24:44.592 { 00:24:44.592 "name": "pt2", 00:24:44.592 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:44.592 "is_configured": true, 00:24:44.592 "data_offset": 256, 00:24:44.592 "data_size": 7936 00:24:44.592 } 00:24:44.592 ] 00:24:44.592 } 
00:24:44.592 } 00:24:44.592 }' 00:24:44.592 17:36:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:44.851 17:36:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:24:44.851 pt2' 00:24:44.851 17:36:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:44.851 17:36:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:24:44.851 17:36:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:44.851 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:44.851 "name": "pt1", 00:24:44.851 "aliases": [ 00:24:44.851 "00000000-0000-0000-0000-000000000001" 00:24:44.851 ], 00:24:44.851 "product_name": "passthru", 00:24:44.851 "block_size": 4096, 00:24:44.851 "num_blocks": 8192, 00:24:44.851 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:44.851 "md_size": 32, 00:24:44.851 "md_interleave": false, 00:24:44.851 "dif_type": 0, 00:24:44.851 "assigned_rate_limits": { 00:24:44.851 "rw_ios_per_sec": 0, 00:24:44.851 "rw_mbytes_per_sec": 0, 00:24:44.851 "r_mbytes_per_sec": 0, 00:24:44.851 "w_mbytes_per_sec": 0 00:24:44.851 }, 00:24:44.851 "claimed": true, 00:24:44.851 "claim_type": "exclusive_write", 00:24:44.851 "zoned": false, 00:24:44.851 "supported_io_types": { 00:24:44.851 "read": true, 00:24:44.851 "write": true, 00:24:44.851 "unmap": true, 00:24:44.851 "flush": true, 00:24:44.851 "reset": true, 00:24:44.851 "nvme_admin": false, 00:24:44.851 "nvme_io": false, 00:24:44.851 "nvme_io_md": false, 00:24:44.851 "write_zeroes": true, 00:24:44.851 "zcopy": true, 00:24:44.851 "get_zone_info": false, 00:24:44.851 "zone_management": false, 00:24:44.851 "zone_append": false, 00:24:44.851 "compare": false, 00:24:44.851 "compare_and_write": false, 00:24:44.851 "abort": true, 00:24:44.851 "seek_hole": false, 00:24:44.851 "seek_data": false, 00:24:44.851 "copy": true, 00:24:44.851 "nvme_iov_md": false 00:24:44.851 }, 00:24:44.851 "memory_domains": [ 00:24:44.851 { 00:24:44.851 "dma_device_id": "system", 00:24:44.851 "dma_device_type": 1 00:24:44.851 }, 00:24:44.851 { 00:24:44.851 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:44.851 "dma_device_type": 2 00:24:44.851 } 00:24:44.851 ], 00:24:44.851 "driver_specific": { 00:24:44.851 "passthru": { 00:24:44.851 "name": "pt1", 00:24:44.851 "base_bdev_name": "malloc1" 00:24:44.851 } 00:24:44.851 } 00:24:44.851 }' 00:24:44.851 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:45.111 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:45.111 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:24:45.111 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:45.111 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:45.111 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:24:45.111 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:45.111 17:36:56 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:45.111 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:24:45.111 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:45.111 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:45.371 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:24:45.371 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:45.371 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:24:45.371 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:45.371 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:45.371 "name": "pt2", 00:24:45.371 "aliases": [ 00:24:45.371 "00000000-0000-0000-0000-000000000002" 00:24:45.371 ], 00:24:45.371 "product_name": "passthru", 00:24:45.371 "block_size": 4096, 00:24:45.371 "num_blocks": 8192, 00:24:45.371 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:45.371 "md_size": 32, 00:24:45.371 "md_interleave": false, 00:24:45.371 "dif_type": 0, 00:24:45.371 "assigned_rate_limits": { 00:24:45.371 "rw_ios_per_sec": 0, 00:24:45.371 "rw_mbytes_per_sec": 0, 00:24:45.371 "r_mbytes_per_sec": 0, 00:24:45.371 "w_mbytes_per_sec": 0 00:24:45.371 }, 00:24:45.371 "claimed": true, 00:24:45.371 "claim_type": "exclusive_write", 00:24:45.371 "zoned": false, 00:24:45.371 "supported_io_types": { 00:24:45.371 "read": true, 00:24:45.371 "write": true, 00:24:45.371 "unmap": true, 00:24:45.371 "flush": true, 00:24:45.371 "reset": true, 00:24:45.371 "nvme_admin": false, 00:24:45.371 "nvme_io": false, 00:24:45.371 "nvme_io_md": false, 00:24:45.371 "write_zeroes": true, 00:24:45.371 "zcopy": true, 00:24:45.371 "get_zone_info": false, 00:24:45.371 "zone_management": false, 00:24:45.371 "zone_append": false, 00:24:45.371 "compare": false, 00:24:45.371 "compare_and_write": false, 00:24:45.371 "abort": true, 00:24:45.371 "seek_hole": false, 00:24:45.371 "seek_data": false, 00:24:45.371 "copy": true, 00:24:45.371 "nvme_iov_md": false 00:24:45.371 }, 00:24:45.371 "memory_domains": [ 00:24:45.371 { 00:24:45.371 "dma_device_id": "system", 00:24:45.371 "dma_device_type": 1 00:24:45.371 }, 00:24:45.371 { 00:24:45.371 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:45.371 "dma_device_type": 2 00:24:45.371 } 00:24:45.371 ], 00:24:45.371 "driver_specific": { 00:24:45.371 "passthru": { 00:24:45.371 "name": "pt2", 00:24:45.371 "base_bdev_name": "malloc2" 00:24:45.371 } 00:24:45.371 } 00:24:45.371 }' 00:24:45.371 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:45.631 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:45.631 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:24:45.631 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:45.631 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:45.631 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 
== 32 ]] 00:24:45.631 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:45.631 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:45.631 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:24:45.631 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:45.891 17:36:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:45.891 17:36:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:24:45.891 17:36:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:45.891 17:36:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:24:46.150 [2024-07-15 17:36:57.193843] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:46.150 17:36:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # '[' 9ec03c56-25db-4ebc-8a04-871ba5ca31b0 '!=' 9ec03c56-25db-4ebc-8a04-871ba5ca31b0 ']' 00:24:46.150 17:36:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:24:46.150 17:36:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:24:46.150 17:36:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:24:46.151 17:36:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:24:46.151 [2024-07-15 17:36:57.386140] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:24:46.151 17:36:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:46.151 17:36:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:46.151 17:36:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:46.151 17:36:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:46.151 17:36:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:46.151 17:36:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:46.151 17:36:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:46.151 17:36:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:46.151 17:36:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:46.151 17:36:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:46.151 17:36:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:46.151 17:36:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:46.411 17:36:57 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:46.411 "name": "raid_bdev1", 00:24:46.411 "uuid": "9ec03c56-25db-4ebc-8a04-871ba5ca31b0", 00:24:46.411 "strip_size_kb": 0, 00:24:46.411 "state": "online", 00:24:46.411 "raid_level": "raid1", 00:24:46.411 "superblock": true, 00:24:46.411 "num_base_bdevs": 2, 00:24:46.411 "num_base_bdevs_discovered": 1, 00:24:46.411 "num_base_bdevs_operational": 1, 00:24:46.411 "base_bdevs_list": [ 00:24:46.411 { 00:24:46.411 "name": null, 00:24:46.411 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:46.411 "is_configured": false, 00:24:46.411 "data_offset": 256, 00:24:46.411 "data_size": 7936 00:24:46.411 }, 00:24:46.411 { 00:24:46.411 "name": "pt2", 00:24:46.411 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:46.411 "is_configured": true, 00:24:46.411 "data_offset": 256, 00:24:46.411 "data_size": 7936 00:24:46.411 } 00:24:46.411 ] 00:24:46.411 }' 00:24:46.411 17:36:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:46.411 17:36:57 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:46.981 17:36:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:47.241 [2024-07-15 17:36:58.324490] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:47.241 [2024-07-15 17:36:58.324507] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:47.241 [2024-07-15 17:36:58.324542] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:47.241 [2024-07-15 17:36:58.324575] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:47.241 [2024-07-15 17:36:58.324581] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xaa1df0 name raid_bdev1, state offline 00:24:47.241 17:36:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:47.241 17:36:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:24:47.241 17:36:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:24:47.241 17:36:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:24:47.241 17:36:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:24:47.241 17:36:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:24:47.241 17:36:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:24:47.501 17:36:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:24:47.501 17:36:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:24:47.501 17:36:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:24:47.501 17:36:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:24:47.501 17:36:58 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@518 -- # i=1 00:24:47.501 17:36:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:47.761 [2024-07-15 17:36:58.901930] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:47.761 [2024-07-15 17:36:58.901957] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:47.761 [2024-07-15 17:36:58.901966] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x91e5f0 00:24:47.761 [2024-07-15 17:36:58.901972] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:47.761 [2024-07-15 17:36:58.903300] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:47.761 [2024-07-15 17:36:58.903320] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:24:47.761 [2024-07-15 17:36:58.903352] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:24:47.761 [2024-07-15 17:36:58.903373] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:47.761 [2024-07-15 17:36:58.903432] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xaa37a0 00:24:47.761 [2024-07-15 17:36:58.903437] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:47.761 [2024-07-15 17:36:58.903480] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xaa2b30 00:24:47.761 [2024-07-15 17:36:58.903556] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xaa37a0 00:24:47.761 [2024-07-15 17:36:58.903561] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xaa37a0 00:24:47.761 [2024-07-15 17:36:58.903609] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:47.761 pt2 00:24:47.761 17:36:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:47.761 17:36:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:47.761 17:36:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:47.761 17:36:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:47.761 17:36:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:47.761 17:36:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:47.761 17:36:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:47.761 17:36:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:47.761 17:36:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:47.761 17:36:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:47.762 17:36:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:47.762 17:36:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # 
jq -r '.[] | select(.name == "raid_bdev1")' 00:24:48.022 17:36:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:48.022 "name": "raid_bdev1", 00:24:48.022 "uuid": "9ec03c56-25db-4ebc-8a04-871ba5ca31b0", 00:24:48.022 "strip_size_kb": 0, 00:24:48.022 "state": "online", 00:24:48.022 "raid_level": "raid1", 00:24:48.022 "superblock": true, 00:24:48.022 "num_base_bdevs": 2, 00:24:48.022 "num_base_bdevs_discovered": 1, 00:24:48.022 "num_base_bdevs_operational": 1, 00:24:48.022 "base_bdevs_list": [ 00:24:48.022 { 00:24:48.022 "name": null, 00:24:48.022 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:48.022 "is_configured": false, 00:24:48.022 "data_offset": 256, 00:24:48.022 "data_size": 7936 00:24:48.022 }, 00:24:48.022 { 00:24:48.022 "name": "pt2", 00:24:48.022 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:48.022 "is_configured": true, 00:24:48.022 "data_offset": 256, 00:24:48.022 "data_size": 7936 00:24:48.022 } 00:24:48.022 ] 00:24:48.022 }' 00:24:48.022 17:36:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:48.022 17:36:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:48.592 17:36:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:48.592 [2024-07-15 17:36:59.808213] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:48.592 [2024-07-15 17:36:59.808226] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:48.592 [2024-07-15 17:36:59.808258] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:48.592 [2024-07-15 17:36:59.808286] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:48.592 [2024-07-15 17:36:59.808291] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xaa37a0 name raid_bdev1, state offline 00:24:48.592 17:36:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:24:48.592 17:36:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:48.852 17:37:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:24:48.852 17:37:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:24:48.852 17:37:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:24:48.852 17:37:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:24:49.113 [2024-07-15 17:37:00.201200] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:24:49.113 [2024-07-15 17:37:00.201237] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:49.113 [2024-07-15 17:37:00.201254] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xaa2b80 00:24:49.113 [2024-07-15 17:37:00.201260] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:49.113 [2024-07-15 17:37:00.202410] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:49.113 [2024-07-15 17:37:00.202429] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:24:49.113 [2024-07-15 17:37:00.202461] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:24:49.113 [2024-07-15 17:37:00.202479] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:24:49.113 [2024-07-15 17:37:00.202550] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:24:49.113 [2024-07-15 17:37:00.202557] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:49.113 [2024-07-15 17:37:00.202566] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xaa26d0 name raid_bdev1, state configuring 00:24:49.113 [2024-07-15 17:37:00.202579] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:49.113 [2024-07-15 17:37:00.202618] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xaa2070 00:24:49.113 [2024-07-15 17:37:00.202624] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:49.113 [2024-07-15 17:37:00.202664] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xaa2870 00:24:49.113 [2024-07-15 17:37:00.202747] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xaa2070 00:24:49.113 [2024-07-15 17:37:00.202752] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xaa2070 00:24:49.113 [2024-07-15 17:37:00.202804] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:49.113 pt1 00:24:49.113 17:37:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:24:49.113 17:37:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:49.113 17:37:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:49.113 17:37:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:49.113 17:37:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:49.113 17:37:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:49.113 17:37:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:49.113 17:37:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:49.113 17:37:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:49.113 17:37:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:49.113 17:37:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:49.113 17:37:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:49.113 17:37:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:49.374 17:37:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:24:49.374 "name": "raid_bdev1", 00:24:49.374 "uuid": "9ec03c56-25db-4ebc-8a04-871ba5ca31b0", 00:24:49.374 "strip_size_kb": 0, 00:24:49.374 "state": "online", 00:24:49.374 "raid_level": "raid1", 00:24:49.374 "superblock": true, 00:24:49.374 "num_base_bdevs": 2, 00:24:49.374 "num_base_bdevs_discovered": 1, 00:24:49.374 "num_base_bdevs_operational": 1, 00:24:49.374 "base_bdevs_list": [ 00:24:49.374 { 00:24:49.374 "name": null, 00:24:49.374 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:49.374 "is_configured": false, 00:24:49.374 "data_offset": 256, 00:24:49.374 "data_size": 7936 00:24:49.374 }, 00:24:49.374 { 00:24:49.374 "name": "pt2", 00:24:49.374 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:49.374 "is_configured": true, 00:24:49.374 "data_offset": 256, 00:24:49.374 "data_size": 7936 00:24:49.374 } 00:24:49.374 ] 00:24:49.374 }' 00:24:49.374 17:37:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:49.374 17:37:00 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:49.944 17:37:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:24:49.944 17:37:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:24:49.944 17:37:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:24:49.944 17:37:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:49.944 17:37:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:24:50.205 [2024-07-15 17:37:01.368375] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:50.205 17:37:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' 9ec03c56-25db-4ebc-8a04-871ba5ca31b0 '!=' 9ec03c56-25db-4ebc-8a04-871ba5ca31b0 ']' 00:24:50.205 17:37:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@562 -- # killprocess 2903949 00:24:50.205 17:37:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@948 -- # '[' -z 2903949 ']' 00:24:50.205 17:37:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@952 -- # kill -0 2903949 00:24:50.205 17:37:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # uname 00:24:50.205 17:37:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:50.205 17:37:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2903949 00:24:50.205 17:37:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:50.205 17:37:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:50.205 17:37:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2903949' 00:24:50.205 killing process with pid 2903949 00:24:50.205 17:37:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@967 -- # kill 2903949 00:24:50.205 [2024-07-15 17:37:01.435590] 
bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:50.205 [2024-07-15 17:37:01.435625] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:50.205 [2024-07-15 17:37:01.435654] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:50.205 [2024-07-15 17:37:01.435660] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xaa2070 name raid_bdev1, state offline 00:24:50.205 17:37:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@972 -- # wait 2903949 00:24:50.205 [2024-07-15 17:37:01.448026] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:50.466 17:37:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@564 -- # return 0 00:24:50.466 00:24:50.466 real 0m13.188s 00:24:50.466 user 0m24.440s 00:24:50.466 sys 0m2.036s 00:24:50.466 17:37:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:50.466 17:37:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:50.466 ************************************ 00:24:50.466 END TEST raid_superblock_test_md_separate 00:24:50.466 ************************************ 00:24:50.466 17:37:01 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:50.466 17:37:01 bdev_raid -- bdev/bdev_raid.sh@907 -- # '[' true = true ']' 00:24:50.466 17:37:01 bdev_raid -- bdev/bdev_raid.sh@908 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:24:50.466 17:37:01 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:50.466 17:37:01 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:50.466 17:37:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:50.466 ************************************ 00:24:50.466 START TEST raid_rebuild_test_sb_md_separate 00:24:50.466 ************************************ 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:50.466 17:37:01 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # raid_pid=2906411 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # waitforlisten 2906411 /var/tmp/spdk-raid.sock 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 2906411 ']' 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:50.466 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:50.466 17:37:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:50.466 [2024-07-15 17:37:01.707374] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:24:50.466 [2024-07-15 17:37:01.707423] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2906411 ] 00:24:50.466 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:50.466 Zero copy mechanism will not be used. 
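Before the rebuild scenario starts, the trace that follows builds the same kind of array as the superblock test above, this time against the bdevperf target it just launched: two 32 MB malloc bdevs with 4096-byte blocks and 32 bytes of separate metadata, each wrapped in a passthru bdev, then combined into a raid1 bdev carrying a superblock. A condensed sketch of that setup, with the rpc.py path, socket and all RPC arguments taken from the trace and the RPC shell variable added only as shorthand:

  RPC='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock'
  # base devices: 32 MB, 4096-byte blocks, 32-byte separate metadata area
  $RPC bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc
  $RPC bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc
  # passthru wrappers give the bases stable names the raid layer can claim
  $RPC bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
  $RPC bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2
  # raid1 with an on-disk superblock (-s) across the two wrappers
  $RPC bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1

The spare device used later in the rebuild is prepared the same way further down, except that it is layered on a delay bdev (spare_malloc -> spare_delay -> spare, via bdev_delay_create) before the passthru wrapper.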
00:24:50.726 [2024-07-15 17:37:01.797225] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:50.726 [2024-07-15 17:37:01.875168] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:50.726 [2024-07-15 17:37:01.919115] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:50.726 [2024-07-15 17:37:01.919141] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:51.296 17:37:02 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:51.296 17:37:02 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:24:51.296 17:37:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:51.296 17:37:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:24:51.555 BaseBdev1_malloc 00:24:51.555 17:37:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:51.816 [2024-07-15 17:37:02.906534] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:51.816 [2024-07-15 17:37:02.906568] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:51.816 [2024-07-15 17:37:02.906582] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26866c0 00:24:51.816 [2024-07-15 17:37:02.906589] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:51.816 [2024-07-15 17:37:02.907775] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:51.816 [2024-07-15 17:37:02.907794] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:51.816 BaseBdev1 00:24:51.816 17:37:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:51.816 17:37:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:24:51.816 BaseBdev2_malloc 00:24:52.074 17:37:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:52.074 [2024-07-15 17:37:03.277906] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:52.074 [2024-07-15 17:37:03.277932] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:52.074 [2024-07-15 17:37:03.277944] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2813c40 00:24:52.074 [2024-07-15 17:37:03.277950] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:52.074 [2024-07-15 17:37:03.279014] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:52.074 [2024-07-15 17:37:03.279031] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:52.074 BaseBdev2 00:24:52.074 17:37:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:24:52.333 spare_malloc 00:24:52.333 17:37:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:52.592 spare_delay 00:24:52.592 17:37:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:52.592 [2024-07-15 17:37:03.849743] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:52.592 [2024-07-15 17:37:03.849772] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:52.592 [2024-07-15 17:37:03.849786] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2808cf0 00:24:52.592 [2024-07-15 17:37:03.849793] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:52.592 [2024-07-15 17:37:03.850880] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:52.592 [2024-07-15 17:37:03.850899] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:52.592 spare 00:24:52.592 17:37:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:52.853 [2024-07-15 17:37:04.042244] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:52.853 [2024-07-15 17:37:04.043248] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:52.853 [2024-07-15 17:37:04.043365] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x28124c0 00:24:52.853 [2024-07-15 17:37:04.043373] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:52.853 [2024-07-15 17:37:04.043422] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x267c960 00:24:52.853 [2024-07-15 17:37:04.043508] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x28124c0 00:24:52.853 [2024-07-15 17:37:04.043513] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x28124c0 00:24:52.853 [2024-07-15 17:37:04.043563] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:52.853 17:37:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:52.853 17:37:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:52.853 17:37:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:52.853 17:37:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:52.853 17:37:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:52.853 17:37:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:52.853 17:37:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:24:52.853 17:37:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:52.853 17:37:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:52.853 17:37:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:52.853 17:37:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:52.853 17:37:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:53.113 17:37:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:53.113 "name": "raid_bdev1", 00:24:53.113 "uuid": "f9497c15-4454-4545-9257-c6819634ab9f", 00:24:53.113 "strip_size_kb": 0, 00:24:53.113 "state": "online", 00:24:53.113 "raid_level": "raid1", 00:24:53.113 "superblock": true, 00:24:53.113 "num_base_bdevs": 2, 00:24:53.113 "num_base_bdevs_discovered": 2, 00:24:53.113 "num_base_bdevs_operational": 2, 00:24:53.113 "base_bdevs_list": [ 00:24:53.113 { 00:24:53.113 "name": "BaseBdev1", 00:24:53.113 "uuid": "6b100530-a490-5d8c-9fd8-179f383596ba", 00:24:53.113 "is_configured": true, 00:24:53.113 "data_offset": 256, 00:24:53.113 "data_size": 7936 00:24:53.113 }, 00:24:53.113 { 00:24:53.113 "name": "BaseBdev2", 00:24:53.113 "uuid": "320d511b-b128-5f50-a67e-58cc73637162", 00:24:53.113 "is_configured": true, 00:24:53.113 "data_offset": 256, 00:24:53.114 "data_size": 7936 00:24:53.114 } 00:24:53.114 ] 00:24:53.114 }' 00:24:53.114 17:37:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:53.114 17:37:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:53.684 17:37:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:53.684 17:37:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:53.684 [2024-07-15 17:37:04.980841] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:53.944 17:37:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:24:53.944 17:37:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:53.944 17:37:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:53.944 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:24:53.944 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:24:53.944 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:24:53.944 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:24:53.944 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:24:53.944 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:24:53.944 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:24:53.944 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:53.944 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:53.944 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:53.944 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:24:53.944 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:53.944 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:53.944 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:24:54.204 [2024-07-15 17:37:05.357611] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x267df90 00:24:54.204 /dev/nbd0 00:24:54.204 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:54.204 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:54.204 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:54.204 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:24:54.204 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:54.204 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:54.204 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:54.204 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:24:54.204 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:54.204 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:54.204 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:54.204 1+0 records in 00:24:54.204 1+0 records out 00:24:54.204 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000233414 s, 17.5 MB/s 00:24:54.204 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:54.204 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:24:54.204 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:54.204 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:54.204 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:24:54.204 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:54.204 17:37:05 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:54.204 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:24:54.204 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:24:54.204 17:37:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:24:54.774 7936+0 records in 00:24:54.774 7936+0 records out 00:24:54.774 32505856 bytes (33 MB, 31 MiB) copied, 0.61903 s, 52.5 MB/s 00:24:54.774 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:54.774 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:54.774 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:54.774 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:54.774 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:24:54.774 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:54.774 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:55.035 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:55.035 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:55.035 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:55.035 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:55.035 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:55.035 [2024-07-15 17:37:06.221065] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:55.035 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:55.035 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:24:55.035 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:24:55.035 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:55.296 [2024-07-15 17:37:06.401554] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:55.296 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:55.296 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:55.296 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:55.296 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:55.296 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:55.296 17:37:06 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:55.296 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:55.296 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:55.296 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:55.296 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:55.296 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:55.296 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:55.556 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:55.556 "name": "raid_bdev1", 00:24:55.556 "uuid": "f9497c15-4454-4545-9257-c6819634ab9f", 00:24:55.556 "strip_size_kb": 0, 00:24:55.556 "state": "online", 00:24:55.556 "raid_level": "raid1", 00:24:55.556 "superblock": true, 00:24:55.556 "num_base_bdevs": 2, 00:24:55.556 "num_base_bdevs_discovered": 1, 00:24:55.556 "num_base_bdevs_operational": 1, 00:24:55.556 "base_bdevs_list": [ 00:24:55.556 { 00:24:55.556 "name": null, 00:24:55.556 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:55.556 "is_configured": false, 00:24:55.556 "data_offset": 256, 00:24:55.556 "data_size": 7936 00:24:55.556 }, 00:24:55.556 { 00:24:55.556 "name": "BaseBdev2", 00:24:55.556 "uuid": "320d511b-b128-5f50-a67e-58cc73637162", 00:24:55.556 "is_configured": true, 00:24:55.556 "data_offset": 256, 00:24:55.556 "data_size": 7936 00:24:55.556 } 00:24:55.556 ] 00:24:55.556 }' 00:24:55.556 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:55.556 17:37:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:56.127 17:37:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:56.127 [2024-07-15 17:37:07.336016] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:56.127 [2024-07-15 17:37:07.337633] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x267d3c0 00:24:56.127 [2024-07-15 17:37:07.339205] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:56.127 17:37:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:57.091 17:37:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:57.091 17:37:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:57.091 17:37:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:57.091 17:37:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:57.091 17:37:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:57.091 17:37:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:57.091 17:37:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:57.379 17:37:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:57.379 "name": "raid_bdev1", 00:24:57.379 "uuid": "f9497c15-4454-4545-9257-c6819634ab9f", 00:24:57.379 "strip_size_kb": 0, 00:24:57.379 "state": "online", 00:24:57.379 "raid_level": "raid1", 00:24:57.379 "superblock": true, 00:24:57.379 "num_base_bdevs": 2, 00:24:57.379 "num_base_bdevs_discovered": 2, 00:24:57.379 "num_base_bdevs_operational": 2, 00:24:57.379 "process": { 00:24:57.379 "type": "rebuild", 00:24:57.379 "target": "spare", 00:24:57.379 "progress": { 00:24:57.379 "blocks": 2816, 00:24:57.379 "percent": 35 00:24:57.379 } 00:24:57.379 }, 00:24:57.379 "base_bdevs_list": [ 00:24:57.379 { 00:24:57.379 "name": "spare", 00:24:57.379 "uuid": "b3d2379f-7622-57a5-b745-0f1978c78964", 00:24:57.379 "is_configured": true, 00:24:57.379 "data_offset": 256, 00:24:57.379 "data_size": 7936 00:24:57.379 }, 00:24:57.379 { 00:24:57.379 "name": "BaseBdev2", 00:24:57.380 "uuid": "320d511b-b128-5f50-a67e-58cc73637162", 00:24:57.380 "is_configured": true, 00:24:57.380 "data_offset": 256, 00:24:57.380 "data_size": 7936 00:24:57.380 } 00:24:57.380 ] 00:24:57.380 }' 00:24:57.380 17:37:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:57.380 17:37:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:57.380 17:37:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:57.380 17:37:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:57.380 17:37:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:57.639 [2024-07-15 17:37:08.804687] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:57.639 [2024-07-15 17:37:08.848236] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:57.639 [2024-07-15 17:37:08.848269] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:57.639 [2024-07-15 17:37:08.848279] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:57.639 [2024-07-15 17:37:08.848283] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:57.639 17:37:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:57.639 17:37:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:57.639 17:37:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:57.639 17:37:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:57.639 17:37:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:57.639 17:37:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 
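Note on the check being traced here: after BaseBdev1 is removed, verify_raid_bdev_state expects raid_bdev1 to remain online as a degraded raid1 with one of its two base bdevs discovered. A minimal standalone sketch of the same query, assuming the rpc.py path, RPC socket and bdev name used in this run:

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
# Fetch every RAID bdev and keep only raid_bdev1, as the traced helper does with jq.
info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
# Print the fields the helper compares against its expectations (state, level, counts).
echo "$info" | jq -r '.state, .raid_level, .num_base_bdevs_discovered, .num_base_bdevs_operational'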
00:24:57.639 17:37:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:57.639 17:37:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:57.639 17:37:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:57.639 17:37:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:57.639 17:37:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:57.639 17:37:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:57.900 17:37:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:57.900 "name": "raid_bdev1", 00:24:57.900 "uuid": "f9497c15-4454-4545-9257-c6819634ab9f", 00:24:57.900 "strip_size_kb": 0, 00:24:57.900 "state": "online", 00:24:57.900 "raid_level": "raid1", 00:24:57.900 "superblock": true, 00:24:57.900 "num_base_bdevs": 2, 00:24:57.900 "num_base_bdevs_discovered": 1, 00:24:57.900 "num_base_bdevs_operational": 1, 00:24:57.900 "base_bdevs_list": [ 00:24:57.900 { 00:24:57.900 "name": null, 00:24:57.900 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:57.900 "is_configured": false, 00:24:57.900 "data_offset": 256, 00:24:57.900 "data_size": 7936 00:24:57.900 }, 00:24:57.900 { 00:24:57.900 "name": "BaseBdev2", 00:24:57.900 "uuid": "320d511b-b128-5f50-a67e-58cc73637162", 00:24:57.900 "is_configured": true, 00:24:57.900 "data_offset": 256, 00:24:57.900 "data_size": 7936 00:24:57.900 } 00:24:57.900 ] 00:24:57.900 }' 00:24:57.900 17:37:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:57.900 17:37:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:58.471 17:37:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:58.472 17:37:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:58.472 17:37:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:58.472 17:37:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:58.472 17:37:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:58.472 17:37:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:58.472 17:37:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:58.744 17:37:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:58.744 "name": "raid_bdev1", 00:24:58.744 "uuid": "f9497c15-4454-4545-9257-c6819634ab9f", 00:24:58.744 "strip_size_kb": 0, 00:24:58.744 "state": "online", 00:24:58.744 "raid_level": "raid1", 00:24:58.744 "superblock": true, 00:24:58.744 "num_base_bdevs": 2, 00:24:58.744 "num_base_bdevs_discovered": 1, 00:24:58.744 "num_base_bdevs_operational": 1, 00:24:58.744 "base_bdevs_list": [ 00:24:58.744 { 00:24:58.744 "name": null, 00:24:58.744 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:24:58.744 "is_configured": false, 00:24:58.744 "data_offset": 256, 00:24:58.744 "data_size": 7936 00:24:58.744 }, 00:24:58.744 { 00:24:58.744 "name": "BaseBdev2", 00:24:58.744 "uuid": "320d511b-b128-5f50-a67e-58cc73637162", 00:24:58.744 "is_configured": true, 00:24:58.744 "data_offset": 256, 00:24:58.744 "data_size": 7936 00:24:58.744 } 00:24:58.744 ] 00:24:58.744 }' 00:24:58.744 17:37:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:58.744 17:37:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:58.744 17:37:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:58.744 17:37:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:58.744 17:37:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:59.004 [2024-07-15 17:37:10.097326] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:59.004 [2024-07-15 17:37:10.098959] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2685650 00:24:59.004 [2024-07-15 17:37:10.100080] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:59.004 17:37:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:59.943 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:59.943 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:59.943 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:59.943 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:59.943 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:59.943 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:59.943 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:00.203 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:00.203 "name": "raid_bdev1", 00:25:00.203 "uuid": "f9497c15-4454-4545-9257-c6819634ab9f", 00:25:00.203 "strip_size_kb": 0, 00:25:00.203 "state": "online", 00:25:00.203 "raid_level": "raid1", 00:25:00.203 "superblock": true, 00:25:00.203 "num_base_bdevs": 2, 00:25:00.203 "num_base_bdevs_discovered": 2, 00:25:00.203 "num_base_bdevs_operational": 2, 00:25:00.203 "process": { 00:25:00.203 "type": "rebuild", 00:25:00.203 "target": "spare", 00:25:00.203 "progress": { 00:25:00.203 "blocks": 2816, 00:25:00.203 "percent": 35 00:25:00.203 } 00:25:00.203 }, 00:25:00.203 "base_bdevs_list": [ 00:25:00.203 { 00:25:00.203 "name": "spare", 00:25:00.203 "uuid": "b3d2379f-7622-57a5-b745-0f1978c78964", 00:25:00.203 "is_configured": true, 00:25:00.203 "data_offset": 256, 00:25:00.203 "data_size": 7936 00:25:00.203 }, 00:25:00.203 { 00:25:00.203 "name": 
"BaseBdev2", 00:25:00.203 "uuid": "320d511b-b128-5f50-a67e-58cc73637162", 00:25:00.203 "is_configured": true, 00:25:00.203 "data_offset": 256, 00:25:00.203 "data_size": 7936 00:25:00.203 } 00:25:00.203 ] 00:25:00.203 }' 00:25:00.203 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:00.203 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:00.203 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:00.203 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:00.203 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:25:00.203 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:25:00.203 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:25:00.203 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:25:00.203 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:00.203 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:25:00.203 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@705 -- # local timeout=925 00:25:00.203 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:00.203 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:00.203 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:00.203 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:00.203 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:00.203 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:00.203 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:00.203 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:00.463 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:00.463 "name": "raid_bdev1", 00:25:00.463 "uuid": "f9497c15-4454-4545-9257-c6819634ab9f", 00:25:00.463 "strip_size_kb": 0, 00:25:00.463 "state": "online", 00:25:00.463 "raid_level": "raid1", 00:25:00.463 "superblock": true, 00:25:00.463 "num_base_bdevs": 2, 00:25:00.463 "num_base_bdevs_discovered": 2, 00:25:00.463 "num_base_bdevs_operational": 2, 00:25:00.463 "process": { 00:25:00.463 "type": "rebuild", 00:25:00.463 "target": "spare", 00:25:00.463 "progress": { 00:25:00.463 "blocks": 3584, 00:25:00.463 "percent": 45 00:25:00.463 } 00:25:00.463 }, 00:25:00.463 "base_bdevs_list": [ 00:25:00.463 { 00:25:00.463 "name": "spare", 00:25:00.463 "uuid": "b3d2379f-7622-57a5-b745-0f1978c78964", 00:25:00.463 "is_configured": true, 00:25:00.463 "data_offset": 256, 00:25:00.463 "data_size": 7936 
00:25:00.463 }, 00:25:00.463 { 00:25:00.463 "name": "BaseBdev2", 00:25:00.463 "uuid": "320d511b-b128-5f50-a67e-58cc73637162", 00:25:00.463 "is_configured": true, 00:25:00.463 "data_offset": 256, 00:25:00.463 "data_size": 7936 00:25:00.463 } 00:25:00.463 ] 00:25:00.463 }' 00:25:00.463 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:00.463 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:00.463 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:00.463 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:00.463 17:37:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:01.404 17:37:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:01.404 17:37:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:01.404 17:37:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:01.404 17:37:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:01.404 17:37:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:01.404 17:37:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:01.404 17:37:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:01.404 17:37:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:01.665 17:37:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:01.665 "name": "raid_bdev1", 00:25:01.665 "uuid": "f9497c15-4454-4545-9257-c6819634ab9f", 00:25:01.665 "strip_size_kb": 0, 00:25:01.665 "state": "online", 00:25:01.665 "raid_level": "raid1", 00:25:01.665 "superblock": true, 00:25:01.665 "num_base_bdevs": 2, 00:25:01.665 "num_base_bdevs_discovered": 2, 00:25:01.665 "num_base_bdevs_operational": 2, 00:25:01.665 "process": { 00:25:01.665 "type": "rebuild", 00:25:01.665 "target": "spare", 00:25:01.665 "progress": { 00:25:01.665 "blocks": 6912, 00:25:01.665 "percent": 87 00:25:01.665 } 00:25:01.665 }, 00:25:01.665 "base_bdevs_list": [ 00:25:01.665 { 00:25:01.665 "name": "spare", 00:25:01.665 "uuid": "b3d2379f-7622-57a5-b745-0f1978c78964", 00:25:01.665 "is_configured": true, 00:25:01.665 "data_offset": 256, 00:25:01.665 "data_size": 7936 00:25:01.665 }, 00:25:01.665 { 00:25:01.665 "name": "BaseBdev2", 00:25:01.665 "uuid": "320d511b-b128-5f50-a67e-58cc73637162", 00:25:01.665 "is_configured": true, 00:25:01.665 "data_offset": 256, 00:25:01.665 "data_size": 7936 00:25:01.665 } 00:25:01.665 ] 00:25:01.665 }' 00:25:01.665 17:37:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:01.665 17:37:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:01.665 17:37:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:01.926 
17:37:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:01.926 17:37:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:01.926 [2024-07-15 17:37:13.218473] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:01.926 [2024-07-15 17:37:13.218515] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:01.926 [2024-07-15 17:37:13.218579] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:02.867 17:37:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:02.867 17:37:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:02.867 17:37:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:02.867 17:37:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:02.867 17:37:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:02.867 17:37:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:02.867 17:37:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:02.867 17:37:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:03.128 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:03.128 "name": "raid_bdev1", 00:25:03.128 "uuid": "f9497c15-4454-4545-9257-c6819634ab9f", 00:25:03.128 "strip_size_kb": 0, 00:25:03.128 "state": "online", 00:25:03.128 "raid_level": "raid1", 00:25:03.128 "superblock": true, 00:25:03.128 "num_base_bdevs": 2, 00:25:03.128 "num_base_bdevs_discovered": 2, 00:25:03.128 "num_base_bdevs_operational": 2, 00:25:03.128 "base_bdevs_list": [ 00:25:03.128 { 00:25:03.128 "name": "spare", 00:25:03.128 "uuid": "b3d2379f-7622-57a5-b745-0f1978c78964", 00:25:03.128 "is_configured": true, 00:25:03.128 "data_offset": 256, 00:25:03.128 "data_size": 7936 00:25:03.128 }, 00:25:03.128 { 00:25:03.128 "name": "BaseBdev2", 00:25:03.128 "uuid": "320d511b-b128-5f50-a67e-58cc73637162", 00:25:03.128 "is_configured": true, 00:25:03.128 "data_offset": 256, 00:25:03.128 "data_size": 7936 00:25:03.128 } 00:25:03.128 ] 00:25:03.128 }' 00:25:03.128 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:03.128 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:03.128 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:03.128 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:03.128 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # break 00:25:03.128 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:03.128 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:03.128 17:37:14 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:03.128 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:03.128 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:03.128 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:03.128 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:03.389 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:03.390 "name": "raid_bdev1", 00:25:03.390 "uuid": "f9497c15-4454-4545-9257-c6819634ab9f", 00:25:03.390 "strip_size_kb": 0, 00:25:03.390 "state": "online", 00:25:03.390 "raid_level": "raid1", 00:25:03.390 "superblock": true, 00:25:03.390 "num_base_bdevs": 2, 00:25:03.390 "num_base_bdevs_discovered": 2, 00:25:03.390 "num_base_bdevs_operational": 2, 00:25:03.390 "base_bdevs_list": [ 00:25:03.390 { 00:25:03.390 "name": "spare", 00:25:03.390 "uuid": "b3d2379f-7622-57a5-b745-0f1978c78964", 00:25:03.390 "is_configured": true, 00:25:03.390 "data_offset": 256, 00:25:03.390 "data_size": 7936 00:25:03.390 }, 00:25:03.390 { 00:25:03.390 "name": "BaseBdev2", 00:25:03.390 "uuid": "320d511b-b128-5f50-a67e-58cc73637162", 00:25:03.390 "is_configured": true, 00:25:03.390 "data_offset": 256, 00:25:03.390 "data_size": 7936 00:25:03.390 } 00:25:03.390 ] 00:25:03.390 }' 00:25:03.390 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:03.390 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:03.390 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:03.390 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:03.390 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:03.390 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:03.390 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:03.390 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:03.390 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:03.390 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:03.390 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:03.390 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:03.390 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:03.390 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:03.390 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:25:03.390 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:03.650 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:03.650 "name": "raid_bdev1", 00:25:03.650 "uuid": "f9497c15-4454-4545-9257-c6819634ab9f", 00:25:03.650 "strip_size_kb": 0, 00:25:03.650 "state": "online", 00:25:03.650 "raid_level": "raid1", 00:25:03.650 "superblock": true, 00:25:03.650 "num_base_bdevs": 2, 00:25:03.650 "num_base_bdevs_discovered": 2, 00:25:03.650 "num_base_bdevs_operational": 2, 00:25:03.650 "base_bdevs_list": [ 00:25:03.650 { 00:25:03.650 "name": "spare", 00:25:03.650 "uuid": "b3d2379f-7622-57a5-b745-0f1978c78964", 00:25:03.650 "is_configured": true, 00:25:03.650 "data_offset": 256, 00:25:03.650 "data_size": 7936 00:25:03.650 }, 00:25:03.650 { 00:25:03.650 "name": "BaseBdev2", 00:25:03.650 "uuid": "320d511b-b128-5f50-a67e-58cc73637162", 00:25:03.650 "is_configured": true, 00:25:03.650 "data_offset": 256, 00:25:03.650 "data_size": 7936 00:25:03.650 } 00:25:03.650 ] 00:25:03.650 }' 00:25:03.650 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:03.650 17:37:14 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:04.220 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:04.220 [2024-07-15 17:37:15.498068] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:04.220 [2024-07-15 17:37:15.498086] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:04.220 [2024-07-15 17:37:15.498124] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:04.220 [2024-07-15 17:37:15.498161] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:04.220 [2024-07-15 17:37:15.498168] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x28124c0 name raid_bdev1, state offline 00:25:04.480 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:04.480 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # jq length 00:25:04.480 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:04.480 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:04.480 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:25:04.480 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:25:04.480 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:04.480 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:25:04.480 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:04.480 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # 
nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:04.481 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:04.481 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:25:04.481 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:04.481 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:04.481 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:25:04.741 /dev/nbd0 00:25:04.741 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:04.741 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:04.741 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:04.741 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:25:04.741 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:04.741 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:04.741 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:04.741 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:25:04.741 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:04.741 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:04.741 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:04.741 1+0 records in 00:25:04.741 1+0 records out 00:25:04.741 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000281278 s, 14.6 MB/s 00:25:04.741 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:04.741 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:25:04.741 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:04.741 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:04.741 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:25:04.741 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:04.741 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:04.741 17:37:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:25:05.001 /dev/nbd1 00:25:05.001 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:05.001 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate 
-- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:05.001 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:05.001 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:25:05.001 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:05.001 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:05.001 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:05.001 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:25:05.001 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:05.001 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:05.001 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:05.001 1+0 records in 00:25:05.001 1+0 records out 00:25:05.001 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000342896 s, 11.9 MB/s 00:25:05.001 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:05.001 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:25:05.001 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:05.001 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:05.001 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:25:05.001 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:05.001 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:05.001 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:05.001 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:25:05.001 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:05.001 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:05.001 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:05.001 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:25:05.001 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:05.001 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:05.261 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:05.261 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- 
# waitfornbd_exit nbd0 00:25:05.261 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:05.261 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:05.261 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:05.261 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:05.261 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:25:05.261 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:25:05.261 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:05.261 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:05.521 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:05.521 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:05.521 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:05.521 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:05.521 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:05.522 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:05.522 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:25:05.522 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:25:05.522 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:25:05.522 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:05.782 17:37:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:05.782 [2024-07-15 17:37:17.003968] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:05.782 [2024-07-15 17:37:17.003999] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:05.782 [2024-07-15 17:37:17.004012] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x267dd90 00:25:05.782 [2024-07-15 17:37:17.004018] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:05.782 [2024-07-15 17:37:17.005189] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:05.782 [2024-07-15 17:37:17.005211] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:05.782 [2024-07-15 17:37:17.005250] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:05.782 [2024-07-15 17:37:17.005269] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:05.782 [2024-07-15 17:37:17.005341] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev 
BaseBdev2 is claimed 00:25:05.782 spare 00:25:05.782 17:37:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:05.782 17:37:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:05.782 17:37:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:05.782 17:37:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:05.783 17:37:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:05.783 17:37:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:05.783 17:37:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:05.783 17:37:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:05.783 17:37:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:05.783 17:37:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:05.783 17:37:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:05.783 17:37:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:06.044 [2024-07-15 17:37:17.105627] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x267e320 00:25:06.044 [2024-07-15 17:37:17.105636] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:06.044 [2024-07-15 17:37:17.105687] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x267f6b0 00:25:06.044 [2024-07-15 17:37:17.105784] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x267e320 00:25:06.044 [2024-07-15 17:37:17.105790] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x267e320 00:25:06.044 [2024-07-15 17:37:17.105846] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:06.045 17:37:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:06.045 "name": "raid_bdev1", 00:25:06.045 "uuid": "f9497c15-4454-4545-9257-c6819634ab9f", 00:25:06.045 "strip_size_kb": 0, 00:25:06.045 "state": "online", 00:25:06.045 "raid_level": "raid1", 00:25:06.045 "superblock": true, 00:25:06.045 "num_base_bdevs": 2, 00:25:06.045 "num_base_bdevs_discovered": 2, 00:25:06.045 "num_base_bdevs_operational": 2, 00:25:06.045 "base_bdevs_list": [ 00:25:06.045 { 00:25:06.045 "name": "spare", 00:25:06.045 "uuid": "b3d2379f-7622-57a5-b745-0f1978c78964", 00:25:06.045 "is_configured": true, 00:25:06.045 "data_offset": 256, 00:25:06.045 "data_size": 7936 00:25:06.045 }, 00:25:06.045 { 00:25:06.045 "name": "BaseBdev2", 00:25:06.045 "uuid": "320d511b-b128-5f50-a67e-58cc73637162", 00:25:06.045 "is_configured": true, 00:25:06.045 "data_offset": 256, 00:25:06.045 "data_size": 7936 00:25:06.045 } 00:25:06.045 ] 00:25:06.045 }' 00:25:06.045 17:37:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:06.045 17:37:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 
-- # set +x 00:25:06.985 17:37:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:06.985 17:37:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:06.985 17:37:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:06.985 17:37:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:06.985 17:37:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:06.985 17:37:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:06.985 17:37:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:07.245 17:37:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:07.245 "name": "raid_bdev1", 00:25:07.245 "uuid": "f9497c15-4454-4545-9257-c6819634ab9f", 00:25:07.245 "strip_size_kb": 0, 00:25:07.245 "state": "online", 00:25:07.245 "raid_level": "raid1", 00:25:07.245 "superblock": true, 00:25:07.245 "num_base_bdevs": 2, 00:25:07.245 "num_base_bdevs_discovered": 2, 00:25:07.245 "num_base_bdevs_operational": 2, 00:25:07.245 "base_bdevs_list": [ 00:25:07.245 { 00:25:07.245 "name": "spare", 00:25:07.245 "uuid": "b3d2379f-7622-57a5-b745-0f1978c78964", 00:25:07.245 "is_configured": true, 00:25:07.245 "data_offset": 256, 00:25:07.245 "data_size": 7936 00:25:07.245 }, 00:25:07.245 { 00:25:07.245 "name": "BaseBdev2", 00:25:07.246 "uuid": "320d511b-b128-5f50-a67e-58cc73637162", 00:25:07.246 "is_configured": true, 00:25:07.246 "data_offset": 256, 00:25:07.246 "data_size": 7936 00:25:07.246 } 00:25:07.246 ] 00:25:07.246 }' 00:25:07.246 17:37:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:07.246 17:37:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:07.246 17:37:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:07.246 17:37:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:07.246 17:37:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:07.246 17:37:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:07.506 17:37:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:25:07.506 17:37:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:08.077 [2024-07-15 17:37:19.109421] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:08.077 17:37:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:08.077 17:37:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:08.077 17:37:19 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:08.077 17:37:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:08.077 17:37:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:08.077 17:37:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:08.077 17:37:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:08.077 17:37:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:08.077 17:37:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:08.077 17:37:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:08.077 17:37:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:08.077 17:37:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:08.077 17:37:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:08.077 "name": "raid_bdev1", 00:25:08.077 "uuid": "f9497c15-4454-4545-9257-c6819634ab9f", 00:25:08.077 "strip_size_kb": 0, 00:25:08.077 "state": "online", 00:25:08.077 "raid_level": "raid1", 00:25:08.077 "superblock": true, 00:25:08.077 "num_base_bdevs": 2, 00:25:08.077 "num_base_bdevs_discovered": 1, 00:25:08.077 "num_base_bdevs_operational": 1, 00:25:08.077 "base_bdevs_list": [ 00:25:08.077 { 00:25:08.077 "name": null, 00:25:08.077 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:08.077 "is_configured": false, 00:25:08.077 "data_offset": 256, 00:25:08.077 "data_size": 7936 00:25:08.077 }, 00:25:08.077 { 00:25:08.077 "name": "BaseBdev2", 00:25:08.077 "uuid": "320d511b-b128-5f50-a67e-58cc73637162", 00:25:08.077 "is_configured": true, 00:25:08.077 "data_offset": 256, 00:25:08.077 "data_size": 7936 00:25:08.077 } 00:25:08.077 ] 00:25:08.077 }' 00:25:08.077 17:37:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:08.077 17:37:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:08.647 17:37:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:08.907 [2024-07-15 17:37:20.011772] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:08.907 [2024-07-15 17:37:20.011917] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:08.907 [2024-07-15 17:37:20.011936] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
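Once the superblock examine path re-adds the stale spare, the script sleeps one second and then confirms that a rebuild targeting "spare" is in progress via the .process block returned by bdev_raid_get_bdevs. A hedged sketch of that check, reusing only the RPC call and jq expressions visible in this trace (socket path and names assumed from this run):

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
ptype=$(echo "$info" | jq -r '.process.type // "none"')       # expected: rebuild
ptarget=$(echo "$info" | jq -r '.process.target // "none"')   # expected: spare
[ "$ptype" = rebuild ] && [ "$ptarget" = spare ] && echo "rebuild onto spare is running"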
00:25:08.907 [2024-07-15 17:37:20.011961] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:08.907 [2024-07-15 17:37:20.014045] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x267f6b0 00:25:08.907 [2024-07-15 17:37:20.015936] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:08.907 17:37:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # sleep 1 00:25:09.850 17:37:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:09.850 17:37:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:09.850 17:37:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:09.850 17:37:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:09.850 17:37:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:09.850 17:37:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:09.850 17:37:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:10.110 17:37:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:10.110 "name": "raid_bdev1", 00:25:10.110 "uuid": "f9497c15-4454-4545-9257-c6819634ab9f", 00:25:10.110 "strip_size_kb": 0, 00:25:10.110 "state": "online", 00:25:10.110 "raid_level": "raid1", 00:25:10.110 "superblock": true, 00:25:10.110 "num_base_bdevs": 2, 00:25:10.110 "num_base_bdevs_discovered": 2, 00:25:10.110 "num_base_bdevs_operational": 2, 00:25:10.110 "process": { 00:25:10.110 "type": "rebuild", 00:25:10.110 "target": "spare", 00:25:10.110 "progress": { 00:25:10.110 "blocks": 2816, 00:25:10.110 "percent": 35 00:25:10.110 } 00:25:10.110 }, 00:25:10.110 "base_bdevs_list": [ 00:25:10.110 { 00:25:10.110 "name": "spare", 00:25:10.110 "uuid": "b3d2379f-7622-57a5-b745-0f1978c78964", 00:25:10.110 "is_configured": true, 00:25:10.110 "data_offset": 256, 00:25:10.110 "data_size": 7936 00:25:10.110 }, 00:25:10.110 { 00:25:10.110 "name": "BaseBdev2", 00:25:10.110 "uuid": "320d511b-b128-5f50-a67e-58cc73637162", 00:25:10.110 "is_configured": true, 00:25:10.110 "data_offset": 256, 00:25:10.110 "data_size": 7936 00:25:10.110 } 00:25:10.110 ] 00:25:10.110 }' 00:25:10.110 17:37:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:10.110 17:37:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:10.110 17:37:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:10.110 17:37:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:10.110 17:37:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:10.699 [2024-07-15 17:37:21.801669] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:10.699 [2024-07-15 17:37:21.826625] bdev_raid.c:2513:raid_bdev_process_finish_done: 
*WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:10.699 [2024-07-15 17:37:21.826658] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:10.699 [2024-07-15 17:37:21.826668] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:10.699 [2024-07-15 17:37:21.826672] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:10.699 17:37:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:10.699 17:37:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:10.699 17:37:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:10.699 17:37:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:10.699 17:37:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:10.699 17:37:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:10.699 17:37:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:10.699 17:37:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:10.699 17:37:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:10.699 17:37:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:10.699 17:37:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:10.699 17:37:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:10.960 17:37:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:10.960 "name": "raid_bdev1", 00:25:10.960 "uuid": "f9497c15-4454-4545-9257-c6819634ab9f", 00:25:10.960 "strip_size_kb": 0, 00:25:10.960 "state": "online", 00:25:10.960 "raid_level": "raid1", 00:25:10.960 "superblock": true, 00:25:10.960 "num_base_bdevs": 2, 00:25:10.960 "num_base_bdevs_discovered": 1, 00:25:10.960 "num_base_bdevs_operational": 1, 00:25:10.960 "base_bdevs_list": [ 00:25:10.960 { 00:25:10.960 "name": null, 00:25:10.960 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:10.961 "is_configured": false, 00:25:10.961 "data_offset": 256, 00:25:10.961 "data_size": 7936 00:25:10.961 }, 00:25:10.961 { 00:25:10.961 "name": "BaseBdev2", 00:25:10.961 "uuid": "320d511b-b128-5f50-a67e-58cc73637162", 00:25:10.961 "is_configured": true, 00:25:10.961 "data_offset": 256, 00:25:10.961 "data_size": 7936 00:25:10.961 } 00:25:10.961 ] 00:25:10.961 }' 00:25:10.961 17:37:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:10.961 17:37:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:11.534 17:37:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:11.534 [2024-07-15 17:37:22.760071] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 
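The step traced around this point tears down the passthru "spare" with bdev_passthru_delete and recreates it on top of spare_delay; when the new vbdev registers, the RAID examine path finds its superblock, sees the smaller sequence number, re-adds it to raid_bdev1 and starts another rebuild. A rough sketch of that recreate-and-wait cycle, assuming the spare_delay base bdev and RPC socket from this run (the polling loop is illustrative, not a copy of bdev_raid.sh):

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
"$rpc" -s "$sock" bdev_passthru_delete spare
"$rpc" -s "$sock" bdev_passthru_create -b spare_delay -p spare   # superblock examine re-adds it and a rebuild starts
# Poll until the rebuild process disappears from raid_bdev1's status.
until [ "$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .process.type // "none"')" = none ]; do
  sleep 1
done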
00:25:11.534 [2024-07-15 17:37:22.760103] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:11.534 [2024-07-15 17:37:22.760119] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26801d0 00:25:11.534 [2024-07-15 17:37:22.760125] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:11.534 [2024-07-15 17:37:22.760294] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:11.534 [2024-07-15 17:37:22.760304] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:11.534 [2024-07-15 17:37:22.760343] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:11.534 [2024-07-15 17:37:22.760349] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:11.534 [2024-07-15 17:37:22.760355] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:25:11.534 [2024-07-15 17:37:22.760366] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:11.534 [2024-07-15 17:37:22.761912] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2684830 00:25:11.534 [2024-07-15 17:37:22.763035] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:11.534 spare 00:25:11.534 17:37:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # sleep 1 00:25:12.530 17:37:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:12.530 17:37:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:12.530 17:37:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:12.530 17:37:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:12.530 17:37:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:12.530 17:37:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:12.530 17:37:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:12.788 17:37:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:12.788 "name": "raid_bdev1", 00:25:12.788 "uuid": "f9497c15-4454-4545-9257-c6819634ab9f", 00:25:12.788 "strip_size_kb": 0, 00:25:12.788 "state": "online", 00:25:12.788 "raid_level": "raid1", 00:25:12.788 "superblock": true, 00:25:12.788 "num_base_bdevs": 2, 00:25:12.788 "num_base_bdevs_discovered": 2, 00:25:12.788 "num_base_bdevs_operational": 2, 00:25:12.788 "process": { 00:25:12.788 "type": "rebuild", 00:25:12.788 "target": "spare", 00:25:12.788 "progress": { 00:25:12.788 "blocks": 2816, 00:25:12.788 "percent": 35 00:25:12.788 } 00:25:12.788 }, 00:25:12.788 "base_bdevs_list": [ 00:25:12.788 { 00:25:12.788 "name": "spare", 00:25:12.788 "uuid": "b3d2379f-7622-57a5-b745-0f1978c78964", 00:25:12.788 "is_configured": true, 00:25:12.788 "data_offset": 256, 00:25:12.788 "data_size": 7936 00:25:12.788 }, 00:25:12.788 { 00:25:12.788 "name": "BaseBdev2", 00:25:12.788 "uuid": 
"320d511b-b128-5f50-a67e-58cc73637162", 00:25:12.788 "is_configured": true, 00:25:12.788 "data_offset": 256, 00:25:12.788 "data_size": 7936 00:25:12.788 } 00:25:12.788 ] 00:25:12.788 }' 00:25:12.788 17:37:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:12.788 17:37:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:12.788 17:37:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:12.788 17:37:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:12.788 17:37:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:13.047 [2024-07-15 17:37:24.253195] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:13.047 [2024-07-15 17:37:24.271941] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:13.047 [2024-07-15 17:37:24.271970] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:13.047 [2024-07-15 17:37:24.271979] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:13.047 [2024-07-15 17:37:24.271984] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:13.047 17:37:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:13.047 17:37:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:13.047 17:37:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:13.047 17:37:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:13.047 17:37:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:13.047 17:37:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:13.047 17:37:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:13.047 17:37:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:13.048 17:37:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:13.048 17:37:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:13.048 17:37:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.048 17:37:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:13.308 17:37:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:13.308 "name": "raid_bdev1", 00:25:13.308 "uuid": "f9497c15-4454-4545-9257-c6819634ab9f", 00:25:13.308 "strip_size_kb": 0, 00:25:13.308 "state": "online", 00:25:13.308 "raid_level": "raid1", 00:25:13.308 "superblock": true, 00:25:13.308 "num_base_bdevs": 2, 00:25:13.308 "num_base_bdevs_discovered": 1, 00:25:13.308 
"num_base_bdevs_operational": 1, 00:25:13.308 "base_bdevs_list": [ 00:25:13.308 { 00:25:13.308 "name": null, 00:25:13.308 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:13.308 "is_configured": false, 00:25:13.308 "data_offset": 256, 00:25:13.308 "data_size": 7936 00:25:13.308 }, 00:25:13.308 { 00:25:13.308 "name": "BaseBdev2", 00:25:13.308 "uuid": "320d511b-b128-5f50-a67e-58cc73637162", 00:25:13.308 "is_configured": true, 00:25:13.308 "data_offset": 256, 00:25:13.308 "data_size": 7936 00:25:13.308 } 00:25:13.308 ] 00:25:13.308 }' 00:25:13.308 17:37:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:13.308 17:37:24 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:13.877 17:37:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:13.877 17:37:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:13.877 17:37:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:13.877 17:37:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:13.877 17:37:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:13.877 17:37:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.877 17:37:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:14.137 17:37:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:14.137 "name": "raid_bdev1", 00:25:14.137 "uuid": "f9497c15-4454-4545-9257-c6819634ab9f", 00:25:14.137 "strip_size_kb": 0, 00:25:14.137 "state": "online", 00:25:14.137 "raid_level": "raid1", 00:25:14.137 "superblock": true, 00:25:14.137 "num_base_bdevs": 2, 00:25:14.137 "num_base_bdevs_discovered": 1, 00:25:14.137 "num_base_bdevs_operational": 1, 00:25:14.137 "base_bdevs_list": [ 00:25:14.137 { 00:25:14.137 "name": null, 00:25:14.137 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:14.137 "is_configured": false, 00:25:14.137 "data_offset": 256, 00:25:14.137 "data_size": 7936 00:25:14.137 }, 00:25:14.137 { 00:25:14.137 "name": "BaseBdev2", 00:25:14.137 "uuid": "320d511b-b128-5f50-a67e-58cc73637162", 00:25:14.137 "is_configured": true, 00:25:14.137 "data_offset": 256, 00:25:14.137 "data_size": 7936 00:25:14.137 } 00:25:14.137 ] 00:25:14.137 }' 00:25:14.137 17:37:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:14.137 17:37:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:14.137 17:37:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:14.137 17:37:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:14.137 17:37:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:14.397 17:37:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:14.397 [2024-07-15 17:37:25.625247] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:14.397 [2024-07-15 17:37:25.625274] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:14.397 [2024-07-15 17:37:25.625285] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26868f0 00:25:14.397 [2024-07-15 17:37:25.625291] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:14.397 [2024-07-15 17:37:25.625436] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:14.397 [2024-07-15 17:37:25.625445] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:14.397 [2024-07-15 17:37:25.625474] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:14.397 [2024-07-15 17:37:25.625480] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:14.397 [2024-07-15 17:37:25.625490] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:14.397 BaseBdev1 00:25:14.397 17:37:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # sleep 1 00:25:15.779 17:37:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:15.779 17:37:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:15.779 17:37:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:15.779 17:37:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:15.779 17:37:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:15.779 17:37:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:15.779 17:37:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:15.779 17:37:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:15.779 17:37:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:15.779 17:37:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:15.779 17:37:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:15.779 17:37:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:15.779 17:37:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:15.779 "name": "raid_bdev1", 00:25:15.779 "uuid": "f9497c15-4454-4545-9257-c6819634ab9f", 00:25:15.779 "strip_size_kb": 0, 00:25:15.779 "state": "online", 00:25:15.779 "raid_level": "raid1", 00:25:15.779 "superblock": true, 00:25:15.779 "num_base_bdevs": 2, 00:25:15.779 "num_base_bdevs_discovered": 1, 00:25:15.779 "num_base_bdevs_operational": 1, 00:25:15.779 "base_bdevs_list": [ 00:25:15.779 { 
00:25:15.779 "name": null, 00:25:15.779 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:15.779 "is_configured": false, 00:25:15.779 "data_offset": 256, 00:25:15.779 "data_size": 7936 00:25:15.779 }, 00:25:15.779 { 00:25:15.779 "name": "BaseBdev2", 00:25:15.779 "uuid": "320d511b-b128-5f50-a67e-58cc73637162", 00:25:15.779 "is_configured": true, 00:25:15.779 "data_offset": 256, 00:25:15.779 "data_size": 7936 00:25:15.779 } 00:25:15.779 ] 00:25:15.779 }' 00:25:15.779 17:37:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:15.779 17:37:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:16.349 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:16.349 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:16.349 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:16.349 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:16.349 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:16.349 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:16.349 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:16.349 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:16.349 "name": "raid_bdev1", 00:25:16.349 "uuid": "f9497c15-4454-4545-9257-c6819634ab9f", 00:25:16.349 "strip_size_kb": 0, 00:25:16.349 "state": "online", 00:25:16.349 "raid_level": "raid1", 00:25:16.349 "superblock": true, 00:25:16.349 "num_base_bdevs": 2, 00:25:16.349 "num_base_bdevs_discovered": 1, 00:25:16.349 "num_base_bdevs_operational": 1, 00:25:16.349 "base_bdevs_list": [ 00:25:16.349 { 00:25:16.349 "name": null, 00:25:16.349 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:16.349 "is_configured": false, 00:25:16.349 "data_offset": 256, 00:25:16.349 "data_size": 7936 00:25:16.349 }, 00:25:16.349 { 00:25:16.349 "name": "BaseBdev2", 00:25:16.349 "uuid": "320d511b-b128-5f50-a67e-58cc73637162", 00:25:16.349 "is_configured": true, 00:25:16.349 "data_offset": 256, 00:25:16.349 "data_size": 7936 00:25:16.349 } 00:25:16.349 ] 00:25:16.349 }' 00:25:16.349 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:16.349 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:16.349 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:16.609 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:16.609 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:16.609 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:25:16.609 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- 
common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:16.609 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:16.609 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:16.609 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:16.609 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:16.609 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:16.609 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:16.609 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:16.609 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:16.609 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:16.609 [2024-07-15 17:37:27.846983] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:16.609 [2024-07-15 17:37:27.847072] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:16.609 [2024-07-15 17:37:27.847080] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:16.609 request: 00:25:16.609 { 00:25:16.609 "base_bdev": "BaseBdev1", 00:25:16.609 "raid_bdev": "raid_bdev1", 00:25:16.609 "method": "bdev_raid_add_base_bdev", 00:25:16.609 "req_id": 1 00:25:16.609 } 00:25:16.609 Got JSON-RPC error response 00:25:16.609 response: 00:25:16.609 { 00:25:16.609 "code": -22, 00:25:16.609 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:25:16.609 } 00:25:16.609 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # es=1 00:25:16.609 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:25:16.609 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:25:16.609 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:25:16.609 17:37:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # sleep 1 00:25:17.991 17:37:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:17.991 17:37:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:17.991 17:37:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:17.991 17:37:28 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:17.991 17:37:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:17.991 17:37:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:17.991 17:37:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:17.991 17:37:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:17.991 17:37:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:17.991 17:37:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:17.991 17:37:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:17.991 17:37:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:17.991 17:37:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:17.991 "name": "raid_bdev1", 00:25:17.991 "uuid": "f9497c15-4454-4545-9257-c6819634ab9f", 00:25:17.991 "strip_size_kb": 0, 00:25:17.991 "state": "online", 00:25:17.991 "raid_level": "raid1", 00:25:17.991 "superblock": true, 00:25:17.991 "num_base_bdevs": 2, 00:25:17.991 "num_base_bdevs_discovered": 1, 00:25:17.991 "num_base_bdevs_operational": 1, 00:25:17.991 "base_bdevs_list": [ 00:25:17.991 { 00:25:17.991 "name": null, 00:25:17.991 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:17.991 "is_configured": false, 00:25:17.991 "data_offset": 256, 00:25:17.991 "data_size": 7936 00:25:17.991 }, 00:25:17.991 { 00:25:17.991 "name": "BaseBdev2", 00:25:17.991 "uuid": "320d511b-b128-5f50-a67e-58cc73637162", 00:25:17.991 "is_configured": true, 00:25:17.991 "data_offset": 256, 00:25:17.991 "data_size": 7936 00:25:17.991 } 00:25:17.991 ] 00:25:17.991 }' 00:25:17.991 17:37:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:17.991 17:37:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:18.561 17:37:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:18.561 17:37:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:18.561 17:37:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:18.561 17:37:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:18.561 17:37:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:18.561 17:37:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:18.561 17:37:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:18.561 17:37:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:18.561 "name": "raid_bdev1", 00:25:18.561 "uuid": "f9497c15-4454-4545-9257-c6819634ab9f", 00:25:18.561 "strip_size_kb": 0, 
00:25:18.561 "state": "online", 00:25:18.561 "raid_level": "raid1", 00:25:18.561 "superblock": true, 00:25:18.561 "num_base_bdevs": 2, 00:25:18.561 "num_base_bdevs_discovered": 1, 00:25:18.561 "num_base_bdevs_operational": 1, 00:25:18.561 "base_bdevs_list": [ 00:25:18.561 { 00:25:18.561 "name": null, 00:25:18.561 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:18.561 "is_configured": false, 00:25:18.561 "data_offset": 256, 00:25:18.561 "data_size": 7936 00:25:18.561 }, 00:25:18.561 { 00:25:18.561 "name": "BaseBdev2", 00:25:18.561 "uuid": "320d511b-b128-5f50-a67e-58cc73637162", 00:25:18.561 "is_configured": true, 00:25:18.561 "data_offset": 256, 00:25:18.561 "data_size": 7936 00:25:18.561 } 00:25:18.561 ] 00:25:18.561 }' 00:25:18.561 17:37:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:18.561 17:37:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:18.561 17:37:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:18.821 17:37:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:18.821 17:37:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # killprocess 2906411 00:25:18.821 17:37:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 2906411 ']' 00:25:18.821 17:37:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 2906411 00:25:18.821 17:37:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:25:18.821 17:37:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:18.821 17:37:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2906411 00:25:18.822 17:37:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:18.822 17:37:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:18.822 17:37:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2906411' 00:25:18.822 killing process with pid 2906411 00:25:18.822 17:37:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 2906411 00:25:18.822 Received shutdown signal, test time was about 60.000000 seconds 00:25:18.822 00:25:18.822 Latency(us) 00:25:18.822 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:18.822 =================================================================================================================== 00:25:18.822 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:25:18.822 [2024-07-15 17:37:29.916820] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:18.822 [2024-07-15 17:37:29.916885] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:18.822 [2024-07-15 17:37:29.916914] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:18.822 [2024-07-15 17:37:29.916921] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x267e320 name raid_bdev1, state offline 00:25:18.822 17:37:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 
2906411 00:25:18.822 [2024-07-15 17:37:29.935416] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:18.822 17:37:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # return 0 00:25:18.822 00:25:18.822 real 0m28.411s 00:25:18.822 user 0m44.990s 00:25:18.822 sys 0m3.474s 00:25:18.822 17:37:30 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:18.822 17:37:30 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:18.822 ************************************ 00:25:18.822 END TEST raid_rebuild_test_sb_md_separate 00:25:18.822 ************************************ 00:25:18.822 17:37:30 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:18.822 17:37:30 bdev_raid -- bdev/bdev_raid.sh@911 -- # base_malloc_params='-m 32 -i' 00:25:18.822 17:37:30 bdev_raid -- bdev/bdev_raid.sh@912 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:25:18.822 17:37:30 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:25:18.822 17:37:30 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:18.822 17:37:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:19.083 ************************************ 00:25:19.083 START TEST raid_state_function_test_sb_md_interleaved 00:25:19.083 ************************************ 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # raid_pid=2911615 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2911615' 00:25:19.083 Process raid pid: 2911615 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 2911615 /var/tmp/spdk-raid.sock 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 2911615 ']' 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:19.083 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:19.083 17:37:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:19.083 [2024-07-15 17:37:30.192220] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:25:19.083 [2024-07-15 17:37:30.192266] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:25:19.083 [2024-07-15 17:37:30.280374] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:19.083 [2024-07-15 17:37:30.347254] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:19.344 [2024-07-15 17:37:30.400041] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:19.344 [2024-07-15 17:37:30.400068] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:19.914 17:37:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:19.914 17:37:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:25:19.914 17:37:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:25:19.914 [2024-07-15 17:37:31.192133] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:25:19.914 [2024-07-15 17:37:31.192159] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:25:19.914 [2024-07-15 17:37:31.192165] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:19.914 [2024-07-15 17:37:31.192171] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:19.914 17:37:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:25:19.914 17:37:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:19.914 17:37:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:19.914 17:37:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:19.914 17:37:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:19.914 17:37:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:19.914 17:37:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:19.914 17:37:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:19.914 17:37:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:19.914 17:37:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:20.174 17:37:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:20.174 17:37:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:20.174 17:37:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:20.174 "name": "Existed_Raid", 00:25:20.174 "uuid": "aa4e3998-a6d0-40d8-b1e0-17422e90c4ac", 00:25:20.174 "strip_size_kb": 0, 00:25:20.174 "state": "configuring", 00:25:20.174 "raid_level": "raid1", 00:25:20.174 "superblock": true, 00:25:20.174 "num_base_bdevs": 2, 00:25:20.174 "num_base_bdevs_discovered": 0, 00:25:20.174 "num_base_bdevs_operational": 2, 00:25:20.174 "base_bdevs_list": [ 00:25:20.174 { 00:25:20.174 "name": "BaseBdev1", 00:25:20.174 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:20.174 "is_configured": false, 00:25:20.174 "data_offset": 0, 00:25:20.174 "data_size": 0 00:25:20.174 }, 00:25:20.174 { 00:25:20.174 "name": "BaseBdev2", 00:25:20.174 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:20.174 "is_configured": false, 00:25:20.174 "data_offset": 0, 00:25:20.174 "data_size": 0 00:25:20.174 } 00:25:20.174 ] 00:25:20.174 }' 00:25:20.174 17:37:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:20.174 17:37:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:20.745 17:37:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:25:21.006 [2024-07-15 17:37:32.126397] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:25:21.006 [2024-07-15 17:37:32.126413] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x97d6b0 name Existed_Raid, state configuring 00:25:21.006 17:37:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:25:21.266 [2024-07-15 17:37:32.322905] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:25:21.266 [2024-07-15 17:37:32.322922] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:25:21.266 [2024-07-15 17:37:32.322928] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:21.266 [2024-07-15 17:37:32.322933] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:21.266 17:37:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:25:21.266 [2024-07-15 17:37:32.525863] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:21.266 BaseBdev1 00:25:21.266 17:37:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:25:21.266 17:37:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:25:21.266 17:37:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:21.266 17:37:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:25:21.266 17:37:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:21.266 17:37:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:21.266 17:37:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:21.525 17:37:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:25:21.784 [ 00:25:21.784 { 00:25:21.784 "name": "BaseBdev1", 00:25:21.784 "aliases": [ 00:25:21.784 "25367e62-de0e-4ad2-90d8-638a798199c9" 00:25:21.784 ], 00:25:21.784 "product_name": "Malloc disk", 00:25:21.784 "block_size": 4128, 00:25:21.784 "num_blocks": 8192, 00:25:21.784 "uuid": "25367e62-de0e-4ad2-90d8-638a798199c9", 00:25:21.784 "md_size": 32, 00:25:21.784 "md_interleave": true, 00:25:21.784 "dif_type": 0, 00:25:21.784 "assigned_rate_limits": { 00:25:21.784 "rw_ios_per_sec": 0, 00:25:21.784 "rw_mbytes_per_sec": 0, 00:25:21.784 "r_mbytes_per_sec": 0, 00:25:21.784 "w_mbytes_per_sec": 0 00:25:21.784 }, 00:25:21.784 "claimed": true, 00:25:21.784 "claim_type": "exclusive_write", 00:25:21.784 "zoned": false, 00:25:21.784 "supported_io_types": { 00:25:21.785 "read": true, 00:25:21.785 "write": true, 00:25:21.785 "unmap": true, 00:25:21.785 "flush": true, 00:25:21.785 "reset": true, 00:25:21.785 "nvme_admin": false, 00:25:21.785 "nvme_io": false, 00:25:21.785 "nvme_io_md": false, 00:25:21.785 "write_zeroes": true, 00:25:21.785 "zcopy": true, 00:25:21.785 "get_zone_info": false, 00:25:21.785 "zone_management": false, 00:25:21.785 "zone_append": false, 00:25:21.785 "compare": false, 00:25:21.785 "compare_and_write": false, 00:25:21.785 "abort": true, 00:25:21.785 "seek_hole": false, 00:25:21.785 "seek_data": false, 00:25:21.785 "copy": true, 00:25:21.785 "nvme_iov_md": false 00:25:21.785 }, 00:25:21.785 "memory_domains": [ 00:25:21.785 { 00:25:21.785 "dma_device_id": "system", 00:25:21.785 "dma_device_type": 1 00:25:21.785 }, 00:25:21.785 { 00:25:21.785 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:21.785 "dma_device_type": 2 00:25:21.785 } 00:25:21.785 ], 00:25:21.785 "driver_specific": {} 00:25:21.785 } 00:25:21.785 ] 00:25:21.785 17:37:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:25:21.785 17:37:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:25:21.785 17:37:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:21.785 17:37:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:21.785 17:37:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:21.785 17:37:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:21.785 17:37:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:21.785 17:37:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:21.785 17:37:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:21.785 17:37:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:21.785 17:37:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:21.785 17:37:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:21.785 17:37:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:22.046 17:37:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:22.046 "name": "Existed_Raid", 00:25:22.046 "uuid": "d50361ef-fa06-4458-bf78-ba7e2a61eef0", 00:25:22.046 "strip_size_kb": 0, 00:25:22.046 "state": "configuring", 00:25:22.046 "raid_level": "raid1", 00:25:22.046 "superblock": true, 00:25:22.046 "num_base_bdevs": 2, 00:25:22.046 "num_base_bdevs_discovered": 1, 00:25:22.046 "num_base_bdevs_operational": 2, 00:25:22.046 "base_bdevs_list": [ 00:25:22.046 { 00:25:22.046 "name": "BaseBdev1", 00:25:22.046 "uuid": "25367e62-de0e-4ad2-90d8-638a798199c9", 00:25:22.046 "is_configured": true, 00:25:22.046 "data_offset": 256, 00:25:22.046 "data_size": 7936 00:25:22.046 }, 00:25:22.046 { 00:25:22.046 "name": "BaseBdev2", 00:25:22.046 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:22.046 "is_configured": false, 00:25:22.046 "data_offset": 0, 00:25:22.046 "data_size": 0 00:25:22.046 } 00:25:22.046 ] 00:25:22.046 }' 00:25:22.046 17:37:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:22.046 17:37:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:22.616 17:37:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:25:22.616 [2024-07-15 17:37:33.845218] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:25:22.616 [2024-07-15 17:37:33.845243] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x97cfa0 name Existed_Raid, state configuring 00:25:22.616 17:37:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:25:22.876 [2024-07-15 17:37:34.041745] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:22.876 [2024-07-15 17:37:34.042874] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:22.876 [2024-07-15 17:37:34.042898] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:22.876 17:37:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:25:22.876 17:37:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:25:22.876 17:37:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:25:22.876 17:37:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:22.876 17:37:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:22.877 17:37:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:22.877 17:37:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:22.877 17:37:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:22.877 17:37:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:22.877 17:37:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:22.877 17:37:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:22.877 17:37:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:22.877 17:37:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:22.877 17:37:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:23.136 17:37:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:23.136 "name": "Existed_Raid", 00:25:23.136 "uuid": "c11a4fbc-3066-4419-8b3f-f79cb6a6ab05", 00:25:23.136 "strip_size_kb": 0, 00:25:23.136 "state": "configuring", 00:25:23.136 "raid_level": "raid1", 00:25:23.136 "superblock": true, 00:25:23.136 "num_base_bdevs": 2, 00:25:23.136 "num_base_bdevs_discovered": 1, 00:25:23.136 "num_base_bdevs_operational": 2, 00:25:23.136 "base_bdevs_list": [ 00:25:23.136 { 00:25:23.136 "name": "BaseBdev1", 00:25:23.136 "uuid": "25367e62-de0e-4ad2-90d8-638a798199c9", 00:25:23.136 "is_configured": true, 00:25:23.136 "data_offset": 256, 00:25:23.136 "data_size": 7936 00:25:23.136 }, 00:25:23.136 { 00:25:23.136 "name": "BaseBdev2", 00:25:23.136 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:23.136 "is_configured": false, 00:25:23.136 "data_offset": 0, 00:25:23.136 "data_size": 0 00:25:23.136 } 00:25:23.136 ] 00:25:23.136 }' 00:25:23.136 17:37:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:23.136 17:37:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:23.710 17:37:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:25:23.710 [2024-07-15 17:37:34.993134] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:23.710 [2024-07-15 17:37:34.993228] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x97ee40 00:25:23.710 [2024-07-15 17:37:34.993235] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:25:23.710 [2024-07-15 17:37:34.993277] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x97c6a0 00:25:23.710 [2024-07-15 17:37:34.993338] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x97ee40 00:25:23.710 [2024-07-15 17:37:34.993343] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, 
raid_bdev 0x97ee40 00:25:23.710 [2024-07-15 17:37:34.993385] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:23.710 BaseBdev2 00:25:23.710 17:37:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:25:23.710 17:37:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:25:23.710 17:37:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:23.710 17:37:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:25:23.710 17:37:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:23.710 17:37:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:23.710 17:37:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:23.971 17:37:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:25:24.255 [ 00:25:24.255 { 00:25:24.255 "name": "BaseBdev2", 00:25:24.255 "aliases": [ 00:25:24.255 "7ad2a805-2e3d-4d97-8494-5e7c3d2958c2" 00:25:24.255 ], 00:25:24.255 "product_name": "Malloc disk", 00:25:24.255 "block_size": 4128, 00:25:24.255 "num_blocks": 8192, 00:25:24.255 "uuid": "7ad2a805-2e3d-4d97-8494-5e7c3d2958c2", 00:25:24.255 "md_size": 32, 00:25:24.255 "md_interleave": true, 00:25:24.255 "dif_type": 0, 00:25:24.255 "assigned_rate_limits": { 00:25:24.255 "rw_ios_per_sec": 0, 00:25:24.255 "rw_mbytes_per_sec": 0, 00:25:24.255 "r_mbytes_per_sec": 0, 00:25:24.255 "w_mbytes_per_sec": 0 00:25:24.255 }, 00:25:24.255 "claimed": true, 00:25:24.255 "claim_type": "exclusive_write", 00:25:24.255 "zoned": false, 00:25:24.255 "supported_io_types": { 00:25:24.255 "read": true, 00:25:24.255 "write": true, 00:25:24.255 "unmap": true, 00:25:24.255 "flush": true, 00:25:24.255 "reset": true, 00:25:24.255 "nvme_admin": false, 00:25:24.255 "nvme_io": false, 00:25:24.255 "nvme_io_md": false, 00:25:24.255 "write_zeroes": true, 00:25:24.255 "zcopy": true, 00:25:24.255 "get_zone_info": false, 00:25:24.255 "zone_management": false, 00:25:24.255 "zone_append": false, 00:25:24.255 "compare": false, 00:25:24.255 "compare_and_write": false, 00:25:24.255 "abort": true, 00:25:24.255 "seek_hole": false, 00:25:24.255 "seek_data": false, 00:25:24.255 "copy": true, 00:25:24.255 "nvme_iov_md": false 00:25:24.255 }, 00:25:24.255 "memory_domains": [ 00:25:24.255 { 00:25:24.255 "dma_device_id": "system", 00:25:24.255 "dma_device_type": 1 00:25:24.255 }, 00:25:24.255 { 00:25:24.255 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:24.255 "dma_device_type": 2 00:25:24.255 } 00:25:24.255 ], 00:25:24.255 "driver_specific": {} 00:25:24.255 } 00:25:24.255 ] 00:25:24.255 17:37:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:25:24.255 17:37:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:25:24.255 17:37:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:25:24.255 17:37:35 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:25:24.255 17:37:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:24.255 17:37:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:24.255 17:37:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:24.255 17:37:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:24.255 17:37:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:24.255 17:37:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:24.255 17:37:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:24.255 17:37:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:24.255 17:37:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:24.255 17:37:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:24.255 17:37:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:24.516 17:37:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:24.516 "name": "Existed_Raid", 00:25:24.516 "uuid": "c11a4fbc-3066-4419-8b3f-f79cb6a6ab05", 00:25:24.516 "strip_size_kb": 0, 00:25:24.516 "state": "online", 00:25:24.516 "raid_level": "raid1", 00:25:24.516 "superblock": true, 00:25:24.516 "num_base_bdevs": 2, 00:25:24.516 "num_base_bdevs_discovered": 2, 00:25:24.516 "num_base_bdevs_operational": 2, 00:25:24.516 "base_bdevs_list": [ 00:25:24.516 { 00:25:24.516 "name": "BaseBdev1", 00:25:24.516 "uuid": "25367e62-de0e-4ad2-90d8-638a798199c9", 00:25:24.516 "is_configured": true, 00:25:24.516 "data_offset": 256, 00:25:24.516 "data_size": 7936 00:25:24.516 }, 00:25:24.516 { 00:25:24.516 "name": "BaseBdev2", 00:25:24.516 "uuid": "7ad2a805-2e3d-4d97-8494-5e7c3d2958c2", 00:25:24.516 "is_configured": true, 00:25:24.516 "data_offset": 256, 00:25:24.516 "data_size": 7936 00:25:24.516 } 00:25:24.516 ] 00:25:24.516 }' 00:25:24.516 17:37:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:24.516 17:37:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:25.087 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:25:25.087 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:25:25.087 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:25:25.087 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:25:25.087 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # 
local base_bdev_names 00:25:25.087 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:25:25.088 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:25:25.088 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:25:25.088 [2024-07-15 17:37:36.244540] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:25.088 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:25:25.088 "name": "Existed_Raid", 00:25:25.088 "aliases": [ 00:25:25.088 "c11a4fbc-3066-4419-8b3f-f79cb6a6ab05" 00:25:25.088 ], 00:25:25.088 "product_name": "Raid Volume", 00:25:25.088 "block_size": 4128, 00:25:25.088 "num_blocks": 7936, 00:25:25.088 "uuid": "c11a4fbc-3066-4419-8b3f-f79cb6a6ab05", 00:25:25.088 "md_size": 32, 00:25:25.088 "md_interleave": true, 00:25:25.088 "dif_type": 0, 00:25:25.088 "assigned_rate_limits": { 00:25:25.088 "rw_ios_per_sec": 0, 00:25:25.088 "rw_mbytes_per_sec": 0, 00:25:25.088 "r_mbytes_per_sec": 0, 00:25:25.088 "w_mbytes_per_sec": 0 00:25:25.088 }, 00:25:25.088 "claimed": false, 00:25:25.088 "zoned": false, 00:25:25.088 "supported_io_types": { 00:25:25.088 "read": true, 00:25:25.088 "write": true, 00:25:25.088 "unmap": false, 00:25:25.088 "flush": false, 00:25:25.088 "reset": true, 00:25:25.088 "nvme_admin": false, 00:25:25.088 "nvme_io": false, 00:25:25.088 "nvme_io_md": false, 00:25:25.088 "write_zeroes": true, 00:25:25.088 "zcopy": false, 00:25:25.088 "get_zone_info": false, 00:25:25.088 "zone_management": false, 00:25:25.088 "zone_append": false, 00:25:25.088 "compare": false, 00:25:25.088 "compare_and_write": false, 00:25:25.088 "abort": false, 00:25:25.088 "seek_hole": false, 00:25:25.088 "seek_data": false, 00:25:25.088 "copy": false, 00:25:25.088 "nvme_iov_md": false 00:25:25.088 }, 00:25:25.088 "memory_domains": [ 00:25:25.088 { 00:25:25.088 "dma_device_id": "system", 00:25:25.088 "dma_device_type": 1 00:25:25.088 }, 00:25:25.088 { 00:25:25.088 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:25.088 "dma_device_type": 2 00:25:25.088 }, 00:25:25.088 { 00:25:25.088 "dma_device_id": "system", 00:25:25.088 "dma_device_type": 1 00:25:25.088 }, 00:25:25.088 { 00:25:25.088 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:25.088 "dma_device_type": 2 00:25:25.088 } 00:25:25.088 ], 00:25:25.088 "driver_specific": { 00:25:25.088 "raid": { 00:25:25.088 "uuid": "c11a4fbc-3066-4419-8b3f-f79cb6a6ab05", 00:25:25.088 "strip_size_kb": 0, 00:25:25.088 "state": "online", 00:25:25.088 "raid_level": "raid1", 00:25:25.088 "superblock": true, 00:25:25.088 "num_base_bdevs": 2, 00:25:25.088 "num_base_bdevs_discovered": 2, 00:25:25.088 "num_base_bdevs_operational": 2, 00:25:25.088 "base_bdevs_list": [ 00:25:25.088 { 00:25:25.088 "name": "BaseBdev1", 00:25:25.088 "uuid": "25367e62-de0e-4ad2-90d8-638a798199c9", 00:25:25.088 "is_configured": true, 00:25:25.088 "data_offset": 256, 00:25:25.088 "data_size": 7936 00:25:25.088 }, 00:25:25.088 { 00:25:25.088 "name": "BaseBdev2", 00:25:25.088 "uuid": "7ad2a805-2e3d-4d97-8494-5e7c3d2958c2", 00:25:25.088 "is_configured": true, 00:25:25.088 "data_offset": 256, 00:25:25.088 "data_size": 7936 00:25:25.088 } 00:25:25.088 ] 00:25:25.088 } 00:25:25.088 } 00:25:25.088 }' 00:25:25.088 17:37:36 
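The verify_raid_bdev_state call at the top of this passage (expecting Existed_Raid online, raid1, 2 of 2 base bdevs) is, in essence, one bdev_raid_get_bdevs query filtered through jq plus a few comparisons. A rough stand-alone equivalent, using the same select expression as the trace and the RPC_PY shorthand from the earlier sketch:

    info=$($RPC_PY bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
    [[ $(echo "$info" | jq -r .state) == online ]]
    [[ $(echo "$info" | jq -r .raid_level) == raid1 ]]
    [[ $(echo "$info" | jq .num_base_bdevs_discovered) == 2 ]]
    [[ $(echo "$info" | jq .num_base_bdevs_operational) == 2 ]]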
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:25.088 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:25:25.088 BaseBdev2' 00:25:25.088 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:25.088 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:25:25.088 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:25.348 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:25.348 "name": "BaseBdev1", 00:25:25.348 "aliases": [ 00:25:25.348 "25367e62-de0e-4ad2-90d8-638a798199c9" 00:25:25.348 ], 00:25:25.348 "product_name": "Malloc disk", 00:25:25.348 "block_size": 4128, 00:25:25.348 "num_blocks": 8192, 00:25:25.348 "uuid": "25367e62-de0e-4ad2-90d8-638a798199c9", 00:25:25.348 "md_size": 32, 00:25:25.348 "md_interleave": true, 00:25:25.348 "dif_type": 0, 00:25:25.348 "assigned_rate_limits": { 00:25:25.348 "rw_ios_per_sec": 0, 00:25:25.348 "rw_mbytes_per_sec": 0, 00:25:25.348 "r_mbytes_per_sec": 0, 00:25:25.348 "w_mbytes_per_sec": 0 00:25:25.348 }, 00:25:25.348 "claimed": true, 00:25:25.348 "claim_type": "exclusive_write", 00:25:25.348 "zoned": false, 00:25:25.348 "supported_io_types": { 00:25:25.348 "read": true, 00:25:25.348 "write": true, 00:25:25.348 "unmap": true, 00:25:25.348 "flush": true, 00:25:25.348 "reset": true, 00:25:25.348 "nvme_admin": false, 00:25:25.348 "nvme_io": false, 00:25:25.348 "nvme_io_md": false, 00:25:25.349 "write_zeroes": true, 00:25:25.349 "zcopy": true, 00:25:25.349 "get_zone_info": false, 00:25:25.349 "zone_management": false, 00:25:25.349 "zone_append": false, 00:25:25.349 "compare": false, 00:25:25.349 "compare_and_write": false, 00:25:25.349 "abort": true, 00:25:25.349 "seek_hole": false, 00:25:25.349 "seek_data": false, 00:25:25.349 "copy": true, 00:25:25.349 "nvme_iov_md": false 00:25:25.349 }, 00:25:25.349 "memory_domains": [ 00:25:25.349 { 00:25:25.349 "dma_device_id": "system", 00:25:25.349 "dma_device_type": 1 00:25:25.349 }, 00:25:25.349 { 00:25:25.349 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:25.349 "dma_device_type": 2 00:25:25.349 } 00:25:25.349 ], 00:25:25.349 "driver_specific": {} 00:25:25.349 }' 00:25:25.349 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:25.349 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:25.349 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:25:25.349 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:25.610 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:25.610 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:25:25.610 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:25.610 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:25.610 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:25:25.610 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:25.610 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:25.610 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:25:25.610 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:25.610 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:25:25.610 17:37:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:25.870 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:25.870 "name": "BaseBdev2", 00:25:25.870 "aliases": [ 00:25:25.870 "7ad2a805-2e3d-4d97-8494-5e7c3d2958c2" 00:25:25.870 ], 00:25:25.870 "product_name": "Malloc disk", 00:25:25.870 "block_size": 4128, 00:25:25.870 "num_blocks": 8192, 00:25:25.870 "uuid": "7ad2a805-2e3d-4d97-8494-5e7c3d2958c2", 00:25:25.870 "md_size": 32, 00:25:25.870 "md_interleave": true, 00:25:25.870 "dif_type": 0, 00:25:25.870 "assigned_rate_limits": { 00:25:25.870 "rw_ios_per_sec": 0, 00:25:25.870 "rw_mbytes_per_sec": 0, 00:25:25.870 "r_mbytes_per_sec": 0, 00:25:25.870 "w_mbytes_per_sec": 0 00:25:25.870 }, 00:25:25.870 "claimed": true, 00:25:25.870 "claim_type": "exclusive_write", 00:25:25.870 "zoned": false, 00:25:25.870 "supported_io_types": { 00:25:25.870 "read": true, 00:25:25.870 "write": true, 00:25:25.870 "unmap": true, 00:25:25.870 "flush": true, 00:25:25.870 "reset": true, 00:25:25.870 "nvme_admin": false, 00:25:25.870 "nvme_io": false, 00:25:25.870 "nvme_io_md": false, 00:25:25.870 "write_zeroes": true, 00:25:25.870 "zcopy": true, 00:25:25.870 "get_zone_info": false, 00:25:25.870 "zone_management": false, 00:25:25.870 "zone_append": false, 00:25:25.870 "compare": false, 00:25:25.870 "compare_and_write": false, 00:25:25.870 "abort": true, 00:25:25.870 "seek_hole": false, 00:25:25.870 "seek_data": false, 00:25:25.870 "copy": true, 00:25:25.870 "nvme_iov_md": false 00:25:25.870 }, 00:25:25.870 "memory_domains": [ 00:25:25.870 { 00:25:25.870 "dma_device_id": "system", 00:25:25.870 "dma_device_type": 1 00:25:25.870 }, 00:25:25.870 { 00:25:25.870 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:25.870 "dma_device_type": 2 00:25:25.870 } 00:25:25.870 ], 00:25:25.870 "driver_specific": {} 00:25:25.870 }' 00:25:25.870 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:25.870 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:25.870 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:25:25.870 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:26.130 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:26.130 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 
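The property checks being traced here walk each base bdev and compare four fields against the interleaved-metadata expectations: block_size 4128 (4096 data + 32 metadata), md_size 32, md_interleave true, dif_type 0. Written out as a self-contained loop over the same RPC and jq filters (again only a sketch, with RPC_PY as above):

    for name in BaseBdev1 BaseBdev2; do
        bdev=$($RPC_PY bdev_get_bdevs -b "$name" | jq '.[]')
        [[ $(echo "$bdev" | jq .block_size) == 4128 ]]
        [[ $(echo "$bdev" | jq .md_size) == 32 ]]
        [[ $(echo "$bdev" | jq .md_interleave) == true ]]
        [[ $(echo "$bdev" | jq .dif_type) == 0 ]]
    done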
00:25:26.130 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:26.130 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:26.130 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:25:26.130 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:26.130 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:26.130 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:25:26.130 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:25:26.390 [2024-07-15 17:37:37.551656] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:26.390 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:25:26.390 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:25:26.390 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:25:26.390 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:25:26.390 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:25:26.390 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:25:26.390 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:26.390 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:26.390 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:26.390 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:26.390 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:26.390 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:26.390 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:26.390 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:26.390 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:26.391 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:26.391 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:26.660 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:26.660 "name": "Existed_Raid", 00:25:26.660 "uuid": 
"c11a4fbc-3066-4419-8b3f-f79cb6a6ab05", 00:25:26.660 "strip_size_kb": 0, 00:25:26.660 "state": "online", 00:25:26.660 "raid_level": "raid1", 00:25:26.660 "superblock": true, 00:25:26.660 "num_base_bdevs": 2, 00:25:26.660 "num_base_bdevs_discovered": 1, 00:25:26.660 "num_base_bdevs_operational": 1, 00:25:26.660 "base_bdevs_list": [ 00:25:26.660 { 00:25:26.660 "name": null, 00:25:26.660 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:26.660 "is_configured": false, 00:25:26.660 "data_offset": 256, 00:25:26.660 "data_size": 7936 00:25:26.660 }, 00:25:26.660 { 00:25:26.660 "name": "BaseBdev2", 00:25:26.660 "uuid": "7ad2a805-2e3d-4d97-8494-5e7c3d2958c2", 00:25:26.660 "is_configured": true, 00:25:26.660 "data_offset": 256, 00:25:26.660 "data_size": 7936 00:25:26.660 } 00:25:26.660 ] 00:25:26.660 }' 00:25:26.660 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:26.660 17:37:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:27.280 17:37:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:25:27.280 17:37:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:25:27.280 17:37:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:27.280 17:37:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:25:27.280 17:37:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:25:27.280 17:37:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:25:27.280 17:37:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:25:27.541 [2024-07-15 17:37:38.690558] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:27.541 [2024-07-15 17:37:38.690618] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:27.541 [2024-07-15 17:37:38.696871] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:27.541 [2024-07-15 17:37:38.696895] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:27.541 [2024-07-15 17:37:38.696901] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x97ee40 name Existed_Raid, state offline 00:25:27.541 17:37:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:25:27.541 17:37:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:25:27.541 17:37:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:27.541 17:37:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:25:27.802 17:37:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:25:27.802 17:37:38 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:25:27.802 17:37:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:25:27.802 17:37:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 2911615 00:25:27.802 17:37:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 2911615 ']' 00:25:27.802 17:37:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 2911615 00:25:27.802 17:37:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:25:27.802 17:37:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:27.802 17:37:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2911615 00:25:27.802 17:37:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:27.802 17:37:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:27.802 17:37:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2911615' 00:25:27.802 killing process with pid 2911615 00:25:27.802 17:37:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 2911615 00:25:27.802 [2024-07-15 17:37:38.952511] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:27.802 17:37:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 2911615 00:25:27.802 [2024-07-15 17:37:38.953112] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:27.802 17:37:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:25:27.802 00:25:27.802 real 0m8.938s 00:25:27.802 user 0m16.234s 00:25:27.802 sys 0m1.352s 00:25:27.802 17:37:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:27.802 17:37:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:27.802 ************************************ 00:25:27.802 END TEST raid_state_function_test_sb_md_interleaved 00:25:27.802 ************************************ 00:25:28.063 17:37:39 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:28.063 17:37:39 bdev_raid -- bdev/bdev_raid.sh@913 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:25:28.063 17:37:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:25:28.063 17:37:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:28.063 17:37:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:28.063 ************************************ 00:25:28.063 START TEST raid_superblock_test_md_interleaved 00:25:28.063 ************************************ 00:25:28.063 17:37:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:25:28.063 17:37:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:25:28.063 17:37:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local 
num_base_bdevs=2 00:25:28.063 17:37:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:25:28.063 17:37:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:25:28.063 17:37:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:25:28.063 17:37:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:25:28.063 17:37:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:25:28.063 17:37:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:25:28.063 17:37:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:25:28.063 17:37:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local strip_size 00:25:28.063 17:37:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:25:28.063 17:37:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:25:28.063 17:37:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:25:28.063 17:37:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:25:28.063 17:37:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:25:28.063 17:37:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # raid_pid=2913244 00:25:28.063 17:37:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # waitforlisten 2913244 /var/tmp/spdk-raid.sock 00:25:28.063 17:37:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 2913244 ']' 00:25:28.063 17:37:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:25:28.063 17:37:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:28.063 17:37:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:28.063 17:37:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:28.063 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:28.063 17:37:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:28.063 17:37:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:28.063 [2024-07-15 17:37:39.202235] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:25:28.063 [2024-07-15 17:37:39.202285] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2913244 ] 00:25:28.063 [2024-07-15 17:37:39.291784] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:28.063 [2024-07-15 17:37:39.357797] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:28.324 [2024-07-15 17:37:39.402639] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:28.324 [2024-07-15 17:37:39.402658] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:28.994 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:28.994 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:25:28.994 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:25:28.994 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:25:28.994 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:25:28.994 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:25:28.994 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:25:28.994 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:25:28.994 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:25:28.994 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:25:28.994 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:25:28.994 malloc1 00:25:28.994 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:25:29.254 [2024-07-15 17:37:40.392982] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:25:29.254 [2024-07-15 17:37:40.393018] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:29.254 [2024-07-15 17:37:40.393029] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xee3370 00:25:29.254 [2024-07-15 17:37:40.393035] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:29.254 [2024-07-15 17:37:40.394188] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:29.254 [2024-07-15 17:37:40.394206] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:25:29.254 pt1 00:25:29.254 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:25:29.254 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:25:29.254 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:25:29.254 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:25:29.254 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:25:29.254 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:25:29.254 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:25:29.254 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:25:29.254 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:25:29.515 malloc2 00:25:29.515 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:29.515 [2024-07-15 17:37:40.792039] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:29.515 [2024-07-15 17:37:40.792065] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:29.515 [2024-07-15 17:37:40.792074] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1070b10 00:25:29.515 [2024-07-15 17:37:40.792080] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:29.515 [2024-07-15 17:37:40.793119] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:29.515 [2024-07-15 17:37:40.793136] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:29.515 pt2 00:25:29.515 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:25:29.515 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:25:29.515 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:25:29.775 [2024-07-15 17:37:40.968492] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:25:29.775 [2024-07-15 17:37:40.969570] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:29.775 [2024-07-15 17:37:40.969683] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1072480 00:25:29.775 [2024-07-15 17:37:40.969692] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:25:29.775 [2024-07-15 17:37:40.969743] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xee1560 00:25:29.775 [2024-07-15 17:37:40.969805] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1072480 00:25:29.775 [2024-07-15 17:37:40.969811] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1072480 00:25:29.775 [2024-07-15 17:37:40.969849] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:29.775 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:29.775 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:29.775 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:29.775 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:29.775 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:29.775 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:29.775 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:29.775 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:29.775 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:29.775 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:29.775 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:29.775 17:37:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:30.035 17:37:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:30.035 "name": "raid_bdev1", 00:25:30.035 "uuid": "0c47a2d2-631e-4510-8402-d9dc8e281cca", 00:25:30.035 "strip_size_kb": 0, 00:25:30.035 "state": "online", 00:25:30.035 "raid_level": "raid1", 00:25:30.035 "superblock": true, 00:25:30.035 "num_base_bdevs": 2, 00:25:30.035 "num_base_bdevs_discovered": 2, 00:25:30.035 "num_base_bdevs_operational": 2, 00:25:30.035 "base_bdevs_list": [ 00:25:30.035 { 00:25:30.035 "name": "pt1", 00:25:30.035 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:30.035 "is_configured": true, 00:25:30.035 "data_offset": 256, 00:25:30.035 "data_size": 7936 00:25:30.035 }, 00:25:30.035 { 00:25:30.035 "name": "pt2", 00:25:30.035 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:30.035 "is_configured": true, 00:25:30.035 "data_offset": 256, 00:25:30.035 "data_size": 7936 00:25:30.035 } 00:25:30.035 ] 00:25:30.035 }' 00:25:30.035 17:37:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:30.035 17:37:41 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:30.606 17:37:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:25:30.606 17:37:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:25:30.606 17:37:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:25:30.606 17:37:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:25:30.606 17:37:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:25:30.606 17:37:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:25:30.606 17:37:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:30.606 17:37:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:25:30.606 [2024-07-15 17:37:41.887007] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:30.606 17:37:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:25:30.606 "name": "raid_bdev1", 00:25:30.606 "aliases": [ 00:25:30.606 "0c47a2d2-631e-4510-8402-d9dc8e281cca" 00:25:30.606 ], 00:25:30.606 "product_name": "Raid Volume", 00:25:30.606 "block_size": 4128, 00:25:30.606 "num_blocks": 7936, 00:25:30.606 "uuid": "0c47a2d2-631e-4510-8402-d9dc8e281cca", 00:25:30.606 "md_size": 32, 00:25:30.606 "md_interleave": true, 00:25:30.606 "dif_type": 0, 00:25:30.606 "assigned_rate_limits": { 00:25:30.606 "rw_ios_per_sec": 0, 00:25:30.606 "rw_mbytes_per_sec": 0, 00:25:30.606 "r_mbytes_per_sec": 0, 00:25:30.606 "w_mbytes_per_sec": 0 00:25:30.606 }, 00:25:30.606 "claimed": false, 00:25:30.606 "zoned": false, 00:25:30.606 "supported_io_types": { 00:25:30.606 "read": true, 00:25:30.606 "write": true, 00:25:30.606 "unmap": false, 00:25:30.606 "flush": false, 00:25:30.606 "reset": true, 00:25:30.606 "nvme_admin": false, 00:25:30.606 "nvme_io": false, 00:25:30.606 "nvme_io_md": false, 00:25:30.606 "write_zeroes": true, 00:25:30.606 "zcopy": false, 00:25:30.606 "get_zone_info": false, 00:25:30.606 "zone_management": false, 00:25:30.606 "zone_append": false, 00:25:30.606 "compare": false, 00:25:30.606 "compare_and_write": false, 00:25:30.606 "abort": false, 00:25:30.606 "seek_hole": false, 00:25:30.606 "seek_data": false, 00:25:30.606 "copy": false, 00:25:30.606 "nvme_iov_md": false 00:25:30.606 }, 00:25:30.606 "memory_domains": [ 00:25:30.606 { 00:25:30.606 "dma_device_id": "system", 00:25:30.606 "dma_device_type": 1 00:25:30.606 }, 00:25:30.606 { 00:25:30.606 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:30.606 "dma_device_type": 2 00:25:30.606 }, 00:25:30.606 { 00:25:30.606 "dma_device_id": "system", 00:25:30.606 "dma_device_type": 1 00:25:30.606 }, 00:25:30.606 { 00:25:30.606 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:30.606 "dma_device_type": 2 00:25:30.606 } 00:25:30.606 ], 00:25:30.606 "driver_specific": { 00:25:30.606 "raid": { 00:25:30.606 "uuid": "0c47a2d2-631e-4510-8402-d9dc8e281cca", 00:25:30.606 "strip_size_kb": 0, 00:25:30.606 "state": "online", 00:25:30.606 "raid_level": "raid1", 00:25:30.606 "superblock": true, 00:25:30.606 "num_base_bdevs": 2, 00:25:30.606 "num_base_bdevs_discovered": 2, 00:25:30.606 "num_base_bdevs_operational": 2, 00:25:30.606 "base_bdevs_list": [ 00:25:30.606 { 00:25:30.606 "name": "pt1", 00:25:30.606 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:30.606 "is_configured": true, 00:25:30.606 "data_offset": 256, 00:25:30.606 "data_size": 7936 00:25:30.606 }, 00:25:30.606 { 00:25:30.606 "name": "pt2", 00:25:30.606 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:30.606 "is_configured": true, 00:25:30.606 "data_offset": 256, 00:25:30.606 "data_size": 7936 00:25:30.606 } 00:25:30.606 ] 00:25:30.606 } 00:25:30.606 } 00:25:30.606 }' 00:25:30.867 17:37:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:30.867 17:37:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:25:30.867 pt2' 00:25:30.867 
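For orientation, the raid_bdev1 volume whose properties are being dumped here was assembled a few entries earlier: two 32 MiB malloc bdevs with 4096-byte blocks and 32 bytes of interleaved metadata, each wrapped in a passthru bdev with a fixed UUID, then combined into a raid1 that writes an on-disk superblock (-s). Collected from the trace into one block (an illustrative consolidation, not a helper the suite ships; RPC_PY as before):

    $RPC_PY bdev_malloc_create 32 4096 -m 32 -i -b malloc1   # 32 MiB, 4096 B blocks, 32 B interleaved md
    $RPC_PY bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
    $RPC_PY bdev_malloc_create 32 4096 -m 32 -i -b malloc2
    $RPC_PY bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
    $RPC_PY bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s   # -s: store a superblock on the members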
17:37:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:30.867 17:37:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:25:30.867 17:37:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:30.867 17:37:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:30.867 "name": "pt1", 00:25:30.867 "aliases": [ 00:25:30.867 "00000000-0000-0000-0000-000000000001" 00:25:30.867 ], 00:25:30.867 "product_name": "passthru", 00:25:30.867 "block_size": 4128, 00:25:30.867 "num_blocks": 8192, 00:25:30.867 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:30.867 "md_size": 32, 00:25:30.867 "md_interleave": true, 00:25:30.867 "dif_type": 0, 00:25:30.867 "assigned_rate_limits": { 00:25:30.867 "rw_ios_per_sec": 0, 00:25:30.867 "rw_mbytes_per_sec": 0, 00:25:30.867 "r_mbytes_per_sec": 0, 00:25:30.867 "w_mbytes_per_sec": 0 00:25:30.867 }, 00:25:30.867 "claimed": true, 00:25:30.867 "claim_type": "exclusive_write", 00:25:30.867 "zoned": false, 00:25:30.867 "supported_io_types": { 00:25:30.867 "read": true, 00:25:30.867 "write": true, 00:25:30.867 "unmap": true, 00:25:30.867 "flush": true, 00:25:30.867 "reset": true, 00:25:30.867 "nvme_admin": false, 00:25:30.867 "nvme_io": false, 00:25:30.867 "nvme_io_md": false, 00:25:30.867 "write_zeroes": true, 00:25:30.867 "zcopy": true, 00:25:30.867 "get_zone_info": false, 00:25:30.867 "zone_management": false, 00:25:30.867 "zone_append": false, 00:25:30.867 "compare": false, 00:25:30.867 "compare_and_write": false, 00:25:30.867 "abort": true, 00:25:30.867 "seek_hole": false, 00:25:30.867 "seek_data": false, 00:25:30.867 "copy": true, 00:25:30.867 "nvme_iov_md": false 00:25:30.867 }, 00:25:30.867 "memory_domains": [ 00:25:30.867 { 00:25:30.867 "dma_device_id": "system", 00:25:30.867 "dma_device_type": 1 00:25:30.867 }, 00:25:30.867 { 00:25:30.867 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:30.867 "dma_device_type": 2 00:25:30.867 } 00:25:30.867 ], 00:25:30.867 "driver_specific": { 00:25:30.867 "passthru": { 00:25:30.867 "name": "pt1", 00:25:30.867 "base_bdev_name": "malloc1" 00:25:30.867 } 00:25:30.867 } 00:25:30.867 }' 00:25:30.867 17:37:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:31.126 17:37:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:31.126 17:37:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:25:31.126 17:37:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:31.126 17:37:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:31.126 17:37:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:25:31.126 17:37:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:31.126 17:37:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:31.126 17:37:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:25:31.126 17:37:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:31.126 17:37:42 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:31.387 17:37:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:25:31.387 17:37:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:31.387 17:37:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:25:31.387 17:37:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:31.387 17:37:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:31.387 "name": "pt2", 00:25:31.387 "aliases": [ 00:25:31.387 "00000000-0000-0000-0000-000000000002" 00:25:31.387 ], 00:25:31.387 "product_name": "passthru", 00:25:31.387 "block_size": 4128, 00:25:31.387 "num_blocks": 8192, 00:25:31.387 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:31.387 "md_size": 32, 00:25:31.387 "md_interleave": true, 00:25:31.387 "dif_type": 0, 00:25:31.387 "assigned_rate_limits": { 00:25:31.387 "rw_ios_per_sec": 0, 00:25:31.387 "rw_mbytes_per_sec": 0, 00:25:31.387 "r_mbytes_per_sec": 0, 00:25:31.387 "w_mbytes_per_sec": 0 00:25:31.387 }, 00:25:31.387 "claimed": true, 00:25:31.387 "claim_type": "exclusive_write", 00:25:31.387 "zoned": false, 00:25:31.387 "supported_io_types": { 00:25:31.387 "read": true, 00:25:31.387 "write": true, 00:25:31.387 "unmap": true, 00:25:31.387 "flush": true, 00:25:31.387 "reset": true, 00:25:31.387 "nvme_admin": false, 00:25:31.387 "nvme_io": false, 00:25:31.387 "nvme_io_md": false, 00:25:31.387 "write_zeroes": true, 00:25:31.387 "zcopy": true, 00:25:31.387 "get_zone_info": false, 00:25:31.387 "zone_management": false, 00:25:31.387 "zone_append": false, 00:25:31.387 "compare": false, 00:25:31.387 "compare_and_write": false, 00:25:31.387 "abort": true, 00:25:31.387 "seek_hole": false, 00:25:31.387 "seek_data": false, 00:25:31.387 "copy": true, 00:25:31.387 "nvme_iov_md": false 00:25:31.387 }, 00:25:31.387 "memory_domains": [ 00:25:31.387 { 00:25:31.387 "dma_device_id": "system", 00:25:31.387 "dma_device_type": 1 00:25:31.387 }, 00:25:31.387 { 00:25:31.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:31.387 "dma_device_type": 2 00:25:31.387 } 00:25:31.387 ], 00:25:31.387 "driver_specific": { 00:25:31.387 "passthru": { 00:25:31.387 "name": "pt2", 00:25:31.387 "base_bdev_name": "malloc2" 00:25:31.387 } 00:25:31.387 } 00:25:31.387 }' 00:25:31.387 17:37:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:31.646 17:37:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:31.646 17:37:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:25:31.646 17:37:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:31.646 17:37:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:31.646 17:37:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:25:31.646 17:37:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:31.646 17:37:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:31.646 17:37:42 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:25:31.646 17:37:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:31.909 17:37:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:31.909 17:37:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:25:31.909 17:37:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:31.909 17:37:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:25:32.175 [2024-07-15 17:37:43.210363] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:32.175 17:37:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=0c47a2d2-631e-4510-8402-d9dc8e281cca 00:25:32.175 17:37:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # '[' -z 0c47a2d2-631e-4510-8402-d9dc8e281cca ']' 00:25:32.175 17:37:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:32.175 [2024-07-15 17:37:43.410667] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:32.175 [2024-07-15 17:37:43.410677] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:32.175 [2024-07-15 17:37:43.410718] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:32.175 [2024-07-15 17:37:43.410757] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:32.175 [2024-07-15 17:37:43.410763] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1072480 name raid_bdev1, state offline 00:25:32.175 17:37:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:32.175 17:37:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:25:32.435 17:37:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:25:32.435 17:37:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:25:32.435 17:37:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:25:32.435 17:37:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:25:32.696 17:37:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:25:32.696 17:37:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:25:32.956 17:37:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:25:32.956 17:37:44 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:25:32.956 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:25:32.956 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:25:32.956 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:25:32.956 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:25:32.956 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:32.956 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:32.956 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:32.956 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:32.956 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:32.956 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:32.956 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:32.956 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:32.956 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:25:33.216 [2024-07-15 17:37:44.373072] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:25:33.216 [2024-07-15 17:37:44.374128] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:25:33.216 [2024-07-15 17:37:44.374169] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:25:33.216 [2024-07-15 17:37:44.374197] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:25:33.216 [2024-07-15 17:37:44.374210] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:33.216 [2024-07-15 17:37:44.374215] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x106f9c0 name raid_bdev1, state configuring 00:25:33.216 request: 00:25:33.216 { 00:25:33.216 "name": "raid_bdev1", 00:25:33.216 "raid_level": "raid1", 00:25:33.216 "base_bdevs": [ 00:25:33.216 "malloc1", 00:25:33.216 "malloc2" 00:25:33.216 ], 00:25:33.216 "superblock": false, 00:25:33.216 "method": 
"bdev_raid_create", 00:25:33.216 "req_id": 1 00:25:33.216 } 00:25:33.216 Got JSON-RPC error response 00:25:33.216 response: 00:25:33.216 { 00:25:33.216 "code": -17, 00:25:33.216 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:25:33.216 } 00:25:33.216 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:25:33.216 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:25:33.216 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:25:33.216 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:25:33.216 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:33.216 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:25:33.476 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:25:33.476 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:25:33.476 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:25:33.476 [2024-07-15 17:37:44.758000] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:25:33.476 [2024-07-15 17:37:44.758024] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:33.476 [2024-07-15 17:37:44.758036] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xee1360 00:25:33.476 [2024-07-15 17:37:44.758042] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:33.476 [2024-07-15 17:37:44.759162] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:33.476 [2024-07-15 17:37:44.759180] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:25:33.476 [2024-07-15 17:37:44.759209] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:25:33.476 [2024-07-15 17:37:44.759226] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:25:33.476 pt1 00:25:33.476 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:25:33.476 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:33.476 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:33.476 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:33.476 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:33.736 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:33.736 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:33.736 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:33.736 
17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:33.736 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:33.736 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:33.736 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:33.736 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:33.736 "name": "raid_bdev1", 00:25:33.736 "uuid": "0c47a2d2-631e-4510-8402-d9dc8e281cca", 00:25:33.736 "strip_size_kb": 0, 00:25:33.736 "state": "configuring", 00:25:33.736 "raid_level": "raid1", 00:25:33.736 "superblock": true, 00:25:33.736 "num_base_bdevs": 2, 00:25:33.736 "num_base_bdevs_discovered": 1, 00:25:33.736 "num_base_bdevs_operational": 2, 00:25:33.736 "base_bdevs_list": [ 00:25:33.736 { 00:25:33.736 "name": "pt1", 00:25:33.736 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:33.736 "is_configured": true, 00:25:33.736 "data_offset": 256, 00:25:33.736 "data_size": 7936 00:25:33.736 }, 00:25:33.736 { 00:25:33.736 "name": null, 00:25:33.736 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:33.736 "is_configured": false, 00:25:33.736 "data_offset": 256, 00:25:33.736 "data_size": 7936 00:25:33.736 } 00:25:33.736 ] 00:25:33.736 }' 00:25:33.736 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:33.736 17:37:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:34.307 17:37:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:25:34.307 17:37:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:25:34.307 17:37:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:25:34.307 17:37:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:34.567 [2024-07-15 17:37:45.696383] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:34.567 [2024-07-15 17:37:45.696411] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:34.567 [2024-07-15 17:37:45.696421] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10755f0 00:25:34.567 [2024-07-15 17:37:45.696427] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:34.567 [2024-07-15 17:37:45.696547] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:34.567 [2024-07-15 17:37:45.696556] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:34.567 [2024-07-15 17:37:45.696584] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:25:34.567 [2024-07-15 17:37:45.696595] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:34.567 [2024-07-15 17:37:45.696660] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1073ea0 00:25:34.567 [2024-07-15 17:37:45.696666] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:25:34.567 [2024-07-15 17:37:45.696705] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1074ec0 00:25:34.567 [2024-07-15 17:37:45.696770] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1073ea0 00:25:34.567 [2024-07-15 17:37:45.696775] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1073ea0 00:25:34.567 [2024-07-15 17:37:45.696816] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:34.567 pt2 00:25:34.567 17:37:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:25:34.567 17:37:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:25:34.567 17:37:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:34.567 17:37:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:34.567 17:37:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:34.567 17:37:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:34.567 17:37:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:34.567 17:37:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:34.567 17:37:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:34.567 17:37:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:34.567 17:37:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:34.567 17:37:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:34.567 17:37:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:34.567 17:37:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:34.827 17:37:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:34.827 "name": "raid_bdev1", 00:25:34.827 "uuid": "0c47a2d2-631e-4510-8402-d9dc8e281cca", 00:25:34.827 "strip_size_kb": 0, 00:25:34.827 "state": "online", 00:25:34.827 "raid_level": "raid1", 00:25:34.827 "superblock": true, 00:25:34.827 "num_base_bdevs": 2, 00:25:34.827 "num_base_bdevs_discovered": 2, 00:25:34.827 "num_base_bdevs_operational": 2, 00:25:34.827 "base_bdevs_list": [ 00:25:34.827 { 00:25:34.827 "name": "pt1", 00:25:34.827 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:34.827 "is_configured": true, 00:25:34.827 "data_offset": 256, 00:25:34.827 "data_size": 7936 00:25:34.827 }, 00:25:34.827 { 00:25:34.827 "name": "pt2", 00:25:34.827 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:34.827 "is_configured": true, 00:25:34.827 "data_offset": 256, 00:25:34.827 "data_size": 7936 00:25:34.827 } 00:25:34.827 ] 00:25:34.827 }' 00:25:34.827 17:37:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:34.827 17:37:45 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:35.397 17:37:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:25:35.397 17:37:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:25:35.397 17:37:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:25:35.397 17:37:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:25:35.397 17:37:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:25:35.397 17:37:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:25:35.397 17:37:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:35.397 17:37:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:25:35.397 [2024-07-15 17:37:46.606912] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:35.397 17:37:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:25:35.397 "name": "raid_bdev1", 00:25:35.397 "aliases": [ 00:25:35.397 "0c47a2d2-631e-4510-8402-d9dc8e281cca" 00:25:35.397 ], 00:25:35.397 "product_name": "Raid Volume", 00:25:35.397 "block_size": 4128, 00:25:35.397 "num_blocks": 7936, 00:25:35.397 "uuid": "0c47a2d2-631e-4510-8402-d9dc8e281cca", 00:25:35.397 "md_size": 32, 00:25:35.397 "md_interleave": true, 00:25:35.397 "dif_type": 0, 00:25:35.397 "assigned_rate_limits": { 00:25:35.397 "rw_ios_per_sec": 0, 00:25:35.397 "rw_mbytes_per_sec": 0, 00:25:35.397 "r_mbytes_per_sec": 0, 00:25:35.397 "w_mbytes_per_sec": 0 00:25:35.397 }, 00:25:35.397 "claimed": false, 00:25:35.397 "zoned": false, 00:25:35.397 "supported_io_types": { 00:25:35.397 "read": true, 00:25:35.397 "write": true, 00:25:35.397 "unmap": false, 00:25:35.397 "flush": false, 00:25:35.397 "reset": true, 00:25:35.397 "nvme_admin": false, 00:25:35.397 "nvme_io": false, 00:25:35.397 "nvme_io_md": false, 00:25:35.397 "write_zeroes": true, 00:25:35.397 "zcopy": false, 00:25:35.397 "get_zone_info": false, 00:25:35.397 "zone_management": false, 00:25:35.397 "zone_append": false, 00:25:35.397 "compare": false, 00:25:35.397 "compare_and_write": false, 00:25:35.397 "abort": false, 00:25:35.397 "seek_hole": false, 00:25:35.397 "seek_data": false, 00:25:35.397 "copy": false, 00:25:35.397 "nvme_iov_md": false 00:25:35.397 }, 00:25:35.397 "memory_domains": [ 00:25:35.397 { 00:25:35.397 "dma_device_id": "system", 00:25:35.397 "dma_device_type": 1 00:25:35.397 }, 00:25:35.397 { 00:25:35.397 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:35.397 "dma_device_type": 2 00:25:35.397 }, 00:25:35.397 { 00:25:35.397 "dma_device_id": "system", 00:25:35.397 "dma_device_type": 1 00:25:35.397 }, 00:25:35.397 { 00:25:35.397 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:35.397 "dma_device_type": 2 00:25:35.397 } 00:25:35.397 ], 00:25:35.397 "driver_specific": { 00:25:35.397 "raid": { 00:25:35.397 "uuid": "0c47a2d2-631e-4510-8402-d9dc8e281cca", 00:25:35.397 "strip_size_kb": 0, 00:25:35.397 "state": "online", 00:25:35.397 "raid_level": "raid1", 00:25:35.397 "superblock": true, 00:25:35.397 "num_base_bdevs": 2, 00:25:35.397 
"num_base_bdevs_discovered": 2, 00:25:35.397 "num_base_bdevs_operational": 2, 00:25:35.397 "base_bdevs_list": [ 00:25:35.397 { 00:25:35.397 "name": "pt1", 00:25:35.397 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:35.397 "is_configured": true, 00:25:35.397 "data_offset": 256, 00:25:35.397 "data_size": 7936 00:25:35.397 }, 00:25:35.397 { 00:25:35.397 "name": "pt2", 00:25:35.397 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:35.397 "is_configured": true, 00:25:35.397 "data_offset": 256, 00:25:35.397 "data_size": 7936 00:25:35.397 } 00:25:35.397 ] 00:25:35.397 } 00:25:35.397 } 00:25:35.397 }' 00:25:35.397 17:37:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:35.397 17:37:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:25:35.397 pt2' 00:25:35.397 17:37:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:35.397 17:37:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:25:35.397 17:37:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:35.658 17:37:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:35.658 "name": "pt1", 00:25:35.658 "aliases": [ 00:25:35.658 "00000000-0000-0000-0000-000000000001" 00:25:35.658 ], 00:25:35.658 "product_name": "passthru", 00:25:35.658 "block_size": 4128, 00:25:35.658 "num_blocks": 8192, 00:25:35.658 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:35.658 "md_size": 32, 00:25:35.658 "md_interleave": true, 00:25:35.658 "dif_type": 0, 00:25:35.658 "assigned_rate_limits": { 00:25:35.658 "rw_ios_per_sec": 0, 00:25:35.658 "rw_mbytes_per_sec": 0, 00:25:35.658 "r_mbytes_per_sec": 0, 00:25:35.658 "w_mbytes_per_sec": 0 00:25:35.658 }, 00:25:35.658 "claimed": true, 00:25:35.658 "claim_type": "exclusive_write", 00:25:35.658 "zoned": false, 00:25:35.658 "supported_io_types": { 00:25:35.658 "read": true, 00:25:35.658 "write": true, 00:25:35.658 "unmap": true, 00:25:35.658 "flush": true, 00:25:35.658 "reset": true, 00:25:35.658 "nvme_admin": false, 00:25:35.658 "nvme_io": false, 00:25:35.658 "nvme_io_md": false, 00:25:35.658 "write_zeroes": true, 00:25:35.658 "zcopy": true, 00:25:35.658 "get_zone_info": false, 00:25:35.658 "zone_management": false, 00:25:35.658 "zone_append": false, 00:25:35.658 "compare": false, 00:25:35.658 "compare_and_write": false, 00:25:35.658 "abort": true, 00:25:35.658 "seek_hole": false, 00:25:35.658 "seek_data": false, 00:25:35.658 "copy": true, 00:25:35.658 "nvme_iov_md": false 00:25:35.658 }, 00:25:35.658 "memory_domains": [ 00:25:35.658 { 00:25:35.658 "dma_device_id": "system", 00:25:35.658 "dma_device_type": 1 00:25:35.658 }, 00:25:35.658 { 00:25:35.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:35.658 "dma_device_type": 2 00:25:35.658 } 00:25:35.658 ], 00:25:35.658 "driver_specific": { 00:25:35.658 "passthru": { 00:25:35.658 "name": "pt1", 00:25:35.658 "base_bdev_name": "malloc1" 00:25:35.658 } 00:25:35.658 } 00:25:35.658 }' 00:25:35.658 17:37:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:35.658 17:37:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:25:35.658 17:37:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:25:35.919 17:37:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:35.919 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:35.919 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:25:35.919 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:35.919 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:35.919 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:25:35.919 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:35.919 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:35.919 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:25:35.919 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:35.919 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:25:35.919 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:36.180 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:36.180 "name": "pt2", 00:25:36.180 "aliases": [ 00:25:36.180 "00000000-0000-0000-0000-000000000002" 00:25:36.180 ], 00:25:36.180 "product_name": "passthru", 00:25:36.180 "block_size": 4128, 00:25:36.180 "num_blocks": 8192, 00:25:36.180 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:36.180 "md_size": 32, 00:25:36.180 "md_interleave": true, 00:25:36.180 "dif_type": 0, 00:25:36.180 "assigned_rate_limits": { 00:25:36.180 "rw_ios_per_sec": 0, 00:25:36.180 "rw_mbytes_per_sec": 0, 00:25:36.180 "r_mbytes_per_sec": 0, 00:25:36.180 "w_mbytes_per_sec": 0 00:25:36.180 }, 00:25:36.180 "claimed": true, 00:25:36.180 "claim_type": "exclusive_write", 00:25:36.180 "zoned": false, 00:25:36.180 "supported_io_types": { 00:25:36.180 "read": true, 00:25:36.180 "write": true, 00:25:36.180 "unmap": true, 00:25:36.180 "flush": true, 00:25:36.180 "reset": true, 00:25:36.180 "nvme_admin": false, 00:25:36.180 "nvme_io": false, 00:25:36.180 "nvme_io_md": false, 00:25:36.180 "write_zeroes": true, 00:25:36.180 "zcopy": true, 00:25:36.180 "get_zone_info": false, 00:25:36.180 "zone_management": false, 00:25:36.180 "zone_append": false, 00:25:36.180 "compare": false, 00:25:36.180 "compare_and_write": false, 00:25:36.180 "abort": true, 00:25:36.180 "seek_hole": false, 00:25:36.180 "seek_data": false, 00:25:36.180 "copy": true, 00:25:36.180 "nvme_iov_md": false 00:25:36.180 }, 00:25:36.180 "memory_domains": [ 00:25:36.180 { 00:25:36.180 "dma_device_id": "system", 00:25:36.180 "dma_device_type": 1 00:25:36.180 }, 00:25:36.180 { 00:25:36.180 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:36.180 "dma_device_type": 2 00:25:36.180 } 00:25:36.180 ], 00:25:36.180 "driver_specific": { 00:25:36.180 "passthru": { 00:25:36.180 "name": "pt2", 00:25:36.180 "base_bdev_name": "malloc2" 00:25:36.180 } 00:25:36.180 } 00:25:36.180 }' 00:25:36.180 17:37:47 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:36.180 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:36.180 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:25:36.180 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:36.440 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:36.440 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:25:36.440 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:36.440 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:36.440 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:25:36.440 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:36.440 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:36.700 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:25:36.700 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:36.700 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:25:36.700 [2024-07-15 17:37:47.922211] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:36.700 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # '[' 0c47a2d2-631e-4510-8402-d9dc8e281cca '!=' 0c47a2d2-631e-4510-8402-d9dc8e281cca ']' 00:25:36.701 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:25:36.701 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:25:36.701 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:25:36.701 17:37:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:25:36.961 [2024-07-15 17:37:48.114518] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:25:36.961 17:37:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:36.961 17:37:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:36.961 17:37:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:36.961 17:37:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:36.961 17:37:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:36.961 17:37:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:36.961 17:37:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:36.961 17:37:48 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:36.961 17:37:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:36.961 17:37:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:36.961 17:37:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:36.961 17:37:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:37.222 17:37:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:37.222 "name": "raid_bdev1", 00:25:37.222 "uuid": "0c47a2d2-631e-4510-8402-d9dc8e281cca", 00:25:37.222 "strip_size_kb": 0, 00:25:37.222 "state": "online", 00:25:37.222 "raid_level": "raid1", 00:25:37.222 "superblock": true, 00:25:37.222 "num_base_bdevs": 2, 00:25:37.222 "num_base_bdevs_discovered": 1, 00:25:37.222 "num_base_bdevs_operational": 1, 00:25:37.222 "base_bdevs_list": [ 00:25:37.222 { 00:25:37.222 "name": null, 00:25:37.222 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:37.222 "is_configured": false, 00:25:37.222 "data_offset": 256, 00:25:37.222 "data_size": 7936 00:25:37.222 }, 00:25:37.222 { 00:25:37.222 "name": "pt2", 00:25:37.222 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:37.222 "is_configured": true, 00:25:37.222 "data_offset": 256, 00:25:37.222 "data_size": 7936 00:25:37.222 } 00:25:37.222 ] 00:25:37.222 }' 00:25:37.222 17:37:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:37.222 17:37:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:37.793 17:37:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:37.793 [2024-07-15 17:37:49.036834] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:37.793 [2024-07-15 17:37:49.036849] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:37.793 [2024-07-15 17:37:49.036883] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:37.793 [2024-07-15 17:37:49.036915] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:37.793 [2024-07-15 17:37:49.036922] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1073ea0 name raid_bdev1, state offline 00:25:37.793 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:37.793 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:25:38.053 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:25:38.053 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:25:38.053 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:25:38.053 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 
00:25:38.053 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:25:38.313 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:25:38.313 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:25:38.313 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:25:38.313 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:25:38.313 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@518 -- # i=1 00:25:38.313 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:38.573 [2024-07-15 17:37:49.614271] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:38.573 [2024-07-15 17:37:49.614298] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:38.573 [2024-07-15 17:37:49.614311] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xee16d0 00:25:38.573 [2024-07-15 17:37:49.614318] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:38.573 [2024-07-15 17:37:49.615440] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:38.573 [2024-07-15 17:37:49.615459] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:38.573 [2024-07-15 17:37:49.615490] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:25:38.573 [2024-07-15 17:37:49.615509] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:38.573 [2024-07-15 17:37:49.615558] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1074fd0 00:25:38.573 [2024-07-15 17:37:49.615564] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:25:38.573 [2024-07-15 17:37:49.615605] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xee3970 00:25:38.573 [2024-07-15 17:37:49.615660] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1074fd0 00:25:38.573 [2024-07-15 17:37:49.615665] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1074fd0 00:25:38.573 [2024-07-15 17:37:49.615703] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:38.573 pt2 00:25:38.573 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:38.573 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:38.573 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:38.573 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:38.573 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:38.573 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=1 00:25:38.573 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:38.573 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:38.573 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:38.573 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:38.573 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:38.573 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:38.573 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:38.573 "name": "raid_bdev1", 00:25:38.573 "uuid": "0c47a2d2-631e-4510-8402-d9dc8e281cca", 00:25:38.573 "strip_size_kb": 0, 00:25:38.573 "state": "online", 00:25:38.573 "raid_level": "raid1", 00:25:38.573 "superblock": true, 00:25:38.573 "num_base_bdevs": 2, 00:25:38.573 "num_base_bdevs_discovered": 1, 00:25:38.573 "num_base_bdevs_operational": 1, 00:25:38.573 "base_bdevs_list": [ 00:25:38.573 { 00:25:38.573 "name": null, 00:25:38.573 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:38.573 "is_configured": false, 00:25:38.573 "data_offset": 256, 00:25:38.573 "data_size": 7936 00:25:38.573 }, 00:25:38.573 { 00:25:38.573 "name": "pt2", 00:25:38.573 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:38.573 "is_configured": true, 00:25:38.573 "data_offset": 256, 00:25:38.573 "data_size": 7936 00:25:38.573 } 00:25:38.573 ] 00:25:38.573 }' 00:25:38.573 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:38.573 17:37:49 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:39.144 17:37:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:39.405 [2024-07-15 17:37:50.496509] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:39.405 [2024-07-15 17:37:50.496530] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:39.405 [2024-07-15 17:37:50.496571] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:39.405 [2024-07-15 17:37:50.496600] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:39.405 [2024-07-15 17:37:50.496606] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1074fd0 name raid_bdev1, state offline 00:25:39.405 17:37:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:39.405 17:37:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:25:39.666 17:37:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:25:39.666 17:37:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:25:39.666 17:37:50 bdev_raid.raid_superblock_test_md_interleaved 
-- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:25:39.666 17:37:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:25:39.666 [2024-07-15 17:37:50.881462] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:25:39.666 [2024-07-15 17:37:50.881490] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:39.666 [2024-07-15 17:37:50.881499] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1071530 00:25:39.666 [2024-07-15 17:37:50.881505] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:39.666 [2024-07-15 17:37:50.882632] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:39.666 [2024-07-15 17:37:50.882651] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:25:39.666 [2024-07-15 17:37:50.882684] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:25:39.666 [2024-07-15 17:37:50.882701] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:25:39.666 [2024-07-15 17:37:50.882767] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:25:39.666 [2024-07-15 17:37:50.882774] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:39.666 [2024-07-15 17:37:50.882783] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1074c20 name raid_bdev1, state configuring 00:25:39.666 [2024-07-15 17:37:50.882797] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:39.666 [2024-07-15 17:37:50.882836] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1075820 00:25:39.666 [2024-07-15 17:37:50.882842] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:25:39.666 [2024-07-15 17:37:50.882880] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xee2fa0 00:25:39.666 [2024-07-15 17:37:50.882935] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1075820 00:25:39.666 [2024-07-15 17:37:50.882940] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1075820 00:25:39.666 [2024-07-15 17:37:50.882986] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:39.666 pt1 00:25:39.666 17:37:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:25:39.666 17:37:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:39.666 17:37:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:39.666 17:37:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:39.666 17:37:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:39.666 17:37:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:39.666 17:37:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:39.666 17:37:50 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:39.666 17:37:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:39.666 17:37:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:39.667 17:37:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:39.667 17:37:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:39.667 17:37:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:39.926 17:37:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:39.926 "name": "raid_bdev1", 00:25:39.926 "uuid": "0c47a2d2-631e-4510-8402-d9dc8e281cca", 00:25:39.926 "strip_size_kb": 0, 00:25:39.926 "state": "online", 00:25:39.926 "raid_level": "raid1", 00:25:39.926 "superblock": true, 00:25:39.926 "num_base_bdevs": 2, 00:25:39.926 "num_base_bdevs_discovered": 1, 00:25:39.926 "num_base_bdevs_operational": 1, 00:25:39.926 "base_bdevs_list": [ 00:25:39.926 { 00:25:39.926 "name": null, 00:25:39.926 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:39.926 "is_configured": false, 00:25:39.926 "data_offset": 256, 00:25:39.926 "data_size": 7936 00:25:39.926 }, 00:25:39.926 { 00:25:39.926 "name": "pt2", 00:25:39.926 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:39.926 "is_configured": true, 00:25:39.926 "data_offset": 256, 00:25:39.926 "data_size": 7936 00:25:39.926 } 00:25:39.926 ] 00:25:39.926 }' 00:25:39.926 17:37:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:39.926 17:37:51 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:40.497 17:37:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:25:40.497 17:37:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:25:40.757 17:37:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:25:40.757 17:37:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:40.757 17:37:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:25:40.757 [2024-07-15 17:37:52.016519] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:40.757 17:37:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' 0c47a2d2-631e-4510-8402-d9dc8e281cca '!=' 0c47a2d2-631e-4510-8402-d9dc8e281cca ']' 00:25:40.757 17:37:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@562 -- # killprocess 2913244 00:25:40.757 17:37:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 2913244 ']' 00:25:40.757 17:37:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 2913244 00:25:40.757 17:37:52 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:25:40.757 17:37:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:40.757 17:37:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2913244 00:25:41.018 17:37:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:41.018 17:37:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:41.018 17:37:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2913244' 00:25:41.018 killing process with pid 2913244 00:25:41.018 17:37:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@967 -- # kill 2913244 00:25:41.018 [2024-07-15 17:37:52.085086] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:41.018 [2024-07-15 17:37:52.085121] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:41.018 [2024-07-15 17:37:52.085150] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:41.018 [2024-07-15 17:37:52.085156] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1075820 name raid_bdev1, state offline 00:25:41.018 17:37:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@972 -- # wait 2913244 00:25:41.018 [2024-07-15 17:37:52.094506] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:41.018 17:37:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@564 -- # return 0 00:25:41.018 00:25:41.018 real 0m13.066s 00:25:41.018 user 0m24.204s 00:25:41.018 sys 0m2.017s 00:25:41.018 17:37:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:41.018 17:37:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:41.018 ************************************ 00:25:41.018 END TEST raid_superblock_test_md_interleaved 00:25:41.018 ************************************ 00:25:41.018 17:37:52 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:41.018 17:37:52 bdev_raid -- bdev/bdev_raid.sh@914 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:25:41.018 17:37:52 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:41.018 17:37:52 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:41.018 17:37:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:41.018 ************************************ 00:25:41.018 START TEST raid_rebuild_test_sb_md_interleaved 00:25:41.018 ************************************ 00:25:41.018 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false false 00:25:41.018 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:41.018 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:25:41.018 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:25:41.018 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:25:41.018 
17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 -- # local verify=false 00:25:41.018 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:41.018 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:41.018 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:41.018 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:41.018 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:41.018 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:41.018 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:41.018 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:41.018 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:25:41.018 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:41.019 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:41.019 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:41.019 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:41.019 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:41.019 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:41.019 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:41.019 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:41.019 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:25:41.019 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:25:41.019 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # raid_pid=2915717 00:25:41.019 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # waitforlisten 2915717 /var/tmp/spdk-raid.sock 00:25:41.019 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 2915717 ']' 00:25:41.019 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:41.019 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:41.019 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:41.019 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:25:41.019 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:41.019 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:41.019 17:37:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:41.279 [2024-07-15 17:37:52.350393] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:25:41.279 [2024-07-15 17:37:52.350444] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2915717 ] 00:25:41.279 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:41.279 Zero copy mechanism will not be used. 00:25:41.279 [2024-07-15 17:37:52.439816] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:41.279 [2024-07-15 17:37:52.507991] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:41.279 [2024-07-15 17:37:52.557043] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:41.279 [2024-07-15 17:37:52.557071] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:42.219 17:37:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:42.219 17:37:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:25:42.219 17:37:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:42.219 17:37:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:25:42.219 BaseBdev1_malloc 00:25:42.219 17:37:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:42.478 [2024-07-15 17:37:53.559956] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:42.478 [2024-07-15 17:37:53.559989] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:42.478 [2024-07-15 17:37:53.560007] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfc6680 00:25:42.478 [2024-07-15 17:37:53.560013] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:42.478 [2024-07-15 17:37:53.561183] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:42.478 [2024-07-15 17:37:53.561202] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:42.478 BaseBdev1 00:25:42.478 17:37:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:42.478 17:37:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:25:42.478 BaseBdev2_malloc 00:25:42.478 17:37:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:42.738 [2024-07-15 17:37:53.926990] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:42.738 [2024-07-15 17:37:53.927018] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:42.738 [2024-07-15 17:37:53.927030] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1146d30 00:25:42.738 [2024-07-15 17:37:53.927036] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:42.738 [2024-07-15 17:37:53.928166] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:42.738 [2024-07-15 17:37:53.928184] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:42.738 BaseBdev2 00:25:42.738 17:37:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:25:42.998 spare_malloc 00:25:42.998 17:37:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:43.259 spare_delay 00:25:43.259 17:37:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:43.259 [2024-07-15 17:37:54.498591] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:43.259 [2024-07-15 17:37:54.498621] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:43.259 [2024-07-15 17:37:54.498633] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1147600 00:25:43.259 [2024-07-15 17:37:54.498640] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:43.259 [2024-07-15 17:37:54.499706] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:43.259 [2024-07-15 17:37:54.499729] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:43.259 spare 00:25:43.259 17:37:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:25:43.519 [2024-07-15 17:37:54.675056] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:43.519 [2024-07-15 17:37:54.676045] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:43.519 [2024-07-15 17:37:54.676157] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1151e50 00:25:43.519 [2024-07-15 17:37:54.676166] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:25:43.519 [2024-07-15 17:37:54.676210] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfc7110 00:25:43.519 [2024-07-15 17:37:54.676278] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1151e50 00:25:43.519 [2024-07-15 17:37:54.676283] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1151e50 00:25:43.519 [2024-07-15 17:37:54.676325] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:43.519 17:37:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:43.519 17:37:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:43.519 17:37:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:43.519 17:37:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:43.519 17:37:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:43.519 17:37:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:43.519 17:37:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:43.519 17:37:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:43.520 17:37:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:43.520 17:37:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:43.520 17:37:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:43.520 17:37:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:43.780 17:37:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:43.780 "name": "raid_bdev1", 00:25:43.780 "uuid": "df5cb73a-93d5-42c7-b11f-1ff169c24ad1", 00:25:43.780 "strip_size_kb": 0, 00:25:43.780 "state": "online", 00:25:43.780 "raid_level": "raid1", 00:25:43.780 "superblock": true, 00:25:43.780 "num_base_bdevs": 2, 00:25:43.780 "num_base_bdevs_discovered": 2, 00:25:43.780 "num_base_bdevs_operational": 2, 00:25:43.780 "base_bdevs_list": [ 00:25:43.780 { 00:25:43.780 "name": "BaseBdev1", 00:25:43.780 "uuid": "0aa83e6f-0a4d-58e7-bc26-706e49c93fff", 00:25:43.780 "is_configured": true, 00:25:43.780 "data_offset": 256, 00:25:43.780 "data_size": 7936 00:25:43.780 }, 00:25:43.780 { 00:25:43.780 "name": "BaseBdev2", 00:25:43.780 "uuid": "9c1b9523-ab2f-505e-af28-dabc07065404", 00:25:43.780 "is_configured": true, 00:25:43.780 "data_offset": 256, 00:25:43.780 "data_size": 7936 00:25:43.780 } 00:25:43.780 ] 00:25:43.780 }' 00:25:43.780 17:37:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:43.780 17:37:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:44.351 17:37:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:44.351 17:37:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:44.351 [2024-07-15 17:37:55.585534] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:44.351 17:37:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:25:44.351 17:37:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:44.351 17:37:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:44.611 17:37:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:25:44.611 17:37:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:25:44.611 17:37:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # '[' false = true ']' 00:25:44.611 17:37:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:44.871 [2024-07-15 17:37:55.958267] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:44.871 17:37:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:44.871 17:37:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:44.871 17:37:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:44.871 17:37:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:44.871 17:37:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:44.871 17:37:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:44.871 17:37:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:44.871 17:37:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:44.871 17:37:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:44.871 17:37:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:44.871 17:37:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:44.871 17:37:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:44.871 17:37:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:44.871 "name": "raid_bdev1", 00:25:44.871 "uuid": "df5cb73a-93d5-42c7-b11f-1ff169c24ad1", 00:25:44.871 "strip_size_kb": 0, 00:25:44.871 "state": "online", 00:25:44.871 "raid_level": "raid1", 00:25:44.871 "superblock": true, 00:25:44.871 "num_base_bdevs": 2, 00:25:44.871 "num_base_bdevs_discovered": 1, 00:25:44.871 "num_base_bdevs_operational": 1, 00:25:44.871 "base_bdevs_list": [ 00:25:44.872 { 00:25:44.872 "name": null, 00:25:44.872 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:44.872 "is_configured": false, 00:25:44.872 "data_offset": 256, 00:25:44.872 "data_size": 7936 00:25:44.872 }, 00:25:44.872 { 00:25:44.872 "name": "BaseBdev2", 00:25:44.872 "uuid": "9c1b9523-ab2f-505e-af28-dabc07065404", 00:25:44.872 "is_configured": true, 00:25:44.872 "data_offset": 256, 00:25:44.872 "data_size": 7936 00:25:44.872 } 00:25:44.872 ] 00:25:44.872 }' 00:25:44.872 17:37:56 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:44.872 17:37:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:45.463 17:37:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:45.722 [2024-07-15 17:37:56.864570] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:45.722 [2024-07-15 17:37:56.867081] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfbe980 00:25:45.722 [2024-07-15 17:37:56.868602] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:45.722 17:37:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:46.660 17:37:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:46.660 17:37:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:46.660 17:37:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:46.660 17:37:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:46.660 17:37:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:46.660 17:37:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:46.660 17:37:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:46.920 17:37:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:46.920 "name": "raid_bdev1", 00:25:46.920 "uuid": "df5cb73a-93d5-42c7-b11f-1ff169c24ad1", 00:25:46.920 "strip_size_kb": 0, 00:25:46.920 "state": "online", 00:25:46.920 "raid_level": "raid1", 00:25:46.920 "superblock": true, 00:25:46.920 "num_base_bdevs": 2, 00:25:46.920 "num_base_bdevs_discovered": 2, 00:25:46.920 "num_base_bdevs_operational": 2, 00:25:46.920 "process": { 00:25:46.920 "type": "rebuild", 00:25:46.920 "target": "spare", 00:25:46.920 "progress": { 00:25:46.920 "blocks": 2816, 00:25:46.920 "percent": 35 00:25:46.920 } 00:25:46.920 }, 00:25:46.920 "base_bdevs_list": [ 00:25:46.920 { 00:25:46.920 "name": "spare", 00:25:46.920 "uuid": "228d87ec-c7d3-50af-8e06-587d4b426599", 00:25:46.920 "is_configured": true, 00:25:46.920 "data_offset": 256, 00:25:46.920 "data_size": 7936 00:25:46.920 }, 00:25:46.920 { 00:25:46.920 "name": "BaseBdev2", 00:25:46.920 "uuid": "9c1b9523-ab2f-505e-af28-dabc07065404", 00:25:46.920 "is_configured": true, 00:25:46.920 "data_offset": 256, 00:25:46.920 "data_size": 7936 00:25:46.920 } 00:25:46.920 ] 00:25:46.920 }' 00:25:46.920 17:37:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:46.920 17:37:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:46.920 17:37:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:46.920 17:37:58 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:46.920 17:37:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:47.179 [2024-07-15 17:37:58.325316] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:47.179 [2024-07-15 17:37:58.377452] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:47.179 [2024-07-15 17:37:58.377488] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:47.179 [2024-07-15 17:37:58.377498] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:47.179 [2024-07-15 17:37:58.377502] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:47.179 17:37:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:47.179 17:37:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:47.179 17:37:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:47.179 17:37:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:47.179 17:37:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:47.179 17:37:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:47.179 17:37:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:47.179 17:37:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:47.179 17:37:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:47.179 17:37:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:47.180 17:37:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:47.180 17:37:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:47.439 17:37:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:47.439 "name": "raid_bdev1", 00:25:47.439 "uuid": "df5cb73a-93d5-42c7-b11f-1ff169c24ad1", 00:25:47.439 "strip_size_kb": 0, 00:25:47.439 "state": "online", 00:25:47.439 "raid_level": "raid1", 00:25:47.439 "superblock": true, 00:25:47.439 "num_base_bdevs": 2, 00:25:47.439 "num_base_bdevs_discovered": 1, 00:25:47.439 "num_base_bdevs_operational": 1, 00:25:47.439 "base_bdevs_list": [ 00:25:47.439 { 00:25:47.439 "name": null, 00:25:47.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:47.439 "is_configured": false, 00:25:47.439 "data_offset": 256, 00:25:47.439 "data_size": 7936 00:25:47.439 }, 00:25:47.439 { 00:25:47.439 "name": "BaseBdev2", 00:25:47.439 "uuid": "9c1b9523-ab2f-505e-af28-dabc07065404", 00:25:47.439 "is_configured": true, 00:25:47.439 "data_offset": 256, 00:25:47.439 "data_size": 7936 00:25:47.439 } 00:25:47.439 ] 00:25:47.439 }' 00:25:47.439 
17:37:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:47.439 17:37:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:48.007 17:37:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:48.007 17:37:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:48.008 17:37:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:48.008 17:37:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:48.008 17:37:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:48.008 17:37:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:48.008 17:37:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:48.266 17:37:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:48.266 "name": "raid_bdev1", 00:25:48.266 "uuid": "df5cb73a-93d5-42c7-b11f-1ff169c24ad1", 00:25:48.266 "strip_size_kb": 0, 00:25:48.266 "state": "online", 00:25:48.266 "raid_level": "raid1", 00:25:48.266 "superblock": true, 00:25:48.266 "num_base_bdevs": 2, 00:25:48.266 "num_base_bdevs_discovered": 1, 00:25:48.266 "num_base_bdevs_operational": 1, 00:25:48.266 "base_bdevs_list": [ 00:25:48.266 { 00:25:48.266 "name": null, 00:25:48.266 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:48.266 "is_configured": false, 00:25:48.266 "data_offset": 256, 00:25:48.266 "data_size": 7936 00:25:48.266 }, 00:25:48.266 { 00:25:48.266 "name": "BaseBdev2", 00:25:48.266 "uuid": "9c1b9523-ab2f-505e-af28-dabc07065404", 00:25:48.266 "is_configured": true, 00:25:48.266 "data_offset": 256, 00:25:48.266 "data_size": 7936 00:25:48.266 } 00:25:48.266 ] 00:25:48.266 }' 00:25:48.266 17:37:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:48.266 17:37:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:48.266 17:37:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:48.266 17:37:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:48.266 17:37:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:48.833 [2024-07-15 17:37:59.953468] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:48.833 [2024-07-15 17:37:59.955981] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfbe760 00:25:48.833 [2024-07-15 17:37:59.957115] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:48.833 17:37:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:49.770 17:38:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
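The verify_raid_bdev_process checks above reduce to one RPC query over the spdk-raid.sock socket plus two jq filters. A minimal sketch of that pattern (not part of the test suite; variable names here are purely illustrative):

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
# Fetch all RAID bdevs and keep only raid_bdev1, as the helpers above do.
info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
echo "$info" | jq -r '.process.type // "none"'    # prints "rebuild" while a rebuild is in flight
echo "$info" | jq -r '.process.target // "none"'  # prints "spare" when rebuilding onto the spare bdev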
00:25:49.770 17:38:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:49.770 17:38:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:49.770 17:38:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:49.771 17:38:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:49.771 17:38:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:49.771 17:38:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:50.031 17:38:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:50.031 "name": "raid_bdev1", 00:25:50.031 "uuid": "df5cb73a-93d5-42c7-b11f-1ff169c24ad1", 00:25:50.031 "strip_size_kb": 0, 00:25:50.031 "state": "online", 00:25:50.031 "raid_level": "raid1", 00:25:50.031 "superblock": true, 00:25:50.031 "num_base_bdevs": 2, 00:25:50.031 "num_base_bdevs_discovered": 2, 00:25:50.031 "num_base_bdevs_operational": 2, 00:25:50.031 "process": { 00:25:50.031 "type": "rebuild", 00:25:50.031 "target": "spare", 00:25:50.031 "progress": { 00:25:50.031 "blocks": 3072, 00:25:50.031 "percent": 38 00:25:50.031 } 00:25:50.031 }, 00:25:50.031 "base_bdevs_list": [ 00:25:50.031 { 00:25:50.031 "name": "spare", 00:25:50.031 "uuid": "228d87ec-c7d3-50af-8e06-587d4b426599", 00:25:50.031 "is_configured": true, 00:25:50.031 "data_offset": 256, 00:25:50.031 "data_size": 7936 00:25:50.031 }, 00:25:50.031 { 00:25:50.031 "name": "BaseBdev2", 00:25:50.031 "uuid": "9c1b9523-ab2f-505e-af28-dabc07065404", 00:25:50.031 "is_configured": true, 00:25:50.031 "data_offset": 256, 00:25:50.031 "data_size": 7936 00:25:50.031 } 00:25:50.031 ] 00:25:50.031 }' 00:25:50.031 17:38:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:50.031 17:38:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:50.031 17:38:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:50.031 17:38:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:50.031 17:38:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:25:50.031 17:38:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:25:50.031 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:25:50.031 17:38:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:25:50.031 17:38:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:50.031 17:38:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:25:50.031 17:38:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@705 -- # local timeout=975 00:25:50.031 17:38:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:50.031 17:38:01 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:50.031 17:38:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:50.031 17:38:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:50.031 17:38:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:50.031 17:38:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:50.031 17:38:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:50.031 17:38:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:50.292 17:38:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:50.292 "name": "raid_bdev1", 00:25:50.292 "uuid": "df5cb73a-93d5-42c7-b11f-1ff169c24ad1", 00:25:50.292 "strip_size_kb": 0, 00:25:50.292 "state": "online", 00:25:50.292 "raid_level": "raid1", 00:25:50.292 "superblock": true, 00:25:50.292 "num_base_bdevs": 2, 00:25:50.292 "num_base_bdevs_discovered": 2, 00:25:50.292 "num_base_bdevs_operational": 2, 00:25:50.292 "process": { 00:25:50.292 "type": "rebuild", 00:25:50.292 "target": "spare", 00:25:50.292 "progress": { 00:25:50.292 "blocks": 3584, 00:25:50.292 "percent": 45 00:25:50.292 } 00:25:50.292 }, 00:25:50.292 "base_bdevs_list": [ 00:25:50.292 { 00:25:50.292 "name": "spare", 00:25:50.292 "uuid": "228d87ec-c7d3-50af-8e06-587d4b426599", 00:25:50.292 "is_configured": true, 00:25:50.292 "data_offset": 256, 00:25:50.292 "data_size": 7936 00:25:50.292 }, 00:25:50.292 { 00:25:50.292 "name": "BaseBdev2", 00:25:50.292 "uuid": "9c1b9523-ab2f-505e-af28-dabc07065404", 00:25:50.292 "is_configured": true, 00:25:50.292 "data_offset": 256, 00:25:50.292 "data_size": 7936 00:25:50.292 } 00:25:50.292 ] 00:25:50.292 }' 00:25:50.292 17:38:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:50.292 17:38:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:50.292 17:38:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:50.292 17:38:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:50.292 17:38:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:51.706 17:38:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:51.706 17:38:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:51.706 17:38:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:51.706 17:38:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:51.706 17:38:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:51.706 17:38:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:51.706 
17:38:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:51.706 17:38:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:51.706 17:38:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:51.706 "name": "raid_bdev1", 00:25:51.706 "uuid": "df5cb73a-93d5-42c7-b11f-1ff169c24ad1", 00:25:51.706 "strip_size_kb": 0, 00:25:51.706 "state": "online", 00:25:51.706 "raid_level": "raid1", 00:25:51.706 "superblock": true, 00:25:51.706 "num_base_bdevs": 2, 00:25:51.706 "num_base_bdevs_discovered": 2, 00:25:51.706 "num_base_bdevs_operational": 2, 00:25:51.706 "process": { 00:25:51.706 "type": "rebuild", 00:25:51.706 "target": "spare", 00:25:51.706 "progress": { 00:25:51.706 "blocks": 6912, 00:25:51.706 "percent": 87 00:25:51.706 } 00:25:51.706 }, 00:25:51.706 "base_bdevs_list": [ 00:25:51.706 { 00:25:51.706 "name": "spare", 00:25:51.706 "uuid": "228d87ec-c7d3-50af-8e06-587d4b426599", 00:25:51.706 "is_configured": true, 00:25:51.706 "data_offset": 256, 00:25:51.706 "data_size": 7936 00:25:51.706 }, 00:25:51.706 { 00:25:51.706 "name": "BaseBdev2", 00:25:51.706 "uuid": "9c1b9523-ab2f-505e-af28-dabc07065404", 00:25:51.706 "is_configured": true, 00:25:51.706 "data_offset": 256, 00:25:51.706 "data_size": 7936 00:25:51.706 } 00:25:51.706 ] 00:25:51.706 }' 00:25:51.706 17:38:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:51.706 17:38:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:51.707 17:38:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:51.707 17:38:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:51.707 17:38:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:51.967 [2024-07-15 17:38:03.075270] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:51.967 [2024-07-15 17:38:03.075314] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:51.967 [2024-07-15 17:38:03.075377] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:52.909 17:38:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:52.909 17:38:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:52.909 17:38:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:52.909 17:38:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:52.909 17:38:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:52.909 17:38:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:52.909 17:38:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:52.909 17:38:03 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:52.909 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:52.909 "name": "raid_bdev1", 00:25:52.909 "uuid": "df5cb73a-93d5-42c7-b11f-1ff169c24ad1", 00:25:52.909 "strip_size_kb": 0, 00:25:52.909 "state": "online", 00:25:52.909 "raid_level": "raid1", 00:25:52.909 "superblock": true, 00:25:52.909 "num_base_bdevs": 2, 00:25:52.909 "num_base_bdevs_discovered": 2, 00:25:52.909 "num_base_bdevs_operational": 2, 00:25:52.909 "base_bdevs_list": [ 00:25:52.909 { 00:25:52.909 "name": "spare", 00:25:52.909 "uuid": "228d87ec-c7d3-50af-8e06-587d4b426599", 00:25:52.909 "is_configured": true, 00:25:52.909 "data_offset": 256, 00:25:52.909 "data_size": 7936 00:25:52.909 }, 00:25:52.909 { 00:25:52.909 "name": "BaseBdev2", 00:25:52.909 "uuid": "9c1b9523-ab2f-505e-af28-dabc07065404", 00:25:52.909 "is_configured": true, 00:25:52.909 "data_offset": 256, 00:25:52.909 "data_size": 7936 00:25:52.909 } 00:25:52.909 ] 00:25:52.909 }' 00:25:52.909 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:52.909 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:52.909 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:52.909 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:52.909 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # break 00:25:52.909 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:52.909 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:52.909 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:52.909 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:52.909 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:52.909 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:52.909 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:53.170 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:53.170 "name": "raid_bdev1", 00:25:53.170 "uuid": "df5cb73a-93d5-42c7-b11f-1ff169c24ad1", 00:25:53.170 "strip_size_kb": 0, 00:25:53.170 "state": "online", 00:25:53.170 "raid_level": "raid1", 00:25:53.170 "superblock": true, 00:25:53.170 "num_base_bdevs": 2, 00:25:53.170 "num_base_bdevs_discovered": 2, 00:25:53.170 "num_base_bdevs_operational": 2, 00:25:53.170 "base_bdevs_list": [ 00:25:53.170 { 00:25:53.170 "name": "spare", 00:25:53.170 "uuid": "228d87ec-c7d3-50af-8e06-587d4b426599", 00:25:53.170 "is_configured": true, 00:25:53.170 "data_offset": 256, 00:25:53.170 "data_size": 7936 00:25:53.170 }, 00:25:53.170 { 00:25:53.170 "name": "BaseBdev2", 00:25:53.170 "uuid": "9c1b9523-ab2f-505e-af28-dabc07065404", 00:25:53.170 
"is_configured": true, 00:25:53.170 "data_offset": 256, 00:25:53.170 "data_size": 7936 00:25:53.170 } 00:25:53.170 ] 00:25:53.170 }' 00:25:53.170 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:53.170 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:53.170 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:53.170 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:53.170 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:53.170 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:53.170 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:53.170 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:53.170 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:53.170 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:53.170 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:53.170 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:53.170 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:53.170 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:53.170 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:53.170 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:53.432 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:53.432 "name": "raid_bdev1", 00:25:53.432 "uuid": "df5cb73a-93d5-42c7-b11f-1ff169c24ad1", 00:25:53.432 "strip_size_kb": 0, 00:25:53.432 "state": "online", 00:25:53.432 "raid_level": "raid1", 00:25:53.432 "superblock": true, 00:25:53.432 "num_base_bdevs": 2, 00:25:53.432 "num_base_bdevs_discovered": 2, 00:25:53.432 "num_base_bdevs_operational": 2, 00:25:53.432 "base_bdevs_list": [ 00:25:53.432 { 00:25:53.432 "name": "spare", 00:25:53.432 "uuid": "228d87ec-c7d3-50af-8e06-587d4b426599", 00:25:53.432 "is_configured": true, 00:25:53.432 "data_offset": 256, 00:25:53.432 "data_size": 7936 00:25:53.432 }, 00:25:53.432 { 00:25:53.432 "name": "BaseBdev2", 00:25:53.432 "uuid": "9c1b9523-ab2f-505e-af28-dabc07065404", 00:25:53.432 "is_configured": true, 00:25:53.432 "data_offset": 256, 00:25:53.432 "data_size": 7936 00:25:53.432 } 00:25:53.432 ] 00:25:53.432 }' 00:25:53.432 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:53.432 17:38:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:54.002 17:38:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@718 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:54.263 [2024-07-15 17:38:05.341285] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:54.263 [2024-07-15 17:38:05.341303] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:54.263 [2024-07-15 17:38:05.341344] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:54.263 [2024-07-15 17:38:05.341385] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:54.263 [2024-07-15 17:38:05.341392] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1151e50 name raid_bdev1, state offline 00:25:54.263 17:38:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:54.263 17:38:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # jq length 00:25:54.263 17:38:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:54.263 17:38:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # '[' false = true ']' 00:25:54.263 17:38:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:25:54.263 17:38:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:54.833 17:38:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:55.094 [2024-07-15 17:38:06.275594] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:55.094 [2024-07-15 17:38:06.275623] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:55.094 [2024-07-15 17:38:06.275634] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1151bc0 00:25:55.094 [2024-07-15 17:38:06.275640] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:55.094 [2024-07-15 17:38:06.276977] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:55.094 [2024-07-15 17:38:06.276997] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:55.094 [2024-07-15 17:38:06.277041] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:55.094 [2024-07-15 17:38:06.277061] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:55.094 [2024-07-15 17:38:06.277124] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:55.094 spare 00:25:55.094 17:38:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:55.094 17:38:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:55.094 17:38:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:55.094 17:38:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
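The "[: =: unary operator expected" message reported from bdev_raid.sh line 665 above is the usual symptom of an unquoted variable expanding to nothing inside a single-bracket test, turning '[ $var = false ]' into '[ = false ]'. The exact variable at line 665 is not visible in this log, so the sketch below uses a hypothetical stand-in; quoting the expansion (or switching to [[ ]]) avoids the error:

flag=""                        # hypothetical stand-in for the value tested at line 665
if [ "$flag" = false ]; then   # quoted: an empty value still yields a valid binary test
    echo "flag is false"
fi
if [[ $flag = false ]]; then   # [[ ]] does not word-split, so the quotes are optional
    echo "flag is false"
fi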
00:25:55.094 17:38:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:55.094 17:38:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:55.094 17:38:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:55.094 17:38:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:55.094 17:38:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:55.094 17:38:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:55.094 17:38:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:55.094 17:38:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:55.094 [2024-07-15 17:38:06.377409] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11539f0 00:25:55.094 [2024-07-15 17:38:06.377417] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:25:55.094 [2024-07-15 17:38:06.377464] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfbe780 00:25:55.094 [2024-07-15 17:38:06.377529] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11539f0 00:25:55.094 [2024-07-15 17:38:06.377534] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11539f0 00:25:55.094 [2024-07-15 17:38:06.377579] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:55.354 17:38:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:55.354 "name": "raid_bdev1", 00:25:55.354 "uuid": "df5cb73a-93d5-42c7-b11f-1ff169c24ad1", 00:25:55.354 "strip_size_kb": 0, 00:25:55.354 "state": "online", 00:25:55.354 "raid_level": "raid1", 00:25:55.354 "superblock": true, 00:25:55.354 "num_base_bdevs": 2, 00:25:55.354 "num_base_bdevs_discovered": 2, 00:25:55.354 "num_base_bdevs_operational": 2, 00:25:55.354 "base_bdevs_list": [ 00:25:55.354 { 00:25:55.354 "name": "spare", 00:25:55.354 "uuid": "228d87ec-c7d3-50af-8e06-587d4b426599", 00:25:55.354 "is_configured": true, 00:25:55.354 "data_offset": 256, 00:25:55.354 "data_size": 7936 00:25:55.354 }, 00:25:55.354 { 00:25:55.354 "name": "BaseBdev2", 00:25:55.354 "uuid": "9c1b9523-ab2f-505e-af28-dabc07065404", 00:25:55.354 "is_configured": true, 00:25:55.354 "data_offset": 256, 00:25:55.354 "data_size": 7936 00:25:55.354 } 00:25:55.354 ] 00:25:55.354 }' 00:25:55.354 17:38:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:55.354 17:38:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:55.961 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:55.961 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:55.961 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:55.961 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 
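The passthru delete/recreate step above can be summarised as the following RPC sequence, sketched here under the assumption that the spare_delay base bdev still exists and carries the RAID superblock written earlier:

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
"$rpc" -s "$sock" bdev_passthru_delete spare                     # drop the old passthru vbdev
"$rpc" -s "$sock" bdev_passthru_create -b spare_delay -p spare   # recreate it; its superblock gets re-examined
"$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").num_base_bdevs_discovered'   # back to 2 once spare is claimed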
00:25:55.961 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:55.961 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:55.961 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:55.961 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:55.961 "name": "raid_bdev1", 00:25:55.961 "uuid": "df5cb73a-93d5-42c7-b11f-1ff169c24ad1", 00:25:55.961 "strip_size_kb": 0, 00:25:55.961 "state": "online", 00:25:55.961 "raid_level": "raid1", 00:25:55.961 "superblock": true, 00:25:55.961 "num_base_bdevs": 2, 00:25:55.961 "num_base_bdevs_discovered": 2, 00:25:55.961 "num_base_bdevs_operational": 2, 00:25:55.961 "base_bdevs_list": [ 00:25:55.961 { 00:25:55.961 "name": "spare", 00:25:55.961 "uuid": "228d87ec-c7d3-50af-8e06-587d4b426599", 00:25:55.961 "is_configured": true, 00:25:55.961 "data_offset": 256, 00:25:55.961 "data_size": 7936 00:25:55.961 }, 00:25:55.961 { 00:25:55.961 "name": "BaseBdev2", 00:25:55.961 "uuid": "9c1b9523-ab2f-505e-af28-dabc07065404", 00:25:55.961 "is_configured": true, 00:25:55.961 "data_offset": 256, 00:25:55.961 "data_size": 7936 00:25:55.961 } 00:25:55.961 ] 00:25:55.961 }' 00:25:55.961 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:56.229 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:56.229 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:56.229 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:56.229 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:56.229 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:56.229 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:25:56.229 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:56.489 [2024-07-15 17:38:07.667205] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:56.489 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:56.489 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:56.489 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:56.489 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:56.489 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:56.489 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:56.489 17:38:07 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:56.489 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:56.489 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:56.489 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:56.489 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:56.489 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:56.749 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:56.749 "name": "raid_bdev1", 00:25:56.749 "uuid": "df5cb73a-93d5-42c7-b11f-1ff169c24ad1", 00:25:56.749 "strip_size_kb": 0, 00:25:56.749 "state": "online", 00:25:56.749 "raid_level": "raid1", 00:25:56.749 "superblock": true, 00:25:56.749 "num_base_bdevs": 2, 00:25:56.749 "num_base_bdevs_discovered": 1, 00:25:56.749 "num_base_bdevs_operational": 1, 00:25:56.749 "base_bdevs_list": [ 00:25:56.749 { 00:25:56.749 "name": null, 00:25:56.749 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:56.749 "is_configured": false, 00:25:56.749 "data_offset": 256, 00:25:56.749 "data_size": 7936 00:25:56.749 }, 00:25:56.749 { 00:25:56.749 "name": "BaseBdev2", 00:25:56.749 "uuid": "9c1b9523-ab2f-505e-af28-dabc07065404", 00:25:56.749 "is_configured": true, 00:25:56.749 "data_offset": 256, 00:25:56.749 "data_size": 7936 00:25:56.749 } 00:25:56.749 ] 00:25:56.749 }' 00:25:56.749 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:56.749 17:38:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:57.320 17:38:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:57.890 [2024-07-15 17:38:08.902347] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:57.890 [2024-07-15 17:38:08.902453] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:57.890 [2024-07-15 17:38:08.902463] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
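The remove and re-add exercised above come down to the same handful of RPCs, sketched below: after the remove, raid_bdev1 stays online as a degraded raid1 with one of two base bdevs, and the add re-examines the stale superblock on spare (seq_number 4 vs 5) and starts a rebuild onto it:

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
"$rpc" -s "$sock" bdev_raid_remove_base_bdev spare
"$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").num_base_bdevs_discovered'   # expect 1 after removal
"$rpc" -s "$sock" bdev_raid_add_base_bdev raid_bdev1 spare
"$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").process.type // "none"'      # "rebuild" while resyncing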
00:25:57.890 [2024-07-15 17:38:08.902481] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:57.890 [2024-07-15 17:38:08.905064] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfbd640 00:25:57.890 [2024-07-15 17:38:08.906603] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:57.890 17:38:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # sleep 1 00:25:58.829 17:38:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:58.829 17:38:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:58.829 17:38:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:58.829 17:38:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:58.829 17:38:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:58.829 17:38:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:58.829 17:38:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:59.089 17:38:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:59.089 "name": "raid_bdev1", 00:25:59.089 "uuid": "df5cb73a-93d5-42c7-b11f-1ff169c24ad1", 00:25:59.089 "strip_size_kb": 0, 00:25:59.089 "state": "online", 00:25:59.089 "raid_level": "raid1", 00:25:59.089 "superblock": true, 00:25:59.089 "num_base_bdevs": 2, 00:25:59.089 "num_base_bdevs_discovered": 2, 00:25:59.089 "num_base_bdevs_operational": 2, 00:25:59.089 "process": { 00:25:59.089 "type": "rebuild", 00:25:59.089 "target": "spare", 00:25:59.089 "progress": { 00:25:59.089 "blocks": 3072, 00:25:59.089 "percent": 38 00:25:59.089 } 00:25:59.089 }, 00:25:59.089 "base_bdevs_list": [ 00:25:59.089 { 00:25:59.089 "name": "spare", 00:25:59.089 "uuid": "228d87ec-c7d3-50af-8e06-587d4b426599", 00:25:59.089 "is_configured": true, 00:25:59.089 "data_offset": 256, 00:25:59.089 "data_size": 7936 00:25:59.089 }, 00:25:59.089 { 00:25:59.089 "name": "BaseBdev2", 00:25:59.089 "uuid": "9c1b9523-ab2f-505e-af28-dabc07065404", 00:25:59.089 "is_configured": true, 00:25:59.089 "data_offset": 256, 00:25:59.089 "data_size": 7936 00:25:59.089 } 00:25:59.089 ] 00:25:59.089 }' 00:25:59.089 17:38:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:59.089 17:38:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:59.089 17:38:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:59.089 17:38:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:59.089 17:38:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:59.349 [2024-07-15 17:38:10.399507] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:59.349 [2024-07-15 17:38:10.415505] 
bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:59.349 [2024-07-15 17:38:10.415543] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:59.349 [2024-07-15 17:38:10.415552] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:59.349 [2024-07-15 17:38:10.415557] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:59.349 17:38:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:59.349 17:38:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:59.349 17:38:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:59.349 17:38:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:59.349 17:38:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:59.349 17:38:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:59.349 17:38:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:59.349 17:38:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:59.349 17:38:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:59.349 17:38:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:59.349 17:38:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:59.349 17:38:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:59.349 17:38:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:59.349 "name": "raid_bdev1", 00:25:59.349 "uuid": "df5cb73a-93d5-42c7-b11f-1ff169c24ad1", 00:25:59.349 "strip_size_kb": 0, 00:25:59.349 "state": "online", 00:25:59.349 "raid_level": "raid1", 00:25:59.349 "superblock": true, 00:25:59.349 "num_base_bdevs": 2, 00:25:59.349 "num_base_bdevs_discovered": 1, 00:25:59.349 "num_base_bdevs_operational": 1, 00:25:59.349 "base_bdevs_list": [ 00:25:59.349 { 00:25:59.349 "name": null, 00:25:59.349 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:59.349 "is_configured": false, 00:25:59.349 "data_offset": 256, 00:25:59.349 "data_size": 7936 00:25:59.349 }, 00:25:59.349 { 00:25:59.349 "name": "BaseBdev2", 00:25:59.349 "uuid": "9c1b9523-ab2f-505e-af28-dabc07065404", 00:25:59.349 "is_configured": true, 00:25:59.349 "data_offset": 256, 00:25:59.349 "data_size": 7936 00:25:59.349 } 00:25:59.349 ] 00:25:59.349 }' 00:25:59.349 17:38:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:59.349 17:38:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:59.919 17:38:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:00.179 [2024-07-15 
17:38:11.321809] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:00.179 [2024-07-15 17:38:11.321838] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:00.179 [2024-07-15 17:38:11.321852] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfbd440 00:26:00.179 [2024-07-15 17:38:11.321859] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:00.179 [2024-07-15 17:38:11.322000] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:00.179 [2024-07-15 17:38:11.322010] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:00.179 [2024-07-15 17:38:11.322046] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:00.179 [2024-07-15 17:38:11.322053] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:26:00.179 [2024-07-15 17:38:11.322058] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:26:00.179 [2024-07-15 17:38:11.322073] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:00.179 [2024-07-15 17:38:11.324473] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfbe430 00:26:00.179 [2024-07-15 17:38:11.325601] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:00.179 spare 00:26:00.179 17:38:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # sleep 1 00:26:01.121 17:38:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:01.121 17:38:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:01.121 17:38:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:01.121 17:38:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:01.121 17:38:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:01.121 17:38:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:01.121 17:38:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:01.382 17:38:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:01.382 "name": "raid_bdev1", 00:26:01.382 "uuid": "df5cb73a-93d5-42c7-b11f-1ff169c24ad1", 00:26:01.382 "strip_size_kb": 0, 00:26:01.382 "state": "online", 00:26:01.382 "raid_level": "raid1", 00:26:01.382 "superblock": true, 00:26:01.382 "num_base_bdevs": 2, 00:26:01.382 "num_base_bdevs_discovered": 2, 00:26:01.382 "num_base_bdevs_operational": 2, 00:26:01.382 "process": { 00:26:01.382 "type": "rebuild", 00:26:01.382 "target": "spare", 00:26:01.382 "progress": { 00:26:01.382 "blocks": 2816, 00:26:01.382 "percent": 35 00:26:01.382 } 00:26:01.382 }, 00:26:01.382 "base_bdevs_list": [ 00:26:01.382 { 00:26:01.382 "name": "spare", 00:26:01.382 "uuid": "228d87ec-c7d3-50af-8e06-587d4b426599", 00:26:01.382 "is_configured": true, 00:26:01.382 "data_offset": 256, 00:26:01.382 "data_size": 
7936 00:26:01.382 }, 00:26:01.382 { 00:26:01.382 "name": "BaseBdev2", 00:26:01.382 "uuid": "9c1b9523-ab2f-505e-af28-dabc07065404", 00:26:01.382 "is_configured": true, 00:26:01.382 "data_offset": 256, 00:26:01.382 "data_size": 7936 00:26:01.382 } 00:26:01.382 ] 00:26:01.382 }' 00:26:01.382 17:38:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:01.382 17:38:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:01.382 17:38:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:01.382 17:38:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:01.382 17:38:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:01.642 [2024-07-15 17:38:12.806458] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:01.642 [2024-07-15 17:38:12.834490] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:01.642 [2024-07-15 17:38:12.834521] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:01.642 [2024-07-15 17:38:12.834530] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:01.642 [2024-07-15 17:38:12.834535] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:01.642 17:38:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:01.642 17:38:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:01.642 17:38:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:01.642 17:38:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:01.642 17:38:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:01.642 17:38:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:01.643 17:38:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:01.643 17:38:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:01.643 17:38:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:01.643 17:38:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:01.643 17:38:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:01.643 17:38:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:01.903 17:38:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:01.903 "name": "raid_bdev1", 00:26:01.903 "uuid": "df5cb73a-93d5-42c7-b11f-1ff169c24ad1", 00:26:01.903 "strip_size_kb": 0, 00:26:01.903 "state": "online", 00:26:01.903 "raid_level": 
"raid1", 00:26:01.903 "superblock": true, 00:26:01.903 "num_base_bdevs": 2, 00:26:01.903 "num_base_bdevs_discovered": 1, 00:26:01.903 "num_base_bdevs_operational": 1, 00:26:01.903 "base_bdevs_list": [ 00:26:01.903 { 00:26:01.903 "name": null, 00:26:01.903 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:01.903 "is_configured": false, 00:26:01.903 "data_offset": 256, 00:26:01.903 "data_size": 7936 00:26:01.903 }, 00:26:01.903 { 00:26:01.903 "name": "BaseBdev2", 00:26:01.903 "uuid": "9c1b9523-ab2f-505e-af28-dabc07065404", 00:26:01.903 "is_configured": true, 00:26:01.903 "data_offset": 256, 00:26:01.903 "data_size": 7936 00:26:01.903 } 00:26:01.903 ] 00:26:01.903 }' 00:26:01.903 17:38:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:01.903 17:38:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:02.472 17:38:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:02.472 17:38:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:02.472 17:38:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:02.472 17:38:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:02.472 17:38:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:02.472 17:38:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:02.472 17:38:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:02.733 17:38:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:02.733 "name": "raid_bdev1", 00:26:02.733 "uuid": "df5cb73a-93d5-42c7-b11f-1ff169c24ad1", 00:26:02.733 "strip_size_kb": 0, 00:26:02.733 "state": "online", 00:26:02.733 "raid_level": "raid1", 00:26:02.733 "superblock": true, 00:26:02.733 "num_base_bdevs": 2, 00:26:02.733 "num_base_bdevs_discovered": 1, 00:26:02.733 "num_base_bdevs_operational": 1, 00:26:02.733 "base_bdevs_list": [ 00:26:02.733 { 00:26:02.733 "name": null, 00:26:02.733 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:02.733 "is_configured": false, 00:26:02.733 "data_offset": 256, 00:26:02.733 "data_size": 7936 00:26:02.733 }, 00:26:02.733 { 00:26:02.733 "name": "BaseBdev2", 00:26:02.733 "uuid": "9c1b9523-ab2f-505e-af28-dabc07065404", 00:26:02.733 "is_configured": true, 00:26:02.733 "data_offset": 256, 00:26:02.733 "data_size": 7936 00:26:02.733 } 00:26:02.733 ] 00:26:02.733 }' 00:26:02.733 17:38:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:02.733 17:38:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:02.733 17:38:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:02.733 17:38:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:02.733 17:38:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:26:02.993 17:38:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:02.993 [2024-07-15 17:38:14.270149] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:02.993 [2024-07-15 17:38:14.270178] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:02.993 [2024-07-15 17:38:14.270193] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfc1850 00:26:02.993 [2024-07-15 17:38:14.270199] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:02.993 [2024-07-15 17:38:14.270318] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:02.993 [2024-07-15 17:38:14.270327] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:02.993 [2024-07-15 17:38:14.270356] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:26:02.993 [2024-07-15 17:38:14.270362] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:26:02.993 [2024-07-15 17:38:14.270367] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:02.993 BaseBdev1 00:26:02.993 17:38:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # sleep 1 00:26:04.376 17:38:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:04.376 17:38:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:04.376 17:38:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:04.376 17:38:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:04.376 17:38:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:04.376 17:38:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:04.376 17:38:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:04.376 17:38:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:04.376 17:38:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:04.376 17:38:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:04.376 17:38:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:04.376 17:38:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:04.376 17:38:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:04.376 "name": "raid_bdev1", 00:26:04.376 "uuid": "df5cb73a-93d5-42c7-b11f-1ff169c24ad1", 00:26:04.376 "strip_size_kb": 0, 00:26:04.376 "state": "online", 00:26:04.376 "raid_level": "raid1", 00:26:04.376 
"superblock": true, 00:26:04.376 "num_base_bdevs": 2, 00:26:04.376 "num_base_bdevs_discovered": 1, 00:26:04.376 "num_base_bdevs_operational": 1, 00:26:04.376 "base_bdevs_list": [ 00:26:04.376 { 00:26:04.376 "name": null, 00:26:04.376 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:04.376 "is_configured": false, 00:26:04.376 "data_offset": 256, 00:26:04.376 "data_size": 7936 00:26:04.376 }, 00:26:04.376 { 00:26:04.376 "name": "BaseBdev2", 00:26:04.376 "uuid": "9c1b9523-ab2f-505e-af28-dabc07065404", 00:26:04.376 "is_configured": true, 00:26:04.376 "data_offset": 256, 00:26:04.376 "data_size": 7936 00:26:04.376 } 00:26:04.376 ] 00:26:04.376 }' 00:26:04.376 17:38:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:04.376 17:38:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:04.946 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:04.946 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:04.946 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:04.946 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:04.946 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:04.946 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:04.946 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:04.946 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:04.946 "name": "raid_bdev1", 00:26:04.946 "uuid": "df5cb73a-93d5-42c7-b11f-1ff169c24ad1", 00:26:04.946 "strip_size_kb": 0, 00:26:04.946 "state": "online", 00:26:04.946 "raid_level": "raid1", 00:26:04.946 "superblock": true, 00:26:04.946 "num_base_bdevs": 2, 00:26:04.946 "num_base_bdevs_discovered": 1, 00:26:04.946 "num_base_bdevs_operational": 1, 00:26:04.946 "base_bdevs_list": [ 00:26:04.946 { 00:26:04.946 "name": null, 00:26:04.946 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:04.946 "is_configured": false, 00:26:04.946 "data_offset": 256, 00:26:04.946 "data_size": 7936 00:26:04.946 }, 00:26:04.946 { 00:26:04.946 "name": "BaseBdev2", 00:26:04.946 "uuid": "9c1b9523-ab2f-505e-af28-dabc07065404", 00:26:04.946 "is_configured": true, 00:26:04.946 "data_offset": 256, 00:26:04.946 "data_size": 7936 00:26:04.946 } 00:26:04.946 ] 00:26:04.946 }' 00:26:04.946 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:05.207 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:05.207 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:05.207 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:05.207 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:05.207 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:26:05.207 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:05.207 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:05.207 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:05.207 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:05.207 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:05.207 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:05.207 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:05.207 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:05.207 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:05.207 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:05.207 [2024-07-15 17:38:16.483752] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:05.207 [2024-07-15 17:38:16.483835] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:26:05.207 [2024-07-15 17:38:16.483843] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:05.207 request: 00:26:05.207 { 00:26:05.207 "base_bdev": "BaseBdev1", 00:26:05.207 "raid_bdev": "raid_bdev1", 00:26:05.207 "method": "bdev_raid_add_base_bdev", 00:26:05.207 "req_id": 1 00:26:05.207 } 00:26:05.207 Got JSON-RPC error response 00:26:05.207 response: 00:26:05.207 { 00:26:05.207 "code": -22, 00:26:05.207 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:26:05.207 } 00:26:05.207 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:26:05.207 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:26:05.207 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:26:05.207 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:26:05.207 17:38:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # sleep 1 00:26:06.590 17:38:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:06.590 17:38:17 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:06.590 17:38:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:06.590 17:38:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:06.590 17:38:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:06.590 17:38:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:06.590 17:38:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:06.590 17:38:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:06.590 17:38:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:06.590 17:38:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:06.590 17:38:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:06.590 17:38:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:06.590 17:38:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:06.590 "name": "raid_bdev1", 00:26:06.590 "uuid": "df5cb73a-93d5-42c7-b11f-1ff169c24ad1", 00:26:06.590 "strip_size_kb": 0, 00:26:06.590 "state": "online", 00:26:06.590 "raid_level": "raid1", 00:26:06.590 "superblock": true, 00:26:06.590 "num_base_bdevs": 2, 00:26:06.590 "num_base_bdevs_discovered": 1, 00:26:06.590 "num_base_bdevs_operational": 1, 00:26:06.590 "base_bdevs_list": [ 00:26:06.590 { 00:26:06.590 "name": null, 00:26:06.590 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:06.590 "is_configured": false, 00:26:06.590 "data_offset": 256, 00:26:06.590 "data_size": 7936 00:26:06.590 }, 00:26:06.590 { 00:26:06.590 "name": "BaseBdev2", 00:26:06.590 "uuid": "9c1b9523-ab2f-505e-af28-dabc07065404", 00:26:06.590 "is_configured": true, 00:26:06.590 "data_offset": 256, 00:26:06.590 "data_size": 7936 00:26:06.590 } 00:26:06.590 ] 00:26:06.590 }' 00:26:06.590 17:38:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:06.590 17:38:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:07.159 17:38:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:07.159 17:38:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:07.159 17:38:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:07.159 17:38:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:07.159 17:38:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:07.159 17:38:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:07.159 17:38:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:07.159 17:38:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:07.159 "name": "raid_bdev1", 00:26:07.159 "uuid": "df5cb73a-93d5-42c7-b11f-1ff169c24ad1", 00:26:07.159 "strip_size_kb": 0, 00:26:07.159 "state": "online", 00:26:07.159 "raid_level": "raid1", 00:26:07.159 "superblock": true, 00:26:07.159 "num_base_bdevs": 2, 00:26:07.159 "num_base_bdevs_discovered": 1, 00:26:07.159 "num_base_bdevs_operational": 1, 00:26:07.159 "base_bdevs_list": [ 00:26:07.159 { 00:26:07.159 "name": null, 00:26:07.159 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:07.159 "is_configured": false, 00:26:07.159 "data_offset": 256, 00:26:07.159 "data_size": 7936 00:26:07.159 }, 00:26:07.159 { 00:26:07.159 "name": "BaseBdev2", 00:26:07.159 "uuid": "9c1b9523-ab2f-505e-af28-dabc07065404", 00:26:07.159 "is_configured": true, 00:26:07.159 "data_offset": 256, 00:26:07.159 "data_size": 7936 00:26:07.159 } 00:26:07.159 ] 00:26:07.159 }' 00:26:07.159 17:38:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:07.418 17:38:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:07.418 17:38:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:07.418 17:38:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:07.418 17:38:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # killprocess 2915717 00:26:07.418 17:38:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 2915717 ']' 00:26:07.418 17:38:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 2915717 00:26:07.418 17:38:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:26:07.418 17:38:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:07.418 17:38:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2915717 00:26:07.418 17:38:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:07.418 17:38:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:07.418 17:38:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2915717' 00:26:07.418 killing process with pid 2915717 00:26:07.418 17:38:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 2915717 00:26:07.418 Received shutdown signal, test time was about 60.000000 seconds 00:26:07.418 00:26:07.418 Latency(us) 00:26:07.418 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:07.419 =================================================================================================================== 00:26:07.419 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:26:07.419 [2024-07-15 17:38:18.583173] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:07.419 [2024-07-15 17:38:18.583235] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:07.419 [2024-07-15 17:38:18.583265] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:07.419 [2024-07-15 17:38:18.583272] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11539f0 name raid_bdev1, state offline 00:26:07.419 17:38:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 2915717 00:26:07.419 [2024-07-15 17:38:18.598640] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:07.419 17:38:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # return 0 00:26:07.419 00:26:07.419 real 0m26.427s 00:26:07.419 user 0m42.346s 00:26:07.419 sys 0m2.742s 00:26:07.678 17:38:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:07.678 17:38:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:07.678 ************************************ 00:26:07.678 END TEST raid_rebuild_test_sb_md_interleaved 00:26:07.678 ************************************ 00:26:07.678 17:38:18 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:07.678 17:38:18 bdev_raid -- bdev/bdev_raid.sh@916 -- # trap - EXIT 00:26:07.678 17:38:18 bdev_raid -- bdev/bdev_raid.sh@917 -- # cleanup 00:26:07.678 17:38:18 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 2915717 ']' 00:26:07.678 17:38:18 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 2915717 00:26:07.678 17:38:18 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:26:07.678 00:26:07.678 real 16m2.582s 00:26:07.678 user 27m34.137s 00:26:07.678 sys 2m21.487s 00:26:07.678 17:38:18 bdev_raid -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:07.678 17:38:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:07.678 ************************************ 00:26:07.678 END TEST bdev_raid 00:26:07.678 ************************************ 00:26:07.678 17:38:18 -- common/autotest_common.sh@1142 -- # return 0 00:26:07.678 17:38:18 -- spdk/autotest.sh@191 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:26:07.678 17:38:18 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:26:07.678 17:38:18 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:07.678 17:38:18 -- common/autotest_common.sh@10 -- # set +x 00:26:07.678 ************************************ 00:26:07.678 START TEST bdevperf_config 00:26:07.678 ************************************ 00:26:07.678 17:38:18 bdevperf_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:26:07.938 * Looking for test storage... 
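The raid_rebuild_test_sb_md_interleaved run that ends above leans on one verification pattern throughout: pull the raid bdev list over the test RPC socket, select raid_bdev1 with jq, and compare individual fields. A minimal stand-alone sketch of that pattern follows, using the rpc.py path, socket name, field names and jq expressions shown in the trace; the real helpers live in bdev_raid.sh, so this is a reconstruction rather than their literal code.

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    # Grab the JSON descriptor for raid_bdev1 out of the full raid bdev list.
    info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    # Checks corresponding to: verify_raid_bdev_state raid_bdev1 online raid1 0 1
    [[ $(echo "$info" | jq -r '.state') == online ]]
    [[ $(echo "$info" | jq -r '.raid_level') == raid1 ]]
    [[ $(echo "$info" | jq -r '.num_base_bdevs_discovered') == 1 ]]
    # Checks corresponding to: verify_raid_bdev_process raid_bdev1 none none
    [[ $(echo "$info" | jq -r '.process.type // "none"') == none ]]
    [[ $(echo "$info" | jq -r '.process.target // "none"') == none ]]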
00:26:07.938 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:26:07.938 17:38:18 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:26:07.938 17:38:18 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:26:07.938 17:38:18 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:26:07.938 17:38:18 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:26:07.938 17:38:18 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:07.938 17:38:18 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:26:07.938 17:38:18 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:26:07.938 17:38:18 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:26:07.938 17:38:18 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:26:07.938 17:38:18 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:26:07.938 17:38:18 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:07.938 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:07.938 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:07.938 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@19 -- # 
echo 00:26:07.938 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:07.938 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:07.938 17:38:19 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:26:10.478 17:38:21 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-07-15 17:38:19.087558] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:26:10.478 [2024-07-15 17:38:19.087622] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2920717 ] 00:26:10.478 Using job config with 4 jobs 00:26:10.478 [2024-07-15 17:38:19.206364] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:10.478 [2024-07-15 17:38:19.280244] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:10.478 cpumask for '\''job0'\'' is too big 00:26:10.478 cpumask for '\''job1'\'' is too big 00:26:10.478 cpumask for '\''job2'\'' is too big 00:26:10.478 cpumask for '\''job3'\'' is too big 00:26:10.478 Running I/O for 2 seconds... 00:26:10.478 00:26:10.478 Latency(us) 00:26:10.478 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:10.478 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:10.478 Malloc0 : 2.01 27829.36 27.18 0.00 0.00 9194.64 1606.89 14115.45 00:26:10.478 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:10.478 Malloc0 : 2.02 27807.55 27.16 0.00 0.00 9183.55 1606.89 12502.25 00:26:10.478 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:10.478 Malloc0 : 2.02 27785.85 27.13 0.00 0.00 9172.92 1594.29 10989.88 00:26:10.478 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:10.478 Malloc0 : 2.02 27858.82 27.21 0.00 0.00 9131.64 793.99 10132.87 00:26:10.478 =================================================================================================================== 00:26:10.478 Total : 111281.58 108.67 0.00 0.00 9170.64 793.99 14115.45' 00:26:10.478 17:38:21 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-07-15 17:38:19.087558] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:26:10.478 [2024-07-15 17:38:19.087622] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2920717 ] 00:26:10.478 Using job config with 4 jobs 00:26:10.478 [2024-07-15 17:38:19.206364] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:10.478 [2024-07-15 17:38:19.280244] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:10.478 cpumask for '\''job0'\'' is too big 00:26:10.478 cpumask for '\''job1'\'' is too big 00:26:10.478 cpumask for '\''job2'\'' is too big 00:26:10.478 cpumask for '\''job3'\'' is too big 00:26:10.478 Running I/O for 2 seconds... 00:26:10.478 00:26:10.478 Latency(us) 00:26:10.478 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:10.478 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:10.478 Malloc0 : 2.01 27829.36 27.18 0.00 0.00 9194.64 1606.89 14115.45 00:26:10.478 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:10.478 Malloc0 : 2.02 27807.55 27.16 0.00 0.00 9183.55 1606.89 12502.25 00:26:10.478 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:10.478 Malloc0 : 2.02 27785.85 27.13 0.00 0.00 9172.92 1594.29 10989.88 00:26:10.478 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:10.478 Malloc0 : 2.02 27858.82 27.21 0.00 0.00 9131.64 793.99 10132.87 00:26:10.478 =================================================================================================================== 00:26:10.478 Total : 111281.58 108.67 0.00 0.00 9170.64 793.99 14115.45' 00:26:10.478 17:38:21 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 17:38:19.087558] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:26:10.478 [2024-07-15 17:38:19.087622] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2920717 ] 00:26:10.478 Using job config with 4 jobs 00:26:10.478 [2024-07-15 17:38:19.206364] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:10.478 [2024-07-15 17:38:19.280244] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:10.478 cpumask for '\''job0'\'' is too big 00:26:10.478 cpumask for '\''job1'\'' is too big 00:26:10.478 cpumask for '\''job2'\'' is too big 00:26:10.478 cpumask for '\''job3'\'' is too big 00:26:10.478 Running I/O for 2 seconds... 
00:26:10.478 00:26:10.478 Latency(us) 00:26:10.478 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:10.478 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:10.478 Malloc0 : 2.01 27829.36 27.18 0.00 0.00 9194.64 1606.89 14115.45 00:26:10.478 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:10.478 Malloc0 : 2.02 27807.55 27.16 0.00 0.00 9183.55 1606.89 12502.25 00:26:10.478 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:10.478 Malloc0 : 2.02 27785.85 27.13 0.00 0.00 9172.92 1594.29 10989.88 00:26:10.478 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:10.478 Malloc0 : 2.02 27858.82 27.21 0.00 0.00 9131.64 793.99 10132.87 00:26:10.478 =================================================================================================================== 00:26:10.478 Total : 111281.58 108.67 0.00 0.00 9170.64 793.99 14115.45' 00:26:10.478 17:38:21 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:26:10.478 17:38:21 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:26:10.478 17:38:21 bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:26:10.478 17:38:21 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:26:10.478 [2024-07-15 17:38:21.604510] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:26:10.478 [2024-07-15 17:38:21.604562] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2921040 ] 00:26:10.478 [2024-07-15 17:38:21.700287] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:10.478 [2024-07-15 17:38:21.771979] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:10.765 cpumask for 'job0' is too big 00:26:10.765 cpumask for 'job1' is too big 00:26:10.765 cpumask for 'job2' is too big 00:26:10.765 cpumask for 'job3' is too big 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:26:13.312 Running I/O for 2 seconds... 
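The job-count assertion repeated in the trace above (common.sh@32 feeding test_config.sh@23) boils down to a two-step grep over the captured bdevperf output. A stand-alone sketch, reusing the variable name from the trace:

    # get_num_jobs as exercised above: pull N out of "Using job config with N jobs".
    num_jobs=$(echo "$bdevperf_output" | grep -oE 'Using job config with [0-9]+ jobs' | grep -oE '[0-9]+')
    [[ $num_jobs == 4 ]]   # test_config.sh@23 expects 4 jobs from conf.json + test.conf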
00:26:13.312 00:26:13.312 Latency(us) 00:26:13.312 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:13.312 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:13.312 Malloc0 : 2.02 27933.50 27.28 0.00 0.00 9150.76 1625.80 14115.45 00:26:13.312 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:13.312 Malloc0 : 2.02 27911.52 27.26 0.00 0.00 9139.64 1613.19 12451.84 00:26:13.312 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:13.312 Malloc0 : 2.02 27889.73 27.24 0.00 0.00 9129.87 1587.99 10889.06 00:26:13.312 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:13.312 Malloc0 : 2.02 27867.93 27.21 0.00 0.00 9119.09 1594.29 9427.10 00:26:13.312 =================================================================================================================== 00:26:13.312 Total : 111602.68 108.99 0.00 0.00 9134.84 1587.99 14115.45' 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:13.312 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job job1 write Malloc0 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:13.312 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:13.312 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:13.312 17:38:24 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:26:15.853 17:38:26 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-07-15 17:38:24.123662] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:26:15.853 [2024-07-15 17:38:24.123722] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2921448 ] 00:26:15.853 Using job config with 3 jobs 00:26:15.853 [2024-07-15 17:38:24.232129] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:15.853 [2024-07-15 17:38:24.320536] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:15.853 cpumask for '\''job0'\'' is too big 00:26:15.853 cpumask for '\''job1'\'' is too big 00:26:15.853 cpumask for '\''job2'\'' is too big 00:26:15.853 Running I/O for 2 seconds... 00:26:15.853 00:26:15.853 Latency(us) 00:26:15.853 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:15.853 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:15.853 Malloc0 : 2.01 37910.72 37.02 0.00 0.00 6741.15 1575.38 9931.22 00:26:15.853 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:15.853 Malloc0 : 2.01 37880.82 36.99 0.00 0.00 6733.62 1537.58 8368.44 00:26:15.853 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:15.853 Malloc0 : 2.02 37851.05 36.96 0.00 0.00 6726.39 1537.58 7007.31 00:26:15.853 =================================================================================================================== 00:26:15.853 Total : 113642.59 110.98 0.00 0.00 6733.72 1537.58 9931.22' 00:26:15.853 17:38:26 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-07-15 17:38:24.123662] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:26:15.853 [2024-07-15 17:38:24.123722] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2921448 ] 00:26:15.853 Using job config with 3 jobs 00:26:15.853 [2024-07-15 17:38:24.232129] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:15.853 [2024-07-15 17:38:24.320536] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:15.853 cpumask for '\''job0'\'' is too big 00:26:15.853 cpumask for '\''job1'\'' is too big 00:26:15.853 cpumask for '\''job2'\'' is too big 00:26:15.853 Running I/O for 2 seconds... 
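Each bdevperf_config case in this trace follows the same shape: build up test.conf with create_job, run bdevperf against it for two seconds, assert the job count, then clean up. Reconstructed from the xtrace for the write-workload case above; the heredoc bodies that create_job appends to test.conf are not visible in the trace, so only the call sequence is shown.

    create_job job0 write Malloc0                                             # test_config.sh@29
    create_job job1 write Malloc0                                             # test_config.sh@30
    create_job job2 write Malloc0                                             # test_config.sh@31
    bdevperf_output=$("$bdevperf" -t 2 --json "$jsonconf" -j "$testconf")     # test_config.sh@32
    [[ $(get_num_jobs "$bdevperf_output") == 3 ]]                             # test_config.sh@33: three [jobN] sections
    cleanup                                                                   # test_config.sh@35 -> common.sh@36: rm -f test.conf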
00:26:15.853 00:26:15.853 Latency(us) 00:26:15.853 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:15.853 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:15.853 Malloc0 : 2.01 37910.72 37.02 0.00 0.00 6741.15 1575.38 9931.22 00:26:15.853 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:15.853 Malloc0 : 2.01 37880.82 36.99 0.00 0.00 6733.62 1537.58 8368.44 00:26:15.853 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:15.853 Malloc0 : 2.02 37851.05 36.96 0.00 0.00 6726.39 1537.58 7007.31 00:26:15.853 =================================================================================================================== 00:26:15.853 Total : 113642.59 110.98 0.00 0.00 6733.72 1537.58 9931.22' 00:26:15.853 17:38:26 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 17:38:24.123662] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:26:15.853 [2024-07-15 17:38:24.123722] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2921448 ] 00:26:15.853 Using job config with 3 jobs 00:26:15.853 [2024-07-15 17:38:24.232129] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:15.853 [2024-07-15 17:38:24.320536] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:15.853 cpumask for '\''job0'\'' is too big 00:26:15.853 cpumask for '\''job1'\'' is too big 00:26:15.853 cpumask for '\''job2'\'' is too big 00:26:15.853 Running I/O for 2 seconds... 00:26:15.853 00:26:15.853 Latency(us) 00:26:15.853 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:15.853 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:15.853 Malloc0 : 2.01 37910.72 37.02 0.00 0.00 6741.15 1575.38 9931.22 00:26:15.853 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:15.853 Malloc0 : 2.01 37880.82 36.99 0.00 0.00 6733.62 1537.58 8368.44 00:26:15.853 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:15.853 Malloc0 : 2.02 37851.05 36.96 0.00 0.00 6726.39 1537.58 7007.31 00:26:15.853 =================================================================================================================== 00:26:15.853 Total : 113642.59 110.98 0.00 0.00 6733.72 1537.58 9931.22' 00:26:15.853 17:38:26 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:26:15.853 17:38:26 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:26:15.853 17:38:26 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:26:15.853 17:38:26 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:26:15.853 17:38:26 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 
00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:15.854 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:15.854 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:15.854 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:15.854 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:15.854 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:15.854 17:38:26 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:26:18.398 17:38:29 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-07-15 17:38:26.693381] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:26:18.398 [2024-07-15 17:38:26.693441] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2921965 ] 00:26:18.398 Using job config with 4 jobs 00:26:18.398 [2024-07-15 17:38:26.793638] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:18.398 [2024-07-15 17:38:26.864703] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:18.398 cpumask for '\''job0'\'' is too big 00:26:18.398 cpumask for '\''job1'\'' is too big 00:26:18.398 cpumask for '\''job2'\'' is too big 00:26:18.398 cpumask for '\''job3'\'' is too big 00:26:18.398 Running I/O for 2 seconds... 00:26:18.398 00:26:18.398 Latency(us) 00:26:18.398 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:18.399 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:18.399 Malloc0 : 2.03 13877.46 13.55 0.00 0.00 18435.30 3327.21 28432.54 00:26:18.399 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:18.399 Malloc1 : 2.03 13865.69 13.54 0.00 0.00 18438.69 3982.57 28432.54 00:26:18.399 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:18.399 Malloc0 : 2.03 13853.71 13.53 0.00 0.00 18397.99 3251.59 25206.15 00:26:18.399 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:18.399 Malloc1 : 2.03 13842.02 13.52 0.00 0.00 18397.68 3932.16 25206.15 00:26:18.399 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:18.399 Malloc0 : 2.04 13830.58 13.51 0.00 0.00 18354.99 3276.80 21878.94 00:26:18.399 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:18.399 Malloc1 : 2.04 13818.82 13.49 0.00 0.00 18354.84 3982.57 21878.94 00:26:18.399 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:18.399 Malloc0 : 2.04 13901.23 13.58 0.00 0.00 18190.56 3112.96 18753.38 00:26:18.399 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:18.399 Malloc1 : 2.05 13889.55 13.56 0.00 0.00 18189.77 2520.62 18753.38 00:26:18.399 =================================================================================================================== 00:26:18.399 Total : 110879.07 108.28 0.00 0.00 18344.63 2520.62 28432.54' 00:26:18.399 17:38:29 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-07-15 17:38:26.693381] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:26:18.399 [2024-07-15 17:38:26.693441] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2921965 ] 00:26:18.399 Using job config with 4 jobs 00:26:18.399 [2024-07-15 17:38:26.793638] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:18.399 [2024-07-15 17:38:26.864703] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:18.399 cpumask for '\''job0'\'' is too big 00:26:18.399 cpumask for '\''job1'\'' is too big 00:26:18.399 cpumask for '\''job2'\'' is too big 00:26:18.399 cpumask for '\''job3'\'' is too big 00:26:18.399 Running I/O for 2 seconds... 
00:26:18.399 00:26:18.399 Latency(us) 00:26:18.399 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:18.399 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:18.399 Malloc0 : 2.03 13877.46 13.55 0.00 0.00 18435.30 3327.21 28432.54 00:26:18.399 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:18.399 Malloc1 : 2.03 13865.69 13.54 0.00 0.00 18438.69 3982.57 28432.54 00:26:18.399 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:18.399 Malloc0 : 2.03 13853.71 13.53 0.00 0.00 18397.99 3251.59 25206.15 00:26:18.399 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:18.399 Malloc1 : 2.03 13842.02 13.52 0.00 0.00 18397.68 3932.16 25206.15 00:26:18.399 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:18.399 Malloc0 : 2.04 13830.58 13.51 0.00 0.00 18354.99 3276.80 21878.94 00:26:18.399 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:18.399 Malloc1 : 2.04 13818.82 13.49 0.00 0.00 18354.84 3982.57 21878.94 00:26:18.399 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:18.399 Malloc0 : 2.04 13901.23 13.58 0.00 0.00 18190.56 3112.96 18753.38 00:26:18.399 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:18.399 Malloc1 : 2.05 13889.55 13.56 0.00 0.00 18189.77 2520.62 18753.38 00:26:18.399 =================================================================================================================== 00:26:18.399 Total : 110879.07 108.28 0.00 0.00 18344.63 2520.62 28432.54' 00:26:18.399 17:38:29 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 17:38:26.693381] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:26:18.399 [2024-07-15 17:38:26.693441] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2921965 ] 00:26:18.399 Using job config with 4 jobs 00:26:18.399 [2024-07-15 17:38:26.793638] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:18.399 [2024-07-15 17:38:26.864703] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:18.399 cpumask for '\''job0'\'' is too big 00:26:18.399 cpumask for '\''job1'\'' is too big 00:26:18.399 cpumask for '\''job2'\'' is too big 00:26:18.399 cpumask for '\''job3'\'' is too big 00:26:18.399 Running I/O for 2 seconds... 
00:26:18.399 00:26:18.399 Latency(us) 00:26:18.399 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:18.399 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:18.399 Malloc0 : 2.03 13877.46 13.55 0.00 0.00 18435.30 3327.21 28432.54 00:26:18.399 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:18.399 Malloc1 : 2.03 13865.69 13.54 0.00 0.00 18438.69 3982.57 28432.54 00:26:18.399 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:18.399 Malloc0 : 2.03 13853.71 13.53 0.00 0.00 18397.99 3251.59 25206.15 00:26:18.399 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:18.399 Malloc1 : 2.03 13842.02 13.52 0.00 0.00 18397.68 3932.16 25206.15 00:26:18.399 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:18.399 Malloc0 : 2.04 13830.58 13.51 0.00 0.00 18354.99 3276.80 21878.94 00:26:18.399 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:18.399 Malloc1 : 2.04 13818.82 13.49 0.00 0.00 18354.84 3982.57 21878.94 00:26:18.399 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:18.399 Malloc0 : 2.04 13901.23 13.58 0.00 0.00 18190.56 3112.96 18753.38 00:26:18.399 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:18.399 Malloc1 : 2.05 13889.55 13.56 0.00 0.00 18189.77 2520.62 18753.38 00:26:18.399 =================================================================================================================== 00:26:18.399 Total : 110879.07 108.28 0.00 0.00 18344.63 2520.62 28432.54' 00:26:18.399 17:38:29 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:26:18.399 17:38:29 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:26:18.399 17:38:29 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:26:18.399 17:38:29 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:26:18.399 17:38:29 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:26:18.399 17:38:29 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:26:18.399 00:26:18.399 real 0m10.278s 00:26:18.399 user 0m9.309s 00:26:18.399 sys 0m0.835s 00:26:18.399 17:38:29 bdevperf_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:18.399 17:38:29 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:26:18.399 ************************************ 00:26:18.399 END TEST bdevperf_config 00:26:18.399 ************************************ 00:26:18.399 17:38:29 -- common/autotest_common.sh@1142 -- # return 0 00:26:18.399 17:38:29 -- spdk/autotest.sh@192 -- # uname -s 00:26:18.399 17:38:29 -- spdk/autotest.sh@192 -- # [[ Linux == Linux ]] 00:26:18.399 17:38:29 -- spdk/autotest.sh@193 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:26:18.399 17:38:29 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:26:18.399 17:38:29 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:18.399 17:38:29 -- common/autotest_common.sh@10 -- # set +x 00:26:18.399 ************************************ 00:26:18.399 START TEST reactor_set_interrupt 00:26:18.399 ************************************ 00:26:18.399 17:38:29 
reactor_set_interrupt -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:26:18.399 * Looking for test storage... 00:26:18.399 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:18.399 17:38:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:26:18.399 17:38:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:26:18.399 17:38:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:18.399 17:38:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:18.399 17:38:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 00:26:18.399 17:38:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:18.399 17:38:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:26:18.399 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:26:18.399 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:26:18.399 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:26:18.399 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:26:18.399 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:26:18.399 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:26:18.399 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:26:18.399 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:26:18.399 17:38:29 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:26:18.399 17:38:29 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:26:18.399 17:38:29 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:26:18.399 17:38:29 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:26:18.399 17:38:29 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:26:18.399 17:38:29 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:26:18.399 17:38:29 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 
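The reactor_set_interrupt preamble above (interrupt_common.sh@5-7) is the standard SPDK test bootstrap; with the xtrace noise stripped it reduces to the following, where $0 stands for reactor_set_interrupt.sh and autotest_common.sh in turn pulls in build_config.sh (the CONFIG_* listing that continues below) and applications.sh.

    testdir=$(readlink -f "$(dirname "$0")")           # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt
    rootdir=$(readlink -f "$testdir/../..")            # /var/jenkins/workspace/crypto-phy-autotest/spdk
    source "$rootdir/test/common/autotest_common.sh"   # sources build_config.sh and applications.sh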
00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:26:18.400 17:38:29 
reactor_set_interrupt -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:26:18.400 17:38:29 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:26:18.400 17:38:29 
reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:26:18.400 17:38:29 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:26:18.400 17:38:29 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:26:18.400 17:38:29 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:26:18.400 17:38:29 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:18.400 17:38:29 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:26:18.400 17:38:29 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:26:18.400 17:38:29 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:26:18.400 17:38:29 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:26:18.400 17:38:29 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:26:18.400 17:38:29 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:26:18.400 17:38:29 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:26:18.400 17:38:29 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:26:18.400 17:38:29 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:26:18.400 17:38:29 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:26:18.400 17:38:29 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:26:18.400 #define SPDK_CONFIG_H 00:26:18.400 #define SPDK_CONFIG_APPS 1 00:26:18.400 #define SPDK_CONFIG_ARCH native 00:26:18.400 #undef SPDK_CONFIG_ASAN 00:26:18.400 #undef SPDK_CONFIG_AVAHI 00:26:18.400 #undef SPDK_CONFIG_CET 00:26:18.400 #define SPDK_CONFIG_COVERAGE 1 00:26:18.400 #define SPDK_CONFIG_CROSS_PREFIX 00:26:18.400 #define SPDK_CONFIG_CRYPTO 1 00:26:18.400 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:26:18.400 #undef SPDK_CONFIG_CUSTOMOCF 00:26:18.400 #undef SPDK_CONFIG_DAOS 00:26:18.400 #define SPDK_CONFIG_DAOS_DIR 00:26:18.400 #define SPDK_CONFIG_DEBUG 1 00:26:18.400 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:26:18.400 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:26:18.400 #define SPDK_CONFIG_DPDK_INC_DIR 00:26:18.400 #define SPDK_CONFIG_DPDK_LIB_DIR 00:26:18.400 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:26:18.400 #undef SPDK_CONFIG_DPDK_UADK 00:26:18.400 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:26:18.400 #define SPDK_CONFIG_EXAMPLES 1 00:26:18.400 #undef SPDK_CONFIG_FC 00:26:18.400 #define SPDK_CONFIG_FC_PATH 00:26:18.400 #define SPDK_CONFIG_FIO_PLUGIN 1 00:26:18.400 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:26:18.400 #undef SPDK_CONFIG_FUSE 00:26:18.400 #undef SPDK_CONFIG_FUZZER 00:26:18.400 #define SPDK_CONFIG_FUZZER_LIB 00:26:18.400 #undef SPDK_CONFIG_GOLANG 00:26:18.400 #define 
SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:26:18.400 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:26:18.400 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:26:18.400 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:26:18.400 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:26:18.400 #undef SPDK_CONFIG_HAVE_LIBBSD 00:26:18.400 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:26:18.400 #define SPDK_CONFIG_IDXD 1 00:26:18.400 #define SPDK_CONFIG_IDXD_KERNEL 1 00:26:18.400 #define SPDK_CONFIG_IPSEC_MB 1 00:26:18.400 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:26:18.400 #define SPDK_CONFIG_ISAL 1 00:26:18.400 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:26:18.400 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:26:18.400 #define SPDK_CONFIG_LIBDIR 00:26:18.400 #undef SPDK_CONFIG_LTO 00:26:18.400 #define SPDK_CONFIG_MAX_LCORES 128 00:26:18.400 #define SPDK_CONFIG_NVME_CUSE 1 00:26:18.400 #undef SPDK_CONFIG_OCF 00:26:18.400 #define SPDK_CONFIG_OCF_PATH 00:26:18.400 #define SPDK_CONFIG_OPENSSL_PATH 00:26:18.400 #undef SPDK_CONFIG_PGO_CAPTURE 00:26:18.400 #define SPDK_CONFIG_PGO_DIR 00:26:18.400 #undef SPDK_CONFIG_PGO_USE 00:26:18.400 #define SPDK_CONFIG_PREFIX /usr/local 00:26:18.400 #undef SPDK_CONFIG_RAID5F 00:26:18.400 #undef SPDK_CONFIG_RBD 00:26:18.400 #define SPDK_CONFIG_RDMA 1 00:26:18.400 #define SPDK_CONFIG_RDMA_PROV verbs 00:26:18.400 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:26:18.400 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:26:18.400 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:26:18.400 #define SPDK_CONFIG_SHARED 1 00:26:18.400 #undef SPDK_CONFIG_SMA 00:26:18.400 #define SPDK_CONFIG_TESTS 1 00:26:18.400 #undef SPDK_CONFIG_TSAN 00:26:18.401 #define SPDK_CONFIG_UBLK 1 00:26:18.401 #define SPDK_CONFIG_UBSAN 1 00:26:18.401 #undef SPDK_CONFIG_UNIT_TESTS 00:26:18.401 #undef SPDK_CONFIG_URING 00:26:18.401 #define SPDK_CONFIG_URING_PATH 00:26:18.401 #undef SPDK_CONFIG_URING_ZNS 00:26:18.401 #undef SPDK_CONFIG_USDT 00:26:18.401 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:26:18.401 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:26:18.401 #undef SPDK_CONFIG_VFIO_USER 00:26:18.401 #define SPDK_CONFIG_VFIO_USER_DIR 00:26:18.401 #define SPDK_CONFIG_VHOST 1 00:26:18.401 #define SPDK_CONFIG_VIRTIO 1 00:26:18.401 #undef SPDK_CONFIG_VTUNE 00:26:18.401 #define SPDK_CONFIG_VTUNE_DIR 00:26:18.401 #define SPDK_CONFIG_WERROR 1 00:26:18.401 #define SPDK_CONFIG_WPDK_DIR 00:26:18.401 #undef SPDK_CONFIG_XNVME 00:26:18.401 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:26:18.401 17:38:29 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:26:18.401 17:38:29 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:26:18.401 17:38:29 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:26:18.401 17:38:29 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:26:18.401 17:38:29 reactor_set_interrupt -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:18.401 17:38:29 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:18.401 17:38:29 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:18.401 17:38:29 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:26:18.401 17:38:29 reactor_set_interrupt -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:26:18.401 17:38:29 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:26:18.401 17:38:29 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:26:18.401 17:38:29 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:26:18.401 17:38:29 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:26:18.401 17:38:29 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:18.401 17:38:29 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:26:18.401 17:38:29 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:26:18.401 17:38:29 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:26:18.401 17:38:29 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:26:18.401 17:38:29 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:26:18.401 17:38:29 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:26:18.401 17:38:29 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:26:18.401 17:38:29 reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:26:18.401 
17:38:29 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:26:18.401 17:38:29 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:26:18.401 17:38:29 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:26:18.401 17:38:29 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:26:18.401 17:38:29 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:26:18.401 17:38:29 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:26:18.401 17:38:29 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:26:18.401 17:38:29 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:26:18.401 17:38:29 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:26:18.401 17:38:29 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:26:18.401 17:38:29 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:26:18.401 17:38:29 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:26:18.401 17:38:29 reactor_set_interrupt -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- 
common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:26:18.401 
17:38:29 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:26:18.401 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:26:18.402 17:38:29 
reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 0 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@167 -- # : 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 0 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@178 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@200 -- # cat 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@240 -- # 
DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@263 -- # export valgrind= 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@263 -- # valgrind= 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@269 -- # uname -s 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKE=make 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j128 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:26:18.402 
17:38:29 reactor_set_interrupt -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@299 -- # TEST_MODE= 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@318 -- # [[ -z 2922326 ]] 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@318 -- # kill -0 2922326 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@331 -- # local mount target_dir 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:26:18.402 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.SNCR78 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.SNCR78/tests/interrupt /tmp/spdk.SNCR78 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@327 -- # df -T 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=954712064 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@363 -- # 
uses["$mount"]=4329717760 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=123506892800 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=129376292864 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=5869400064 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=64683433984 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=64688144384 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=25865388032 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=25875259392 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=9871360 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=efivarfs 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=efivarfs 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=339968 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=507904 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=163840 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=64687472640 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=64688148480 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=675840 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:26:18.403 
17:38:29 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=12937621504 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=12937625600 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:26:18.403 * Looking for test storage... 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@368 -- # local target_space new_size 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@372 -- # mount=/ 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@374 -- # target_space=123506892800 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@381 -- # new_size=8083992576 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:18.403 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@389 -- # return 0 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # set -o errtrace 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@1687 -- # true 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@1689 -- # 
xtrace_fd 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:26:18.403 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:26:18.403 17:38:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:26:18.403 17:38:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:18.403 17:38:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:26:18.403 17:38:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:26:18.403 17:38:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:26:18.403 17:38:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:26:18.403 17:38:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:26:18.404 17:38:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:26:18.404 17:38:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:26:18.404 17:38:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:26:18.404 17:38:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:18.404 17:38:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:26:18.404 17:38:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=2922395 00:26:18.404 17:38:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:18.404 17:38:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 2922395 /var/tmp/spdk.sock 00:26:18.404 17:38:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:26:18.404 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@829 
-- # '[' -z 2922395 ']' 00:26:18.404 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:18.404 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:18.404 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:18.404 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:18.404 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:18.404 17:38:29 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:26:18.404 [2024-07-15 17:38:29.579861] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:26:18.404 [2024-07-15 17:38:29.579929] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2922395 ] 00:26:18.404 [2024-07-15 17:38:29.669080] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:26:18.664 [2024-07-15 17:38:29.738207] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:26:18.664 [2024-07-15 17:38:29.738347] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:18.664 [2024-07-15 17:38:29.738347] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:26:18.664 [2024-07-15 17:38:29.788210] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:26:19.233 17:38:30 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:19.233 17:38:30 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:26:19.233 17:38:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:26:19.233 17:38:30 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:19.493 Malloc0 00:26:19.493 Malloc1 00:26:19.493 Malloc2 00:26:19.493 17:38:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:26:19.493 17:38:30 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:26:19.493 17:38:30 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:26:19.493 17:38:30 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:26:19.493 5000+0 records in 00:26:19.493 5000+0 records out 00:26:19.493 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0171334 s, 598 MB/s 00:26:19.493 17:38:30 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:26:19.754 AIO0 00:26:19.754 17:38:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 2922395 00:26:19.754 17:38:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 2922395 without_thd 00:26:19.754 17:38:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=2922395 00:26:19.754 17:38:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local 
without_thd=without_thd 00:26:19.754 17:38:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:26:19.754 17:38:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:26:19.754 17:38:30 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:26:19.754 17:38:30 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:26:19.754 17:38:30 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:26:19.754 17:38:30 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:26:19.754 17:38:30 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:26:19.754 17:38:30 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:26:20.014 17:38:31 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:26:20.014 17:38:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:26:20.014 17:38:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:26:20.014 17:38:31 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:26:20.014 17:38:31 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:26:20.014 17:38:31 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:26:20.014 17:38:31 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:26:20.014 17:38:31 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:26:20.014 17:38:31 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:26:20.274 17:38:31 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:26:20.274 17:38:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:26:20.274 17:38:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:26:20.274 spdk_thread ids are 1 on reactor0. 
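Note: the trace above collects the spdk_thread ids sitting on reactors 0 and 2 by calling thread_get_stats over the RPC socket and filtering on each thread's cpumask with jq. Below is a minimal sketch of that lookup, assuming the same rpc.py and the jq expression shown verbatim in the trace; the function name reactor_thread_ids is illustrative (the helper in the log is interrupt/common.sh's reactor_get_thread_ids), and the hex-to-decimal mask conversion is only visible in the trace by its result (0x1 -> 1, 0x4 -> 4), so the arithmetic expansion here is an assumption.

#!/usr/bin/env bash
rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk   # path as it appears in the trace

# Print the ids of spdk_threads whose cpumask matches the given reactor mask.
reactor_thread_ids() {
    local reactor_cpumask=$1
    reactor_cpumask=$((reactor_cpumask))   # assumed conversion: trace shows 0x1 becoming "1"
    "$rootdir/scripts/rpc.py" thread_get_stats |
        jq --arg reactor_cpumask "$reactor_cpumask" \
            '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
}

# Usage mirroring the trace: thd0_ids picks up the app_thread on reactor 0,
# while the mask-0x4 query returns nothing, hence "spdk_thread ids are 1 on reactor0."
thd0_ids=($(reactor_thread_ids 0x1))
thd2_ids=($(reactor_thread_ids 0x4))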
00:26:20.274 17:38:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:26:20.274 17:38:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2922395 0 00:26:20.274 17:38:31 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2922395 0 idle 00:26:20.274 17:38:31 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2922395 00:26:20.274 17:38:31 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:26:20.274 17:38:31 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:20.274 17:38:31 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:20.274 17:38:31 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:20.274 17:38:31 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:20.274 17:38:31 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:20.274 17:38:31 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:20.274 17:38:31 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2922395 -w 256 00:26:20.274 17:38:31 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:26:20.274 17:38:31 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2922395 root 20 0 128.2g 35840 22528 S 0.0 0.0 0:00.30 reactor_0' 00:26:20.274 17:38:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2922395 root 20 0 128.2g 35840 22528 S 0.0 0.0 0:00.30 reactor_0 00:26:20.274 17:38:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:20.274 17:38:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:20.274 17:38:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:20.274 17:38:31 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:20.275 17:38:31 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:20.275 17:38:31 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:20.275 17:38:31 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:20.275 17:38:31 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:20.275 17:38:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:26:20.275 17:38:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2922395 1 00:26:20.275 17:38:31 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2922395 1 idle 00:26:20.275 17:38:31 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2922395 00:26:20.275 17:38:31 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:26:20.275 17:38:31 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:20.275 17:38:31 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:20.275 17:38:31 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:20.275 17:38:31 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:20.275 17:38:31 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:20.275 17:38:31 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:20.275 17:38:31 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2922395 -w 256 00:26:20.275 17:38:31 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 
00:26:20.534 17:38:31 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2922431 root 20 0 128.2g 35840 22528 S 0.0 0.0 0:00.00 reactor_1' 00:26:20.534 17:38:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2922431 root 20 0 128.2g 35840 22528 S 0.0 0.0 0:00.00 reactor_1 00:26:20.534 17:38:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:20.534 17:38:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:20.534 17:38:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:20.534 17:38:31 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:20.534 17:38:31 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:20.534 17:38:31 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:20.534 17:38:31 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:20.534 17:38:31 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:20.534 17:38:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:26:20.534 17:38:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2922395 2 00:26:20.534 17:38:31 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2922395 2 idle 00:26:20.534 17:38:31 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2922395 00:26:20.534 17:38:31 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:26:20.534 17:38:31 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:20.534 17:38:31 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:20.534 17:38:31 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:20.534 17:38:31 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:20.534 17:38:31 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:20.534 17:38:31 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:20.534 17:38:31 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2922395 -w 256 00:26:20.534 17:38:31 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:26:20.793 17:38:31 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2922432 root 20 0 128.2g 35840 22528 S 0.0 0.0 0:00.00 reactor_2' 00:26:20.793 17:38:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2922432 root 20 0 128.2g 35840 22528 S 0.0 0.0 0:00.00 reactor_2 00:26:20.793 17:38:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:20.793 17:38:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:20.793 17:38:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:20.793 17:38:31 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:20.793 17:38:31 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:20.793 17:38:31 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:20.793 17:38:31 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:20.793 17:38:31 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:20.793 17:38:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:26:20.793 17:38:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 
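The '[' without_thdx '!=' x ']' test above is the usual non-empty-string guard; because $without_thd is set in this pass, the entries that follow first park the thread ids collected for reactor 0 onto reactor 1 before the reactor's mode is toggled. A minimal sketch of that step, reusing rpc.py and thd0_ids from above:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  if [ "${without_thd}x" != x ]; then             # only in the without-threads pass
      for tid in "${thd0_ids[@]}"; do
          "$rpc" thread_set_cpumask -i "$tid" -m 0x2   # move it to reactor 1 (mask 0x2)
      done
  fi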
00:26:20.793 17:38:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:26:20.793 [2024-07-15 17:38:32.043612] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:26:20.793 17:38:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:26:21.053 [2024-07-15 17:38:32.254930] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:26:21.053 [2024-07-15 17:38:32.255398] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:21.053 17:38:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:26:21.312 [2024-07-15 17:38:32.470800] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:26:21.312 [2024-07-15 17:38:32.471150] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:21.312 17:38:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:26:21.312 17:38:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2922395 0 00:26:21.312 17:38:32 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2922395 0 busy 00:26:21.312 17:38:32 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2922395 00:26:21.312 17:38:32 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:26:21.312 17:38:32 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:26:21.312 17:38:32 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:26:21.312 17:38:32 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:21.312 17:38:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:21.312 17:38:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:21.312 17:38:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2922395 -w 256 00:26:21.312 17:38:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2922395 root 20 0 128.2g 35840 22528 R 99.9 0.0 0:00.71 reactor_0' 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2922395 root 20 0 128.2g 35840 22528 R 99.9 0.0 0:00.71 reactor_0 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:21.572 17:38:32 
reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2922395 2 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2922395 2 busy 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2922395 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2922395 -w 256 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2922432 root 20 0 128.2g 35840 22528 R 99.9 0.0 0:00.37 reactor_2' 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2922432 root 20 0 128.2g 35840 22528 R 99.9 0.0 0:00.37 reactor_2 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:21.572 17:38:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:26:21.831 [2024-07-15 17:38:33.034794] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
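Interrupt mode itself is flipped through the interrupt_plugin RPC shown in the trace: with -d the reactor leaves interrupt mode and busy-polls (hence the ~100% R lines above), without -d it returns to interrupt mode and drops back toward 0%. Restated as plain commands, nothing here beyond what the trace already ran:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  "$rpc" --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d   # reactor 0 -> poll mode
  "$rpc" --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d   # reactor 2 -> poll mode
  "$rpc" --plugin interrupt_plugin reactor_set_interrupt_mode 2      # reactor 2 -> interrupt mode
  "$rpc" --plugin interrupt_plugin reactor_set_interrupt_mode 0      # reactor 0 -> interrupt mode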
00:26:21.832 [2024-07-15 17:38:33.035002] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:21.832 17:38:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:26:21.832 17:38:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 2922395 2 00:26:21.832 17:38:33 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2922395 2 idle 00:26:21.832 17:38:33 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2922395 00:26:21.832 17:38:33 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:26:21.832 17:38:33 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:21.832 17:38:33 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:21.832 17:38:33 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:21.832 17:38:33 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:21.832 17:38:33 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:21.832 17:38:33 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:21.832 17:38:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2922395 -w 256 00:26:21.832 17:38:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:26:22.097 17:38:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2922432 root 20 0 128.2g 35840 22528 S 0.0 0.0 0:00.56 reactor_2' 00:26:22.097 17:38:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2922432 root 20 0 128.2g 35840 22528 S 0.0 0.0 0:00.56 reactor_2 00:26:22.097 17:38:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:22.097 17:38:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:22.097 17:38:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:22.097 17:38:33 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:22.097 17:38:33 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:22.097 17:38:33 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:22.097 17:38:33 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:22.097 17:38:33 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:22.097 17:38:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:26:22.356 [2024-07-15 17:38:33.422798] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:26:22.357 [2024-07-15 17:38:33.423115] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:22.357 17:38:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:26:22.357 17:38:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:26:22.357 17:38:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:26:22.357 [2024-07-15 17:38:33.603087] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:26:22.357 17:38:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 2922395 0 00:26:22.357 17:38:33 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2922395 0 idle 00:26:22.357 17:38:33 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2922395 00:26:22.357 17:38:33 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:26:22.357 17:38:33 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:22.357 17:38:33 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:22.357 17:38:33 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:22.357 17:38:33 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:22.357 17:38:33 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:22.357 17:38:33 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:22.357 17:38:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2922395 -w 256 00:26:22.357 17:38:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:26:22.616 17:38:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2922395 root 20 0 128.2g 35840 22528 S 0.0 0.0 0:01.47 reactor_0' 00:26:22.617 17:38:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2922395 root 20 0 128.2g 35840 22528 S 0.0 0.0 0:01.47 reactor_0 00:26:22.617 17:38:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:22.617 17:38:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:22.617 17:38:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:22.617 17:38:33 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:22.617 17:38:33 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:22.617 17:38:33 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:22.617 17:38:33 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:22.617 17:38:33 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:22.617 17:38:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:26:22.617 17:38:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:26:22.617 17:38:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:26:22.617 17:38:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 2922395 00:26:22.617 17:38:33 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 2922395 ']' 00:26:22.617 17:38:33 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 2922395 00:26:22.617 17:38:33 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:26:22.617 17:38:33 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:22.617 17:38:33 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2922395 00:26:22.617 17:38:33 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:22.617 17:38:33 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:22.617 17:38:33 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2922395' 00:26:22.617 killing process with pid 2922395 00:26:22.617 17:38:33 reactor_set_interrupt -- 
common/autotest_common.sh@967 -- # kill 2922395 00:26:22.617 17:38:33 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 2922395 00:26:22.877 17:38:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:26:22.877 17:38:34 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:26:22.877 17:38:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:26:22.877 17:38:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:22.877 17:38:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:26:22.877 17:38:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=2923334 00:26:22.877 17:38:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:22.877 17:38:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:26:22.877 17:38:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 2923334 /var/tmp/spdk.sock 00:26:22.877 17:38:34 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 2923334 ']' 00:26:22.877 17:38:34 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:22.877 17:38:34 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:22.877 17:38:34 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:22.877 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:22.877 17:38:34 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:22.877 17:38:34 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:26:22.877 [2024-07-15 17:38:34.070059] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:26:22.877 [2024-07-15 17:38:34.070111] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2923334 ] 00:26:22.877 [2024-07-15 17:38:34.156972] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:26:23.137 [2024-07-15 17:38:34.222780] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:26:23.137 [2024-07-15 17:38:34.223034] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:26:23.137 [2024-07-15 17:38:34.223036] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:23.137 [2024-07-15 17:38:34.273029] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
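After the first target is killed and the aiofile removed, the trace relaunches the interrupt_tgt example with the same 0x07 core mask and waits for its RPC socket. A compact sketch of that restart; the polling loop is only a stand-in for the waitforlisten helper, which does more bookkeeping:

  spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
  "$spdk"/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g &
  intr_tgt_pid=$!
  until "$spdk"/scripts/rpc.py -s /var/tmp/spdk.sock rpc_get_methods >/dev/null 2>&1; do
      sleep 0.5                           # keep polling until the target listens on the socket
  done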
00:26:23.707 17:38:34 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:23.707 17:38:34 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:26:23.707 17:38:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:26:23.707 17:38:34 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:23.967 Malloc0 00:26:23.967 Malloc1 00:26:23.967 Malloc2 00:26:23.967 17:38:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:26:23.967 17:38:35 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:26:23.967 17:38:35 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:26:23.967 17:38:35 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:26:23.967 5000+0 records in 00:26:23.967 5000+0 records out 00:26:23.967 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0175434 s, 584 MB/s 00:26:23.967 17:38:35 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:26:24.227 AIO0 00:26:24.227 17:38:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 2923334 00:26:24.227 17:38:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 2923334 00:26:24.227 17:38:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=2923334 00:26:24.227 17:38:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:26:24.227 17:38:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:26:24.227 17:38:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:26:24.227 17:38:35 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:26:24.227 17:38:35 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:26:24.227 17:38:35 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:26:24.227 17:38:35 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:26:24.227 17:38:35 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:26:24.227 17:38:35 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:26:24.227 17:38:35 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:26:24.487 17:38:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:26:24.487 17:38:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:26:24.487 17:38:35 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:26:24.487 17:38:35 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:26:24.487 17:38:35 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:26:24.487 17:38:35 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == 
$reactor_cpumask)|.id' 00:26:24.487 17:38:35 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:26:24.487 17:38:35 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:26:24.487 17:38:35 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:26:24.487 17:38:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:26:24.487 17:38:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:26:24.487 spdk_thread ids are 1 on reactor0. 00:26:24.487 17:38:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:26:24.487 17:38:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2923334 0 00:26:24.487 17:38:35 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2923334 0 idle 00:26:24.487 17:38:35 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2923334 00:26:24.487 17:38:35 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:26:24.487 17:38:35 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:24.487 17:38:35 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:24.487 17:38:35 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:24.487 17:38:35 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:24.487 17:38:35 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:24.487 17:38:35 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:24.487 17:38:35 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2923334 -w 256 00:26:24.487 17:38:35 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:26:24.746 17:38:35 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2923334 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:00.30 reactor_0' 00:26:24.746 17:38:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2923334 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:00.30 reactor_0 00:26:24.746 17:38:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:24.746 17:38:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:24.746 17:38:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:24.746 17:38:35 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:24.746 17:38:35 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:24.746 17:38:35 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:24.746 17:38:35 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:24.746 17:38:35 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:24.746 17:38:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:26:24.746 17:38:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2923334 1 00:26:24.746 17:38:35 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2923334 1 idle 00:26:24.746 17:38:35 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2923334 00:26:24.746 17:38:35 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:26:24.747 17:38:35 reactor_set_interrupt -- 
interrupt/common.sh@12 -- # local state=idle 00:26:24.747 17:38:35 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:24.747 17:38:35 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:24.747 17:38:35 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:24.747 17:38:35 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:24.747 17:38:35 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:24.747 17:38:35 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2923334 -w 256 00:26:24.747 17:38:35 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2923338 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:00.00 reactor_1' 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2923338 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:00.00 reactor_1 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2923334 2 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2923334 2 idle 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2923334 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2923334 -w 256 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2923339 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:00.00 reactor_2' 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2923339 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:00.00 reactor_2 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 
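The bdev setup traced a few entries back (Malloc0..Malloc2 plus an AIO bdev over a 10 MB file) can be reproduced roughly as below; the malloc sizes are placeholders, since only the resulting names appear in the trace, while the dd and bdev_aio_create arguments are copied from it:

  spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
  rpc="$spdk"/scripts/rpc.py
  for name in Malloc0 Malloc1 Malloc2; do
      "$rpc" bdev_malloc_create -b "$name" 64 512       # placeholder size/block values
  done
  dd if=/dev/zero of="$spdk"/test/interrupt/aiofile bs=2048 count=5000
  "$rpc" bdev_aio_create "$spdk"/test/interrupt/aiofile AIO0 2048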
00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:26:25.006 17:38:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:26:25.266 [2024-07-15 17:38:36.439650] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:26:25.266 [2024-07-15 17:38:36.439920] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:26:25.266 [2024-07-15 17:38:36.440240] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:25.266 17:38:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:26:25.526 [2024-07-15 17:38:36.647992] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:26:25.526 [2024-07-15 17:38:36.648305] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:25.526 17:38:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:26:25.526 17:38:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2923334 0 00:26:25.526 17:38:36 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2923334 0 busy 00:26:25.526 17:38:36 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2923334 00:26:25.526 17:38:36 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:26:25.526 17:38:36 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:26:25.526 17:38:36 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:26:25.526 17:38:36 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:25.526 17:38:36 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:25.526 17:38:36 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:25.526 17:38:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2923334 -w 256 00:26:25.526 17:38:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:26:25.786 17:38:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2923334 root 20 0 128.2g 36864 23552 R 93.8 0.0 0:00.69 reactor_0' 00:26:25.786 17:38:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2923334 root 20 0 128.2g 36864 23552 R 93.8 0.0 0:00.69 reactor_0 00:26:25.786 17:38:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:25.786 17:38:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:25.786 17:38:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=93.8 00:26:25.786 17:38:36 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=93 00:26:25.786 17:38:36 
reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:26:25.786 17:38:36 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 93 -lt 70 ]] 00:26:25.786 17:38:36 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:26:25.786 17:38:36 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:25.786 17:38:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:26:25.786 17:38:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2923334 2 00:26:25.786 17:38:36 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2923334 2 busy 00:26:25.786 17:38:36 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2923334 00:26:25.786 17:38:36 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:26:25.786 17:38:36 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:26:25.786 17:38:36 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:26:25.786 17:38:36 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:25.786 17:38:36 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:25.786 17:38:36 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:25.786 17:38:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2923334 -w 256 00:26:25.786 17:38:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:26:25.786 17:38:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2923339 root 20 0 128.2g 36864 23552 R 99.9 0.0 0:00.36 reactor_2' 00:26:25.786 17:38:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2923339 root 20 0 128.2g 36864 23552 R 99.9 0.0 0:00.36 reactor_2 00:26:25.786 17:38:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:25.786 17:38:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:25.786 17:38:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:26:25.786 17:38:37 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:26:25.786 17:38:37 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:26:25.786 17:38:37 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:26:25.786 17:38:37 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:26:25.786 17:38:37 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:25.786 17:38:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:26:26.046 [2024-07-15 17:38:37.217468] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:26:26.047 [2024-07-15 17:38:37.217651] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:26.047 17:38:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:26:26.047 17:38:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 2923334 2 00:26:26.047 17:38:37 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2923334 2 idle 00:26:26.047 17:38:37 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2923334 00:26:26.047 17:38:37 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:26:26.047 17:38:37 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:26.047 17:38:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:26.047 17:38:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:26.047 17:38:37 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:26.047 17:38:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:26.047 17:38:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:26.047 17:38:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2923334 -w 256 00:26:26.047 17:38:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:26:26.307 17:38:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2923339 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:00.56 reactor_2' 00:26:26.307 17:38:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2923339 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:00.56 reactor_2 00:26:26.307 17:38:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:26.307 17:38:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:26.307 17:38:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:26.307 17:38:37 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:26.307 17:38:37 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:26.307 17:38:37 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:26.307 17:38:37 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:26.307 17:38:37 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:26.307 17:38:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:26:26.307 [2024-07-15 17:38:37.578362] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:26:26.307 [2024-07-15 17:38:37.578721] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
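Unlike the earlier pass, the spdk threads are left on reactor 0 here, so toggling the reactor also retargets app_thread itself, which is what the poll-mode/intr-mode NOTICE lines above record. To inspect the reactor state directly, framework_get_reactors can be queried; treat the in_interrupt field name as an assumption that may vary across SPDK versions:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  "$rpc" framework_get_reactors | jq '.reactors[] | {lcore, in_interrupt}'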
00:26:26.307 [2024-07-15 17:38:37.578743] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:26.307 17:38:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:26:26.307 17:38:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 2923334 0 00:26:26.307 17:38:37 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2923334 0 idle 00:26:26.307 17:38:37 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2923334 00:26:26.307 17:38:37 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:26:26.307 17:38:37 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:26.568 17:38:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:26.568 17:38:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:26.568 17:38:37 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:26.568 17:38:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:26.568 17:38:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:26.568 17:38:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2923334 -w 256 00:26:26.568 17:38:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:26:26.568 17:38:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2923334 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:01.43 reactor_0' 00:26:26.568 17:38:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2923334 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:01.43 reactor_0 00:26:26.568 17:38:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:26.568 17:38:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:26.568 17:38:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:26.568 17:38:37 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:26.568 17:38:37 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:26.568 17:38:37 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:26.568 17:38:37 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:26.568 17:38:37 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:26.568 17:38:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:26:26.568 17:38:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:26:26.568 17:38:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:26:26.568 17:38:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 2923334 00:26:26.568 17:38:37 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 2923334 ']' 00:26:26.568 17:38:37 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 2923334 00:26:26.568 17:38:37 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:26:26.568 17:38:37 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:26.568 17:38:37 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2923334 00:26:26.568 17:38:37 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:26.568 17:38:37 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = 
sudo ']' 00:26:26.568 17:38:37 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2923334' 00:26:26.568 killing process with pid 2923334 00:26:26.568 17:38:37 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 2923334 00:26:26.568 17:38:37 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 2923334 00:26:26.829 17:38:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:26:26.829 17:38:38 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:26:26.829 00:26:26.829 real 0m8.757s 00:26:26.829 user 0m8.087s 00:26:26.829 sys 0m1.673s 00:26:26.829 17:38:38 reactor_set_interrupt -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:26.829 17:38:38 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:26:26.829 ************************************ 00:26:26.829 END TEST reactor_set_interrupt 00:26:26.829 ************************************ 00:26:26.829 17:38:38 -- common/autotest_common.sh@1142 -- # return 0 00:26:26.829 17:38:38 -- spdk/autotest.sh@194 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:26:26.829 17:38:38 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:26:26.829 17:38:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:26.829 17:38:38 -- common/autotest_common.sh@10 -- # set +x 00:26:26.829 ************************************ 00:26:26.829 START TEST reap_unregistered_poller 00:26:26.829 ************************************ 00:26:26.829 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:26:27.092 * Looking for test storage... 00:26:27.092 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:27.092 17:38:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:26:27.092 17:38:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:26:27.092 17:38:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:27.092 17:38:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:27.092 17:38:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
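The START/END banners and the real/user/sys line above come from the run_test wrapper in autotest_common.sh; a stripped-down stand-in that produces the same shape of output (the real helper also manages xtrace and return codes) could look like:

  run_test_sketch() {
      local name=$1; shift
      echo '************************************'
      echo "START TEST $name"
      echo '************************************'
      time "$@"                            # prints the real/user/sys summary
      echo '************************************'
      echo "END TEST $name"
      echo '************************************'
  }
  run_test_sketch reap_unregistered_poller \
      /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh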
00:26:27.092 17:38:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:27.092 17:38:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:26:27.092 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:26:27.092 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:26:27.092 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:26:27.092 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:26:27.092 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:26:27.092 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:26:27.092 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:26:27.092 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 
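Once build_config.sh has been sourced, the CONFIG_* entries listed here are ordinary shell variables, so a test can read them directly; a purely illustrative guard using one of the flags shown in this trace:

  source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh
  if [[ $CONFIG_VBDEV_COMPRESS != y ]]; then
      echo 'compress vbdev support not configured in this build'
      exit 0
  fi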
00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@57 -- # 
CONFIG_HAVE_LIBBSD=n 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:26:27.092 17:38:38 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:26:27.093 17:38:38 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:26:27.093 17:38:38 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:26:27.093 17:38:38 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:26:27.093 17:38:38 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:26:27.093 17:38:38 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:26:27.093 17:38:38 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:26:27.093 17:38:38 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:26:27.093 17:38:38 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:26:27.093 17:38:38 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:26:27.093 17:38:38 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:26:27.093 17:38:38 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:26:27.093 17:38:38 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:26:27.093 17:38:38 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:26:27.093 17:38:38 reap_unregistered_poller -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:26:27.093 17:38:38 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:26:27.093 17:38:38 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:26:27.093 17:38:38 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:26:27.093 17:38:38 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:26:27.093 17:38:38 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:26:27.093 17:38:38 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:26:27.093 17:38:38 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:26:27.093 17:38:38 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:26:27.093 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:26:27.093 17:38:38 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:26:27.093 17:38:38 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:26:27.093 17:38:38 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:26:27.093 17:38:38 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:27.093 17:38:38 reap_unregistered_poller -- common/applications.sh@10 -- # 
_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:26:27.093 17:38:38 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:26:27.093 17:38:38 reap_unregistered_poller -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:26:27.093 17:38:38 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:26:27.093 17:38:38 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:26:27.093 17:38:38 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:26:27.093 17:38:38 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:26:27.093 17:38:38 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:26:27.093 17:38:38 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:26:27.093 17:38:38 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:26:27.093 17:38:38 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:26:27.093 #define SPDK_CONFIG_H 00:26:27.093 #define SPDK_CONFIG_APPS 1 00:26:27.093 #define SPDK_CONFIG_ARCH native 00:26:27.093 #undef SPDK_CONFIG_ASAN 00:26:27.093 #undef SPDK_CONFIG_AVAHI 00:26:27.093 #undef SPDK_CONFIG_CET 00:26:27.093 #define SPDK_CONFIG_COVERAGE 1 00:26:27.093 #define SPDK_CONFIG_CROSS_PREFIX 00:26:27.093 #define SPDK_CONFIG_CRYPTO 1 00:26:27.093 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:26:27.093 #undef SPDK_CONFIG_CUSTOMOCF 00:26:27.093 #undef SPDK_CONFIG_DAOS 00:26:27.093 #define SPDK_CONFIG_DAOS_DIR 00:26:27.093 #define SPDK_CONFIG_DEBUG 1 00:26:27.093 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:26:27.093 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:26:27.093 #define SPDK_CONFIG_DPDK_INC_DIR 00:26:27.093 #define SPDK_CONFIG_DPDK_LIB_DIR 00:26:27.093 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:26:27.093 #undef SPDK_CONFIG_DPDK_UADK 00:26:27.093 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:26:27.093 #define SPDK_CONFIG_EXAMPLES 1 00:26:27.093 #undef SPDK_CONFIG_FC 00:26:27.093 #define SPDK_CONFIG_FC_PATH 00:26:27.093 #define SPDK_CONFIG_FIO_PLUGIN 1 00:26:27.093 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:26:27.093 #undef SPDK_CONFIG_FUSE 00:26:27.093 #undef SPDK_CONFIG_FUZZER 00:26:27.093 #define SPDK_CONFIG_FUZZER_LIB 00:26:27.093 #undef SPDK_CONFIG_GOLANG 00:26:27.093 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:26:27.093 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:26:27.093 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:26:27.093 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:26:27.093 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:26:27.093 #undef SPDK_CONFIG_HAVE_LIBBSD 00:26:27.093 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:26:27.093 #define SPDK_CONFIG_IDXD 1 00:26:27.093 #define SPDK_CONFIG_IDXD_KERNEL 1 00:26:27.093 #define SPDK_CONFIG_IPSEC_MB 1 00:26:27.093 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:26:27.093 #define SPDK_CONFIG_ISAL 1 00:26:27.093 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:26:27.093 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:26:27.093 #define SPDK_CONFIG_LIBDIR 00:26:27.093 #undef SPDK_CONFIG_LTO 
00:26:27.093 #define SPDK_CONFIG_MAX_LCORES 128 00:26:27.093 #define SPDK_CONFIG_NVME_CUSE 1 00:26:27.093 #undef SPDK_CONFIG_OCF 00:26:27.093 #define SPDK_CONFIG_OCF_PATH 00:26:27.093 #define SPDK_CONFIG_OPENSSL_PATH 00:26:27.093 #undef SPDK_CONFIG_PGO_CAPTURE 00:26:27.093 #define SPDK_CONFIG_PGO_DIR 00:26:27.093 #undef SPDK_CONFIG_PGO_USE 00:26:27.093 #define SPDK_CONFIG_PREFIX /usr/local 00:26:27.093 #undef SPDK_CONFIG_RAID5F 00:26:27.093 #undef SPDK_CONFIG_RBD 00:26:27.093 #define SPDK_CONFIG_RDMA 1 00:26:27.093 #define SPDK_CONFIG_RDMA_PROV verbs 00:26:27.093 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:26:27.093 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:26:27.093 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:26:27.093 #define SPDK_CONFIG_SHARED 1 00:26:27.093 #undef SPDK_CONFIG_SMA 00:26:27.093 #define SPDK_CONFIG_TESTS 1 00:26:27.093 #undef SPDK_CONFIG_TSAN 00:26:27.093 #define SPDK_CONFIG_UBLK 1 00:26:27.093 #define SPDK_CONFIG_UBSAN 1 00:26:27.093 #undef SPDK_CONFIG_UNIT_TESTS 00:26:27.093 #undef SPDK_CONFIG_URING 00:26:27.093 #define SPDK_CONFIG_URING_PATH 00:26:27.093 #undef SPDK_CONFIG_URING_ZNS 00:26:27.093 #undef SPDK_CONFIG_USDT 00:26:27.093 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:26:27.093 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:26:27.093 #undef SPDK_CONFIG_VFIO_USER 00:26:27.093 #define SPDK_CONFIG_VFIO_USER_DIR 00:26:27.093 #define SPDK_CONFIG_VHOST 1 00:26:27.093 #define SPDK_CONFIG_VIRTIO 1 00:26:27.093 #undef SPDK_CONFIG_VTUNE 00:26:27.093 #define SPDK_CONFIG_VTUNE_DIR 00:26:27.093 #define SPDK_CONFIG_WERROR 1 00:26:27.093 #define SPDK_CONFIG_WPDK_DIR 00:26:27.093 #undef SPDK_CONFIG_XNVME 00:26:27.093 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:26:27.093 17:38:38 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:26:27.093 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:26:27.093 17:38:38 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:26:27.093 17:38:38 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:26:27.093 17:38:38 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:26:27.093 17:38:38 reap_unregistered_poller -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:27.093 17:38:38 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:27.093 17:38:38 reap_unregistered_poller -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:27.093 17:38:38 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:26:27.093 17:38:38 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:27.093 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:26:27.093 17:38:38 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:26:27.093 17:38:38 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:26:27.093 17:38:38 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:26:27.093 17:38:38 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:26:27.093 17:38:38 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:27.093 17:38:38 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:26:27.093 17:38:38 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:26:27.093 17:38:38 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:26:27.093 17:38:38 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:26:27.093 17:38:38 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:26:27.093 17:38:38 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:26:27.093 17:38:38 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:26:27.093 17:38:38 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:26:27.094 17:38:38 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:26:27.094 17:38:38 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:26:27.094 17:38:38 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:26:27.094 17:38:38 reap_unregistered_poller -- pm/common@76 -- # SUDO[0]= 00:26:27.094 17:38:38 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:26:27.094 17:38:38 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:26:27.094 17:38:38 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:26:27.094 17:38:38 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:26:27.094 17:38:38 reap_unregistered_poller -- pm/common@81 -- # 
[[ ............................... != QEMU ]] 00:26:27.094 17:38:38 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:26:27.094 17:38:38 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:26:27.094 17:38:38 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:26:27.094 17:38:38 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 
00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:26:27.094 17:38:38 reap_unregistered_poller -- 
common/autotest_common.sh@126 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@150 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:26:27.094 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:26:27.095 17:38:38 
reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 0 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@167 -- # : 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 0 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@171 -- # : 0 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 
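The long run of ': 0' entries followed by 'export SPDK_TEST_...' entries above is the common test setup assigning each flag a default only when the caller has not already set one, then exporting it, together with the SPDK and DPDK library paths, so every child script inherits the same environment. A minimal sketch of that idiom, showing only a few of the flags; $rootdir is assumed here to point at the SPDK checkout:

# Default-then-export: ':' is a no-op, so the ${VAR:=default} expansion only
# assigns when the variable is still unset; a value set by the caller wins.
: "${RUN_NIGHTLY:=0}";              export RUN_NIGHTLY
: "${SPDK_RUN_FUNCTIONAL_TEST:=0}"; export SPDK_RUN_FUNCTIONAL_TEST
: "${SPDK_TEST_CRYPTO:=0}";         export SPDK_TEST_CRYPTO

# Library search paths are exported the same way; because the common scripts
# are sourced more than once, the trace shows these paths appended repeatedly.
export SPDK_LIB_DIR=$rootdir/build/lib            # $rootdir: assumed SPDK checkout
export DPDK_LIB_DIR=$rootdir/dpdk/build/lib
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$SPDK_LIB_DIR:$DPDK_LIB_DIR
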
00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@200 -- # cat 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@250 -- # export 
SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@263 -- # export valgrind= 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@263 -- # valgrind= 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@269 -- # uname -s 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@279 -- # MAKE=make 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j128 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@299 -- # TEST_MODE= 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@318 -- # [[ -z 2924054 ]] 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@318 -- # kill -0 2924054 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@328 -- # [[ 
-v testdir ]] 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@331 -- # local mount target_dir 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.MEmNM5 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.MEmNM5/tests/interrupt /tmp/spdk.MEmNM5 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@327 -- # df -T 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:26:27.095 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=954712064 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4329717760 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=123506728960 00:26:27.096 
17:38:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=129376292864 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=5869563904 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=64683433984 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=64688144384 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=25865388032 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=25875259392 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=9871360 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=efivarfs 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=efivarfs 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=339968 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=507904 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=163840 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=64687472640 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=64688148480 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=675840 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=12937621504 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=12937625600 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@363 -- # 
uses["$mount"]=4096 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:26:27.096 * Looking for test storage... 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@368 -- # local target_space new_size 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@372 -- # mount=/ 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@374 -- # target_space=123506728960 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@381 -- # new_size=8084156416 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:27.096 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@389 -- # return 0 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # set -o errtrace 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@1687 -- # true 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@1689 -- # xtrace_fd 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 
00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:26:27.096 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:26:27.096 17:38:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:26:27.096 17:38:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:27.096 17:38:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:26:27.096 17:38:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:26:27.096 17:38:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:26:27.096 17:38:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:26:27.096 17:38:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:26:27.096 17:38:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:26:27.096 17:38:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:26:27.096 17:38:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:26:27.096 17:38:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:27.096 17:38:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:26:27.096 17:38:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=2924095 00:26:27.096 17:38:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:27.097 17:38:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 2924095 /var/tmp/spdk.sock 00:26:27.097 17:38:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:26:27.097 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@829 -- # '[' -z 2924095 ']' 00:26:27.097 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:27.097 17:38:38 reap_unregistered_poller -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:26:27.097 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:27.097 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:27.097 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:27.097 17:38:38 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:26:27.357 [2024-07-15 17:38:38.397808] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:26:27.357 [2024-07-15 17:38:38.397874] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2924095 ] 00:26:27.357 [2024-07-15 17:38:38.489798] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:26:27.357 [2024-07-15 17:38:38.584349] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:26:27.357 [2024-07-15 17:38:38.584503] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:27.357 [2024-07-15 17:38:38.584503] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:26:27.617 [2024-07-15 17:38:38.655349] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:26:28.187 17:38:39 reap_unregistered_poller -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:28.187 17:38:39 reap_unregistered_poller -- common/autotest_common.sh@862 -- # return 0 00:26:28.187 17:38:39 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:26:28.187 17:38:39 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:26:28.187 17:38:39 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:28.187 17:38:39 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:26:28.187 17:38:39 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:28.187 17:38:39 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:26:28.187 "name": "app_thread", 00:26:28.187 "id": 1, 00:26:28.187 "active_pollers": [], 00:26:28.187 "timed_pollers": [ 00:26:28.187 { 00:26:28.187 "name": "rpc_subsystem_poll_servers", 00:26:28.187 "id": 1, 00:26:28.187 "state": "waiting", 00:26:28.187 "run_count": 0, 00:26:28.187 "busy_count": 0, 00:26:28.187 "period_ticks": 10400000 00:26:28.187 } 00:26:28.187 ], 00:26:28.187 "paused_pollers": [] 00:26:28.187 }' 00:26:28.187 17:38:39 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:26:28.187 17:38:39 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:26:28.187 17:38:39 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:26:28.187 17:38:39 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:26:28.187 17:38:39 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:26:28.187 17:38:39 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:26:28.187 17:38:39 
reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:26:28.187 17:38:39 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:26:28.187 17:38:39 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:26:28.187 5000+0 records in 00:26:28.187 5000+0 records out 00:26:28.187 10240000 bytes (10 MB, 9.8 MiB) copied, 0.00916222 s, 1.1 GB/s 00:26:28.187 17:38:39 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:26:28.758 AIO0 00:26:28.758 17:38:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:29.019 17:38:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:26:29.278 17:38:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:26:29.278 17:38:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:26:29.278 17:38:40 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:29.278 17:38:40 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:26:29.278 17:38:40 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:29.278 17:38:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:26:29.278 "name": "app_thread", 00:26:29.278 "id": 1, 00:26:29.278 "active_pollers": [], 00:26:29.278 "timed_pollers": [ 00:26:29.278 { 00:26:29.278 "name": "rpc_subsystem_poll_servers", 00:26:29.278 "id": 1, 00:26:29.278 "state": "waiting", 00:26:29.278 "run_count": 0, 00:26:29.278 "busy_count": 0, 00:26:29.278 "period_ticks": 10400000 00:26:29.278 } 00:26:29.278 ], 00:26:29.278 "paused_pollers": [] 00:26:29.278 }' 00:26:29.278 17:38:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:26:29.278 17:38:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:26:29.278 17:38:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:26:29.278 17:38:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:26:29.278 17:38:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:26:29.278 17:38:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:26:29.278 17:38:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:26:29.278 17:38:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 2924095 00:26:29.278 17:38:40 reap_unregistered_poller -- common/autotest_common.sh@948 -- # '[' -z 2924095 ']' 00:26:29.278 17:38:40 reap_unregistered_poller -- common/autotest_common.sh@952 -- # kill -0 2924095 00:26:29.278 17:38:40 reap_unregistered_poller -- common/autotest_common.sh@953 -- # uname 00:26:29.278 17:38:40 reap_unregistered_poller -- common/autotest_common.sh@953 -- # '[' Linux = 
Linux ']' 00:26:29.278 17:38:40 reap_unregistered_poller -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2924095 00:26:29.536 17:38:40 reap_unregistered_poller -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:29.536 17:38:40 reap_unregistered_poller -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:29.536 17:38:40 reap_unregistered_poller -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2924095' 00:26:29.536 killing process with pid 2924095 00:26:29.536 17:38:40 reap_unregistered_poller -- common/autotest_common.sh@967 -- # kill 2924095 00:26:29.536 17:38:40 reap_unregistered_poller -- common/autotest_common.sh@972 -- # wait 2924095 00:26:29.536 17:38:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:26:29.536 17:38:40 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:26:29.536 00:26:29.536 real 0m2.683s 00:26:29.536 user 0m1.792s 00:26:29.536 sys 0m0.603s 00:26:29.536 17:38:40 reap_unregistered_poller -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:29.536 17:38:40 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:26:29.536 ************************************ 00:26:29.536 END TEST reap_unregistered_poller 00:26:29.536 ************************************ 00:26:29.536 17:38:40 -- common/autotest_common.sh@1142 -- # return 0 00:26:29.536 17:38:40 -- spdk/autotest.sh@198 -- # uname -s 00:26:29.536 17:38:40 -- spdk/autotest.sh@198 -- # [[ Linux == Linux ]] 00:26:29.536 17:38:40 -- spdk/autotest.sh@199 -- # [[ 1 -eq 1 ]] 00:26:29.536 17:38:40 -- spdk/autotest.sh@205 -- # [[ 1 -eq 0 ]] 00:26:29.536 17:38:40 -- spdk/autotest.sh@211 -- # '[' 0 -eq 1 ']' 00:26:29.536 17:38:40 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:26:29.536 17:38:40 -- spdk/autotest.sh@260 -- # timing_exit lib 00:26:29.536 17:38:40 -- common/autotest_common.sh@728 -- # xtrace_disable 00:26:29.536 17:38:40 -- common/autotest_common.sh@10 -- # set +x 00:26:29.796 17:38:40 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:26:29.796 17:38:40 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:26:29.796 17:38:40 -- spdk/autotest.sh@279 -- # '[' 0 -eq 1 ']' 00:26:29.796 17:38:40 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:26:29.796 17:38:40 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:26:29.796 17:38:40 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:26:29.796 17:38:40 -- spdk/autotest.sh@321 -- # '[' 0 -eq 1 ']' 00:26:29.796 17:38:40 -- spdk/autotest.sh@330 -- # '[' 0 -eq 1 ']' 00:26:29.796 17:38:40 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:26:29.796 17:38:40 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:26:29.796 17:38:40 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:26:29.796 17:38:40 -- spdk/autotest.sh@347 -- # '[' 1 -eq 1 ']' 00:26:29.796 17:38:40 -- spdk/autotest.sh@348 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:26:29.796 17:38:40 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:26:29.796 17:38:40 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:29.796 17:38:40 -- common/autotest_common.sh@10 -- # set +x 00:26:29.796 ************************************ 00:26:29.796 START TEST compress_compdev 00:26:29.796 ************************************ 00:26:29.796 17:38:40 compress_compdev -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh 
compdev 00:26:29.796 * Looking for test storage... 00:26:29.796 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:26:29.796 17:38:40 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:26:29.796 17:38:40 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:26:29.796 17:38:41 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:26:29.796 17:38:41 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:26:29.796 17:38:41 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:26:29.796 17:38:41 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:26:29.796 17:38:41 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:26:29.796 17:38:41 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:26:29.796 17:38:41 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:26:29.797 17:38:41 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:26:29.797 17:38:41 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:26:29.797 17:38:41 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:26:29.797 17:38:41 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:26:29.797 17:38:41 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:26:29.797 17:38:41 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:26:29.797 17:38:41 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:26:29.797 17:38:41 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:26:29.797 17:38:41 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:26:29.797 17:38:41 compress_compdev -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:26:29.797 17:38:41 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:26:29.797 17:38:41 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:26:29.797 17:38:41 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:26:29.797 17:38:41 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:29.797 17:38:41 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:29.797 17:38:41 compress_compdev -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:29.797 17:38:41 compress_compdev -- paths/export.sh@5 -- # export PATH 00:26:29.797 17:38:41 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:29.797 17:38:41 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:26:29.797 17:38:41 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:26:29.797 17:38:41 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:26:29.797 17:38:41 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:26:29.797 17:38:41 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:26:29.797 17:38:41 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:26:29.797 17:38:41 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:26:29.797 17:38:41 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:26:29.797 17:38:41 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:26:29.797 17:38:41 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:29.797 17:38:41 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:26:29.797 17:38:41 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:26:29.797 17:38:41 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:26:29.797 17:38:41 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:26:29.797 17:38:41 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=2924576 00:26:29.797 17:38:41 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:29.797 17:38:41 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 2924576 00:26:29.797 17:38:41 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2924576 ']' 00:26:29.797 17:38:41 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:29.797 17:38:41 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:29.797 17:38:41 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:26:29.797 17:38:41 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:29.797 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
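The run_bdevperf step above launches the bdevperf example with the compressdev DPDK configuration, records its PID, installs a cleanup trap, and blocks in waitforlisten until the RPC socket at /var/tmp/spdk.sock is listening before any rpc.py calls are issued. A minimal sketch of that launch sequence using the flags visible in the trace; waitforlisten, killprocess and error_cleanup are helpers defined by the sourced test scripts:

rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk   # workspace path as it appears in the trace

# Queue depth 32, 4 KiB I/Os, verify workload for 3 s on core mask 0x6,
# driven by the compressdev JSON config; -z keeps bdevperf waiting for an
# RPC to start the run.
"$rootdir/build/examples/bdevperf" -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 \
        -c "$rootdir/test/compress/dpdk.json" &
bdevperf_pid=$!

# Tear the target down and clean up volumes on any interrupt or early exit.
trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT

# Block until the process answers on /var/tmp/spdk.sock, then RPCs can follow.
waitforlisten $bdevperf_pid
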
00:26:29.797 17:38:41 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:29.797 17:38:41 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:26:29.797 [2024-07-15 17:38:41.089298] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:26:29.797 [2024-07-15 17:38:41.089365] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2924576 ] 00:26:30.057 [2024-07-15 17:38:41.171275] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:30.057 [2024-07-15 17:38:41.271861] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:26:30.057 [2024-07-15 17:38:41.271988] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:26:30.625 [2024-07-15 17:38:41.821142] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:26:30.923 17:38:41 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:30.923 17:38:41 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:26:30.923 17:38:41 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:26:30.923 17:38:41 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:26:30.923 17:38:41 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:26:34.222 [2024-07-15 17:38:44.995917] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x119a670 PMD being used: compress_qat 00:26:34.222 17:38:45 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:26:34.222 17:38:45 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:26:34.222 17:38:45 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:34.222 17:38:45 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:26:34.222 17:38:45 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:34.222 17:38:45 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:34.222 17:38:45 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:34.222 17:38:45 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:26:34.222 [ 00:26:34.222 { 00:26:34.222 "name": "Nvme0n1", 00:26:34.222 "aliases": [ 00:26:34.222 "ef756178-9494-42db-aea9-c036b023b0be" 00:26:34.222 ], 00:26:34.222 "product_name": "NVMe disk", 00:26:34.222 "block_size": 512, 00:26:34.222 "num_blocks": 3907029168, 00:26:34.222 "uuid": "ef756178-9494-42db-aea9-c036b023b0be", 00:26:34.222 "assigned_rate_limits": { 00:26:34.222 "rw_ios_per_sec": 0, 00:26:34.222 "rw_mbytes_per_sec": 0, 00:26:34.222 "r_mbytes_per_sec": 0, 00:26:34.222 "w_mbytes_per_sec": 0 00:26:34.222 }, 00:26:34.222 "claimed": false, 00:26:34.222 "zoned": false, 00:26:34.222 "supported_io_types": { 00:26:34.222 "read": true, 00:26:34.222 "write": true, 00:26:34.222 "unmap": true, 00:26:34.222 "flush": true, 00:26:34.222 "reset": true, 00:26:34.222 "nvme_admin": true, 00:26:34.222 "nvme_io": true, 00:26:34.222 "nvme_io_md": false, 00:26:34.222 "write_zeroes": true, 00:26:34.222 "zcopy": false, 
00:26:34.222 "get_zone_info": false, 00:26:34.222 "zone_management": false, 00:26:34.222 "zone_append": false, 00:26:34.222 "compare": false, 00:26:34.222 "compare_and_write": false, 00:26:34.222 "abort": true, 00:26:34.222 "seek_hole": false, 00:26:34.222 "seek_data": false, 00:26:34.222 "copy": false, 00:26:34.222 "nvme_iov_md": false 00:26:34.222 }, 00:26:34.222 "driver_specific": { 00:26:34.222 "nvme": [ 00:26:34.222 { 00:26:34.222 "pci_address": "0000:65:00.0", 00:26:34.222 "trid": { 00:26:34.222 "trtype": "PCIe", 00:26:34.222 "traddr": "0000:65:00.0" 00:26:34.222 }, 00:26:34.222 "ctrlr_data": { 00:26:34.222 "cntlid": 0, 00:26:34.222 "vendor_id": "0x8086", 00:26:34.222 "model_number": "INTEL SSDPE2KX020T8", 00:26:34.222 "serial_number": "PHLJ9512038S2P0BGN", 00:26:34.222 "firmware_revision": "VDV10184", 00:26:34.222 "oacs": { 00:26:34.222 "security": 0, 00:26:34.222 "format": 1, 00:26:34.222 "firmware": 1, 00:26:34.222 "ns_manage": 1 00:26:34.222 }, 00:26:34.222 "multi_ctrlr": false, 00:26:34.222 "ana_reporting": false 00:26:34.222 }, 00:26:34.222 "vs": { 00:26:34.222 "nvme_version": "1.2" 00:26:34.222 }, 00:26:34.222 "ns_data": { 00:26:34.222 "id": 1, 00:26:34.222 "can_share": false 00:26:34.222 } 00:26:34.222 } 00:26:34.222 ], 00:26:34.222 "mp_policy": "active_passive" 00:26:34.222 } 00:26:34.222 } 00:26:34.222 ] 00:26:34.222 17:38:45 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:26:34.222 17:38:45 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:26:34.483 [2024-07-15 17:38:45.629303] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xfe8730 PMD being used: compress_qat 00:26:35.423 38d9d922-1bff-48c2-82a9-8ccd3fbed3fb 00:26:35.423 17:38:46 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:26:35.686 63733ba1-dd00-4bfd-b49a-b34c977281fc 00:26:35.686 17:38:46 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:26:35.686 17:38:46 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:26:35.686 17:38:46 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:35.686 17:38:46 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:26:35.686 17:38:46 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:35.686 17:38:46 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:35.686 17:38:46 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:35.947 17:38:47 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:26:36.208 [ 00:26:36.208 { 00:26:36.208 "name": "63733ba1-dd00-4bfd-b49a-b34c977281fc", 00:26:36.208 "aliases": [ 00:26:36.208 "lvs0/lv0" 00:26:36.208 ], 00:26:36.208 "product_name": "Logical Volume", 00:26:36.208 "block_size": 512, 00:26:36.208 "num_blocks": 204800, 00:26:36.208 "uuid": "63733ba1-dd00-4bfd-b49a-b34c977281fc", 00:26:36.208 "assigned_rate_limits": { 00:26:36.208 "rw_ios_per_sec": 0, 00:26:36.208 "rw_mbytes_per_sec": 0, 00:26:36.208 "r_mbytes_per_sec": 0, 00:26:36.208 "w_mbytes_per_sec": 0 00:26:36.208 }, 00:26:36.208 "claimed": false, 00:26:36.208 "zoned": false, 00:26:36.208 "supported_io_types": { 
00:26:36.208 "read": true, 00:26:36.208 "write": true, 00:26:36.208 "unmap": true, 00:26:36.208 "flush": false, 00:26:36.208 "reset": true, 00:26:36.208 "nvme_admin": false, 00:26:36.208 "nvme_io": false, 00:26:36.208 "nvme_io_md": false, 00:26:36.208 "write_zeroes": true, 00:26:36.208 "zcopy": false, 00:26:36.208 "get_zone_info": false, 00:26:36.208 "zone_management": false, 00:26:36.208 "zone_append": false, 00:26:36.208 "compare": false, 00:26:36.208 "compare_and_write": false, 00:26:36.208 "abort": false, 00:26:36.208 "seek_hole": true, 00:26:36.208 "seek_data": true, 00:26:36.208 "copy": false, 00:26:36.208 "nvme_iov_md": false 00:26:36.208 }, 00:26:36.208 "driver_specific": { 00:26:36.208 "lvol": { 00:26:36.208 "lvol_store_uuid": "38d9d922-1bff-48c2-82a9-8ccd3fbed3fb", 00:26:36.208 "base_bdev": "Nvme0n1", 00:26:36.208 "thin_provision": true, 00:26:36.208 "num_allocated_clusters": 0, 00:26:36.208 "snapshot": false, 00:26:36.208 "clone": false, 00:26:36.208 "esnap_clone": false 00:26:36.208 } 00:26:36.208 } 00:26:36.208 } 00:26:36.208 ] 00:26:36.208 17:38:47 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:26:36.208 17:38:47 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:26:36.208 17:38:47 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:26:36.469 [2024-07-15 17:38:47.515508] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:26:36.469 COMP_lvs0/lv0 00:26:36.469 17:38:47 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:26:36.469 17:38:47 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:26:36.469 17:38:47 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:36.469 17:38:47 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:26:36.469 17:38:47 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:36.469 17:38:47 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:36.469 17:38:47 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:36.469 17:38:47 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:26:36.730 [ 00:26:36.730 { 00:26:36.730 "name": "COMP_lvs0/lv0", 00:26:36.730 "aliases": [ 00:26:36.730 "1d08e652-db5f-5dae-b5bb-513915755c1e" 00:26:36.730 ], 00:26:36.730 "product_name": "compress", 00:26:36.730 "block_size": 512, 00:26:36.730 "num_blocks": 200704, 00:26:36.730 "uuid": "1d08e652-db5f-5dae-b5bb-513915755c1e", 00:26:36.730 "assigned_rate_limits": { 00:26:36.730 "rw_ios_per_sec": 0, 00:26:36.730 "rw_mbytes_per_sec": 0, 00:26:36.730 "r_mbytes_per_sec": 0, 00:26:36.730 "w_mbytes_per_sec": 0 00:26:36.730 }, 00:26:36.730 "claimed": false, 00:26:36.730 "zoned": false, 00:26:36.730 "supported_io_types": { 00:26:36.730 "read": true, 00:26:36.730 "write": true, 00:26:36.730 "unmap": false, 00:26:36.730 "flush": false, 00:26:36.730 "reset": false, 00:26:36.730 "nvme_admin": false, 00:26:36.730 "nvme_io": false, 00:26:36.730 "nvme_io_md": false, 00:26:36.730 "write_zeroes": true, 00:26:36.730 "zcopy": false, 00:26:36.730 "get_zone_info": false, 00:26:36.730 "zone_management": false, 00:26:36.730 "zone_append": false, 00:26:36.730 
"compare": false, 00:26:36.730 "compare_and_write": false, 00:26:36.730 "abort": false, 00:26:36.730 "seek_hole": false, 00:26:36.730 "seek_data": false, 00:26:36.730 "copy": false, 00:26:36.730 "nvme_iov_md": false 00:26:36.730 }, 00:26:36.730 "driver_specific": { 00:26:36.730 "compress": { 00:26:36.730 "name": "COMP_lvs0/lv0", 00:26:36.730 "base_bdev_name": "63733ba1-dd00-4bfd-b49a-b34c977281fc" 00:26:36.730 } 00:26:36.730 } 00:26:36.730 } 00:26:36.730 ] 00:26:36.730 17:38:47 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:26:36.730 17:38:47 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:26:36.991 [2024-07-15 17:38:48.065045] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f6cc81b15c0 PMD being used: compress_qat 00:26:36.991 [2024-07-15 17:38:48.067832] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xfd0cf0 PMD being used: compress_qat 00:26:36.991 Running I/O for 3 seconds... 00:26:40.289 00:26:40.289 Latency(us) 00:26:40.289 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:40.289 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:26:40.289 Verification LBA range: start 0x0 length 0x3100 00:26:40.289 COMP_lvs0/lv0 : 3.01 1532.62 5.99 0.00 0.00 20795.92 231.58 22282.24 00:26:40.289 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:26:40.289 Verification LBA range: start 0x3100 length 0x3100 00:26:40.289 COMP_lvs0/lv0 : 3.01 1609.84 6.29 0.00 0.00 19745.45 485.22 22383.06 00:26:40.289 =================================================================================================================== 00:26:40.289 Total : 3142.46 12.28 0.00 0.00 20258.09 231.58 22383.06 00:26:40.289 0 00:26:40.289 17:38:51 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:26:40.289 17:38:51 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:26:40.289 17:38:51 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:26:40.289 17:38:51 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:26:40.289 17:38:51 compress_compdev -- compress/compress.sh@78 -- # killprocess 2924576 00:26:40.289 17:38:51 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2924576 ']' 00:26:40.289 17:38:51 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2924576 00:26:40.289 17:38:51 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:26:40.289 17:38:51 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:40.289 17:38:51 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2924576 00:26:40.289 17:38:51 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:26:40.289 17:38:51 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:26:40.289 17:38:51 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2924576' 00:26:40.289 killing process with pid 2924576 00:26:40.289 17:38:51 compress_compdev -- common/autotest_common.sh@967 -- # kill 2924576 00:26:40.289 Received shutdown signal, test time was about 3.000000 seconds 00:26:40.289 00:26:40.289 Latency(us) 00:26:40.289 Device Information : runtime(s) 
IOPS MiB/s Fail/s TO/s Average min max 00:26:40.289 =================================================================================================================== 00:26:40.289 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:40.289 17:38:51 compress_compdev -- common/autotest_common.sh@972 -- # wait 2924576 00:26:42.829 17:38:54 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:26:42.829 17:38:54 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:26:42.829 17:38:54 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=2926669 00:26:42.829 17:38:54 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:42.829 17:38:54 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 2926669 00:26:42.829 17:38:54 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:26:42.829 17:38:54 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2926669 ']' 00:26:42.829 17:38:54 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:42.829 17:38:54 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:42.830 17:38:54 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:42.830 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:42.830 17:38:54 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:42.830 17:38:54 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:26:42.830 [2024-07-15 17:38:54.096976] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
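The pass that starts here is the second of three bdevperf runs driven by compress.sh: the wrapper is invoked as run_bdevperf 32 4096 3 512, i.e. the same 32-deep, 4 KiB verify workload for 3 seconds, now with a fourth argument. Judging from the later trace (bdev_compress_create ... -l 512 and -l 4096), that argument is forwarded to the compress vbdev as its logical block size; treat that mapping as inferred from this log rather than documented here.

    # how the three bdevperf passes in this log invoke the wrapper (sketch):
    run_bdevperf 32 4096 3          # compress.sh@86 - default compress lb size
    run_bdevperf 32 4096 3 512      # compress.sh@87 - bdev_compress_create ... -l 512
    run_bdevperf 32 4096 3 4096     # compress.sh@88 - bdev_compress_create ... -l 4096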
00:26:42.830 [2024-07-15 17:38:54.097046] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2926669 ] 00:26:43.089 [2024-07-15 17:38:54.177890] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:43.089 [2024-07-15 17:38:54.279232] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:26:43.089 [2024-07-15 17:38:54.279237] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:26:43.659 [2024-07-15 17:38:54.813213] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:26:43.920 17:38:54 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:43.920 17:38:54 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:26:43.920 17:38:54 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:26:43.920 17:38:54 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:26:43.920 17:38:54 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:26:47.215 [2024-07-15 17:38:58.003476] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x24d1670 PMD being used: compress_qat 00:26:47.215 17:38:58 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:26:47.215 17:38:58 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:26:47.215 17:38:58 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:47.215 17:38:58 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:26:47.215 17:38:58 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:47.215 17:38:58 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:47.215 17:38:58 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:47.215 17:38:58 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:26:47.215 [ 00:26:47.215 { 00:26:47.215 "name": "Nvme0n1", 00:26:47.215 "aliases": [ 00:26:47.215 "6b0aecae-93a5-4010-9060-e749884a68cd" 00:26:47.215 ], 00:26:47.215 "product_name": "NVMe disk", 00:26:47.215 "block_size": 512, 00:26:47.215 "num_blocks": 3907029168, 00:26:47.215 "uuid": "6b0aecae-93a5-4010-9060-e749884a68cd", 00:26:47.215 "assigned_rate_limits": { 00:26:47.215 "rw_ios_per_sec": 0, 00:26:47.215 "rw_mbytes_per_sec": 0, 00:26:47.215 "r_mbytes_per_sec": 0, 00:26:47.215 "w_mbytes_per_sec": 0 00:26:47.215 }, 00:26:47.215 "claimed": false, 00:26:47.215 "zoned": false, 00:26:47.215 "supported_io_types": { 00:26:47.215 "read": true, 00:26:47.215 "write": true, 00:26:47.215 "unmap": true, 00:26:47.215 "flush": true, 00:26:47.215 "reset": true, 00:26:47.215 "nvme_admin": true, 00:26:47.215 "nvme_io": true, 00:26:47.215 "nvme_io_md": false, 00:26:47.215 "write_zeroes": true, 00:26:47.215 "zcopy": false, 00:26:47.215 "get_zone_info": false, 00:26:47.215 "zone_management": false, 00:26:47.215 "zone_append": false, 00:26:47.215 "compare": false, 00:26:47.215 "compare_and_write": false, 00:26:47.215 "abort": true, 00:26:47.215 "seek_hole": false, 00:26:47.215 "seek_data": false, 00:26:47.215 
"copy": false, 00:26:47.215 "nvme_iov_md": false 00:26:47.215 }, 00:26:47.215 "driver_specific": { 00:26:47.215 "nvme": [ 00:26:47.215 { 00:26:47.215 "pci_address": "0000:65:00.0", 00:26:47.215 "trid": { 00:26:47.215 "trtype": "PCIe", 00:26:47.215 "traddr": "0000:65:00.0" 00:26:47.215 }, 00:26:47.215 "ctrlr_data": { 00:26:47.215 "cntlid": 0, 00:26:47.215 "vendor_id": "0x8086", 00:26:47.215 "model_number": "INTEL SSDPE2KX020T8", 00:26:47.215 "serial_number": "PHLJ9512038S2P0BGN", 00:26:47.215 "firmware_revision": "VDV10184", 00:26:47.215 "oacs": { 00:26:47.215 "security": 0, 00:26:47.215 "format": 1, 00:26:47.215 "firmware": 1, 00:26:47.215 "ns_manage": 1 00:26:47.215 }, 00:26:47.215 "multi_ctrlr": false, 00:26:47.215 "ana_reporting": false 00:26:47.215 }, 00:26:47.215 "vs": { 00:26:47.215 "nvme_version": "1.2" 00:26:47.215 }, 00:26:47.215 "ns_data": { 00:26:47.215 "id": 1, 00:26:47.215 "can_share": false 00:26:47.215 } 00:26:47.215 } 00:26:47.215 ], 00:26:47.215 "mp_policy": "active_passive" 00:26:47.215 } 00:26:47.215 } 00:26:47.215 ] 00:26:47.215 17:38:58 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:26:47.215 17:38:58 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:26:47.476 [2024-07-15 17:38:58.624848] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x231fa10 PMD being used: compress_qat 00:26:48.441 aacada3f-7576-4f28-891e-d160c6c6e9ec 00:26:48.441 17:38:59 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:26:48.702 75277a4a-c53b-4a6b-9ae0-7b9516e76782 00:26:48.702 17:38:59 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:26:48.702 17:38:59 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:26:48.702 17:38:59 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:48.702 17:38:59 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:26:48.702 17:38:59 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:48.702 17:38:59 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:48.702 17:38:59 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:48.963 17:39:00 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:26:49.225 [ 00:26:49.225 { 00:26:49.225 "name": "75277a4a-c53b-4a6b-9ae0-7b9516e76782", 00:26:49.225 "aliases": [ 00:26:49.225 "lvs0/lv0" 00:26:49.225 ], 00:26:49.225 "product_name": "Logical Volume", 00:26:49.225 "block_size": 512, 00:26:49.225 "num_blocks": 204800, 00:26:49.225 "uuid": "75277a4a-c53b-4a6b-9ae0-7b9516e76782", 00:26:49.225 "assigned_rate_limits": { 00:26:49.225 "rw_ios_per_sec": 0, 00:26:49.225 "rw_mbytes_per_sec": 0, 00:26:49.225 "r_mbytes_per_sec": 0, 00:26:49.225 "w_mbytes_per_sec": 0 00:26:49.225 }, 00:26:49.225 "claimed": false, 00:26:49.225 "zoned": false, 00:26:49.225 "supported_io_types": { 00:26:49.225 "read": true, 00:26:49.225 "write": true, 00:26:49.225 "unmap": true, 00:26:49.225 "flush": false, 00:26:49.225 "reset": true, 00:26:49.225 "nvme_admin": false, 00:26:49.225 "nvme_io": false, 00:26:49.225 "nvme_io_md": false, 00:26:49.225 "write_zeroes": true, 00:26:49.225 
"zcopy": false, 00:26:49.225 "get_zone_info": false, 00:26:49.225 "zone_management": false, 00:26:49.225 "zone_append": false, 00:26:49.225 "compare": false, 00:26:49.225 "compare_and_write": false, 00:26:49.225 "abort": false, 00:26:49.225 "seek_hole": true, 00:26:49.225 "seek_data": true, 00:26:49.225 "copy": false, 00:26:49.225 "nvme_iov_md": false 00:26:49.225 }, 00:26:49.225 "driver_specific": { 00:26:49.225 "lvol": { 00:26:49.225 "lvol_store_uuid": "aacada3f-7576-4f28-891e-d160c6c6e9ec", 00:26:49.225 "base_bdev": "Nvme0n1", 00:26:49.225 "thin_provision": true, 00:26:49.225 "num_allocated_clusters": 0, 00:26:49.225 "snapshot": false, 00:26:49.225 "clone": false, 00:26:49.225 "esnap_clone": false 00:26:49.225 } 00:26:49.225 } 00:26:49.225 } 00:26:49.225 ] 00:26:49.225 17:39:00 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:26:49.225 17:39:00 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:26:49.225 17:39:00 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:26:49.486 [2024-07-15 17:39:00.528593] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:26:49.486 COMP_lvs0/lv0 00:26:49.486 17:39:00 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:26:49.486 17:39:00 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:26:49.486 17:39:00 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:49.486 17:39:00 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:26:49.486 17:39:00 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:49.486 17:39:00 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:49.486 17:39:00 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:49.486 17:39:00 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:26:49.748 [ 00:26:49.748 { 00:26:49.748 "name": "COMP_lvs0/lv0", 00:26:49.748 "aliases": [ 00:26:49.748 "70ab4869-eeb7-575c-ad39-aeb1ec97ee13" 00:26:49.748 ], 00:26:49.748 "product_name": "compress", 00:26:49.748 "block_size": 512, 00:26:49.748 "num_blocks": 200704, 00:26:49.748 "uuid": "70ab4869-eeb7-575c-ad39-aeb1ec97ee13", 00:26:49.748 "assigned_rate_limits": { 00:26:49.748 "rw_ios_per_sec": 0, 00:26:49.748 "rw_mbytes_per_sec": 0, 00:26:49.748 "r_mbytes_per_sec": 0, 00:26:49.748 "w_mbytes_per_sec": 0 00:26:49.748 }, 00:26:49.748 "claimed": false, 00:26:49.748 "zoned": false, 00:26:49.748 "supported_io_types": { 00:26:49.748 "read": true, 00:26:49.748 "write": true, 00:26:49.748 "unmap": false, 00:26:49.748 "flush": false, 00:26:49.748 "reset": false, 00:26:49.748 "nvme_admin": false, 00:26:49.748 "nvme_io": false, 00:26:49.748 "nvme_io_md": false, 00:26:49.748 "write_zeroes": true, 00:26:49.748 "zcopy": false, 00:26:49.748 "get_zone_info": false, 00:26:49.748 "zone_management": false, 00:26:49.748 "zone_append": false, 00:26:49.748 "compare": false, 00:26:49.748 "compare_and_write": false, 00:26:49.748 "abort": false, 00:26:49.748 "seek_hole": false, 00:26:49.748 "seek_data": false, 00:26:49.748 "copy": false, 00:26:49.748 "nvme_iov_md": false 00:26:49.748 }, 00:26:49.748 "driver_specific": { 00:26:49.748 
"compress": { 00:26:49.748 "name": "COMP_lvs0/lv0", 00:26:49.748 "base_bdev_name": "75277a4a-c53b-4a6b-9ae0-7b9516e76782" 00:26:49.748 } 00:26:49.748 } 00:26:49.748 } 00:26:49.748 ] 00:26:49.748 17:39:00 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:26:49.748 17:39:00 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:26:50.009 [2024-07-15 17:39:01.066093] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f7fe81b15c0 PMD being used: compress_qat 00:26:50.009 [2024-07-15 17:39:01.068906] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x23b1540 PMD being used: compress_qat 00:26:50.009 Running I/O for 3 seconds... 00:26:53.309 00:26:53.309 Latency(us) 00:26:53.309 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:53.309 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:26:53.309 Verification LBA range: start 0x0 length 0x3100 00:26:53.309 COMP_lvs0/lv0 : 3.02 1527.69 5.97 0.00 0.00 20849.16 348.16 24802.86 00:26:53.309 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:26:53.309 Verification LBA range: start 0x3100 length 0x3100 00:26:53.309 COMP_lvs0/lv0 : 3.01 1614.27 6.31 0.00 0.00 19694.45 269.39 22584.71 00:26:53.309 =================================================================================================================== 00:26:53.309 Total : 3141.96 12.27 0.00 0.00 20256.21 269.39 24802.86 00:26:53.309 0 00:26:53.309 17:39:04 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:26:53.309 17:39:04 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:26:53.309 17:39:04 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:26:53.309 17:39:04 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:26:53.309 17:39:04 compress_compdev -- compress/compress.sh@78 -- # killprocess 2926669 00:26:53.309 17:39:04 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2926669 ']' 00:26:53.309 17:39:04 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2926669 00:26:53.309 17:39:04 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:26:53.309 17:39:04 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:53.309 17:39:04 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2926669 00:26:53.309 17:39:04 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:26:53.309 17:39:04 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:26:53.309 17:39:04 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2926669' 00:26:53.309 killing process with pid 2926669 00:26:53.309 17:39:04 compress_compdev -- common/autotest_common.sh@967 -- # kill 2926669 00:26:53.309 Received shutdown signal, test time was about 3.000000 seconds 00:26:53.309 00:26:53.309 Latency(us) 00:26:53.309 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:53.309 =================================================================================================================== 00:26:53.309 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:53.309 17:39:04 compress_compdev -- 
common/autotest_common.sh@972 -- # wait 2926669 00:26:55.935 17:39:07 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:26:55.935 17:39:07 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:26:55.935 17:39:07 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=2928937 00:26:55.935 17:39:07 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:55.935 17:39:07 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 2928937 00:26:55.935 17:39:07 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2928937 ']' 00:26:55.935 17:39:07 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:26:55.935 17:39:07 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:55.935 17:39:07 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:55.935 17:39:07 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:55.935 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:55.935 17:39:07 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:55.935 17:39:07 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:26:55.935 [2024-07-15 17:39:07.077943] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:26:55.935 [2024-07-15 17:39:07.078011] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2928937 ] 00:26:55.935 [2024-07-15 17:39:07.158420] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:56.194 [2024-07-15 17:39:07.260115] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:26:56.194 [2024-07-15 17:39:07.260119] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:26:56.764 [2024-07-15 17:39:07.806310] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:26:56.764 17:39:07 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:56.764 17:39:07 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:26:56.764 17:39:07 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:26:56.764 17:39:07 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:26:56.764 17:39:07 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:27:00.059 [2024-07-15 17:39:10.979622] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2a34670 PMD being used: compress_qat 00:27:00.059 17:39:11 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:27:00.059 17:39:11 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:27:00.059 17:39:11 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:00.059 17:39:11 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:27:00.059 17:39:11 
compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:00.059 17:39:11 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:00.059 17:39:11 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:00.059 17:39:11 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:27:00.319 [ 00:27:00.319 { 00:27:00.319 "name": "Nvme0n1", 00:27:00.319 "aliases": [ 00:27:00.319 "f8e16092-fb4e-4295-9f9a-1e5a398cf70d" 00:27:00.319 ], 00:27:00.319 "product_name": "NVMe disk", 00:27:00.319 "block_size": 512, 00:27:00.319 "num_blocks": 3907029168, 00:27:00.319 "uuid": "f8e16092-fb4e-4295-9f9a-1e5a398cf70d", 00:27:00.319 "assigned_rate_limits": { 00:27:00.319 "rw_ios_per_sec": 0, 00:27:00.319 "rw_mbytes_per_sec": 0, 00:27:00.319 "r_mbytes_per_sec": 0, 00:27:00.319 "w_mbytes_per_sec": 0 00:27:00.319 }, 00:27:00.319 "claimed": false, 00:27:00.319 "zoned": false, 00:27:00.319 "supported_io_types": { 00:27:00.319 "read": true, 00:27:00.319 "write": true, 00:27:00.319 "unmap": true, 00:27:00.319 "flush": true, 00:27:00.319 "reset": true, 00:27:00.319 "nvme_admin": true, 00:27:00.319 "nvme_io": true, 00:27:00.319 "nvme_io_md": false, 00:27:00.319 "write_zeroes": true, 00:27:00.319 "zcopy": false, 00:27:00.319 "get_zone_info": false, 00:27:00.319 "zone_management": false, 00:27:00.319 "zone_append": false, 00:27:00.319 "compare": false, 00:27:00.319 "compare_and_write": false, 00:27:00.319 "abort": true, 00:27:00.319 "seek_hole": false, 00:27:00.319 "seek_data": false, 00:27:00.319 "copy": false, 00:27:00.319 "nvme_iov_md": false 00:27:00.319 }, 00:27:00.319 "driver_specific": { 00:27:00.319 "nvme": [ 00:27:00.319 { 00:27:00.319 "pci_address": "0000:65:00.0", 00:27:00.319 "trid": { 00:27:00.319 "trtype": "PCIe", 00:27:00.319 "traddr": "0000:65:00.0" 00:27:00.319 }, 00:27:00.319 "ctrlr_data": { 00:27:00.319 "cntlid": 0, 00:27:00.319 "vendor_id": "0x8086", 00:27:00.319 "model_number": "INTEL SSDPE2KX020T8", 00:27:00.319 "serial_number": "PHLJ9512038S2P0BGN", 00:27:00.319 "firmware_revision": "VDV10184", 00:27:00.319 "oacs": { 00:27:00.319 "security": 0, 00:27:00.319 "format": 1, 00:27:00.319 "firmware": 1, 00:27:00.319 "ns_manage": 1 00:27:00.319 }, 00:27:00.319 "multi_ctrlr": false, 00:27:00.319 "ana_reporting": false 00:27:00.319 }, 00:27:00.319 "vs": { 00:27:00.319 "nvme_version": "1.2" 00:27:00.319 }, 00:27:00.319 "ns_data": { 00:27:00.319 "id": 1, 00:27:00.319 "can_share": false 00:27:00.319 } 00:27:00.319 } 00:27:00.319 ], 00:27:00.319 "mp_policy": "active_passive" 00:27:00.319 } 00:27:00.319 } 00:27:00.319 ] 00:27:00.319 17:39:11 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:27:00.319 17:39:11 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:27:00.319 [2024-07-15 17:39:11.613354] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2882a10 PMD being used: compress_qat 00:27:01.702 f46b16c3-f3b5-4970-a563-d205702425f9 00:27:01.702 17:39:12 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:27:01.702 57151623-f646-42b6-9a90-f78aff7d2727 00:27:01.702 17:39:12 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 
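The create_vols steps traced in this third pass boil down to a short RPC sequence against the running bdevperf instance: attach the local NVMe disk, build a 100 MiB thin-provisioned logical volume on it (the resulting bdev reports 204800 512-byte blocks, i.e. 100 MiB), and then wrap it in a compress vbdev backed by /tmp/pmem, which appears just below. A condensed sketch with the individual commands copied from the trace; how the gen_nvme.sh output reaches load_subsystem_config is not visible here and is assumed to be a pipe.

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    scripts=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts
    "$scripts"/gen_nvme.sh | "$rpc" load_subsystem_config         # attach Nvme0n1 (0000:65:00.0)
    "$rpc" bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
    "$rpc" bdev_lvol_create -t -l lvs0 lv0 100                    # 100 MiB thin lvol lvs0/lv0
    "$rpc" bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096  # COMP_lvs0/lv0, 4 KiB lb size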
00:27:01.702 17:39:12 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:27:01.702 17:39:12 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:01.702 17:39:12 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:27:01.702 17:39:12 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:01.702 17:39:12 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:01.702 17:39:12 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:01.962 17:39:13 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:27:02.221 [ 00:27:02.221 { 00:27:02.221 "name": "57151623-f646-42b6-9a90-f78aff7d2727", 00:27:02.221 "aliases": [ 00:27:02.221 "lvs0/lv0" 00:27:02.221 ], 00:27:02.221 "product_name": "Logical Volume", 00:27:02.221 "block_size": 512, 00:27:02.221 "num_blocks": 204800, 00:27:02.221 "uuid": "57151623-f646-42b6-9a90-f78aff7d2727", 00:27:02.221 "assigned_rate_limits": { 00:27:02.221 "rw_ios_per_sec": 0, 00:27:02.221 "rw_mbytes_per_sec": 0, 00:27:02.221 "r_mbytes_per_sec": 0, 00:27:02.221 "w_mbytes_per_sec": 0 00:27:02.221 }, 00:27:02.221 "claimed": false, 00:27:02.221 "zoned": false, 00:27:02.221 "supported_io_types": { 00:27:02.221 "read": true, 00:27:02.221 "write": true, 00:27:02.221 "unmap": true, 00:27:02.221 "flush": false, 00:27:02.221 "reset": true, 00:27:02.221 "nvme_admin": false, 00:27:02.221 "nvme_io": false, 00:27:02.221 "nvme_io_md": false, 00:27:02.221 "write_zeroes": true, 00:27:02.221 "zcopy": false, 00:27:02.221 "get_zone_info": false, 00:27:02.221 "zone_management": false, 00:27:02.221 "zone_append": false, 00:27:02.221 "compare": false, 00:27:02.221 "compare_and_write": false, 00:27:02.221 "abort": false, 00:27:02.221 "seek_hole": true, 00:27:02.221 "seek_data": true, 00:27:02.221 "copy": false, 00:27:02.221 "nvme_iov_md": false 00:27:02.221 }, 00:27:02.221 "driver_specific": { 00:27:02.221 "lvol": { 00:27:02.221 "lvol_store_uuid": "f46b16c3-f3b5-4970-a563-d205702425f9", 00:27:02.221 "base_bdev": "Nvme0n1", 00:27:02.221 "thin_provision": true, 00:27:02.221 "num_allocated_clusters": 0, 00:27:02.221 "snapshot": false, 00:27:02.221 "clone": false, 00:27:02.221 "esnap_clone": false 00:27:02.221 } 00:27:02.221 } 00:27:02.221 } 00:27:02.221 ] 00:27:02.221 17:39:13 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:27:02.221 17:39:13 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:27:02.221 17:39:13 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:27:02.482 [2024-07-15 17:39:13.526143] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:27:02.482 COMP_lvs0/lv0 00:27:02.482 17:39:13 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:27:02.482 17:39:13 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:27:02.482 17:39:13 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:02.482 17:39:13 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:27:02.482 17:39:13 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:02.482 17:39:13 compress_compdev -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:02.482 17:39:13 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:02.482 17:39:13 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:27:02.742 [ 00:27:02.742 { 00:27:02.742 "name": "COMP_lvs0/lv0", 00:27:02.742 "aliases": [ 00:27:02.742 "04533267-4344-5b6a-bfb2-5ccec5c110dc" 00:27:02.742 ], 00:27:02.742 "product_name": "compress", 00:27:02.742 "block_size": 4096, 00:27:02.742 "num_blocks": 25088, 00:27:02.742 "uuid": "04533267-4344-5b6a-bfb2-5ccec5c110dc", 00:27:02.742 "assigned_rate_limits": { 00:27:02.742 "rw_ios_per_sec": 0, 00:27:02.742 "rw_mbytes_per_sec": 0, 00:27:02.742 "r_mbytes_per_sec": 0, 00:27:02.742 "w_mbytes_per_sec": 0 00:27:02.742 }, 00:27:02.742 "claimed": false, 00:27:02.742 "zoned": false, 00:27:02.742 "supported_io_types": { 00:27:02.742 "read": true, 00:27:02.742 "write": true, 00:27:02.742 "unmap": false, 00:27:02.742 "flush": false, 00:27:02.742 "reset": false, 00:27:02.742 "nvme_admin": false, 00:27:02.742 "nvme_io": false, 00:27:02.742 "nvme_io_md": false, 00:27:02.742 "write_zeroes": true, 00:27:02.742 "zcopy": false, 00:27:02.742 "get_zone_info": false, 00:27:02.742 "zone_management": false, 00:27:02.742 "zone_append": false, 00:27:02.742 "compare": false, 00:27:02.742 "compare_and_write": false, 00:27:02.742 "abort": false, 00:27:02.742 "seek_hole": false, 00:27:02.742 "seek_data": false, 00:27:02.742 "copy": false, 00:27:02.742 "nvme_iov_md": false 00:27:02.742 }, 00:27:02.742 "driver_specific": { 00:27:02.742 "compress": { 00:27:02.742 "name": "COMP_lvs0/lv0", 00:27:02.742 "base_bdev_name": "57151623-f646-42b6-9a90-f78aff7d2727" 00:27:02.742 } 00:27:02.742 } 00:27:02.742 } 00:27:02.742 ] 00:27:02.742 17:39:13 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:27:02.742 17:39:13 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:27:03.003 [2024-07-15 17:39:14.051597] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f78e81b15c0 PMD being used: compress_qat 00:27:03.003 [2024-07-15 17:39:14.054446] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2914540 PMD being used: compress_qat 00:27:03.003 Running I/O for 3 seconds... 
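Once the compress bdev exists, the measurement itself is driven from outside: bdevperf.py perform_tests tells the idle (-z) bdevperf instance to run the configured 3-second verify workload, the per-core results table is printed, and the volumes and the process are torn down before the next pass. A sketch of that tail end of each pass, using the helper names visible in the trace:

    bdevperf_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py
    "$bdevperf_py" perform_tests      # kicks off the workload in the running bdevperf
    destroy_vols                      # rpc.py bdev_compress_delete COMP_lvs0/lv0; bdev_lvol_delete_lvstore -l lvs0
    killprocess "$bdevperf_pid"       # stop bdevperf and wait for it to exit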
00:27:06.319 00:27:06.319 Latency(us) 00:27:06.319 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:06.319 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:27:06.319 Verification LBA range: start 0x0 length 0x3100 00:27:06.319 COMP_lvs0/lv0 : 3.01 1542.52 6.03 0.00 0.00 20663.12 463.16 22786.36 00:27:06.319 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:27:06.319 Verification LBA range: start 0x3100 length 0x3100 00:27:06.319 COMP_lvs0/lv0 : 3.01 1616.84 6.32 0.00 0.00 19664.46 560.84 22080.59 00:27:06.319 =================================================================================================================== 00:27:06.319 Total : 3159.36 12.34 0.00 0.00 20151.99 463.16 22786.36 00:27:06.319 0 00:27:06.319 17:39:17 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:27:06.319 17:39:17 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:27:06.319 17:39:17 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:27:06.320 17:39:17 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:27:06.320 17:39:17 compress_compdev -- compress/compress.sh@78 -- # killprocess 2928937 00:27:06.320 17:39:17 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2928937 ']' 00:27:06.320 17:39:17 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2928937 00:27:06.320 17:39:17 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:27:06.320 17:39:17 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:06.320 17:39:17 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2928937 00:27:06.320 17:39:17 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:27:06.320 17:39:17 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:27:06.320 17:39:17 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2928937' 00:27:06.320 killing process with pid 2928937 00:27:06.320 17:39:17 compress_compdev -- common/autotest_common.sh@967 -- # kill 2928937 00:27:06.320 Received shutdown signal, test time was about 3.000000 seconds 00:27:06.320 00:27:06.320 Latency(us) 00:27:06.320 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:06.320 =================================================================================================================== 00:27:06.320 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:06.320 17:39:17 compress_compdev -- common/autotest_common.sh@972 -- # wait 2928937 00:27:08.861 17:39:20 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:27:08.861 17:39:20 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:27:08.861 17:39:20 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=2931540 00:27:08.861 17:39:20 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:27:08.861 17:39:20 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 2931540 00:27:08.861 17:39:20 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:27:08.861 17:39:20 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2931540 ']' 00:27:08.861 17:39:20 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:08.861 17:39:20 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:08.861 17:39:20 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:08.861 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:08.861 17:39:20 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:08.861 17:39:20 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:27:08.861 [2024-07-15 17:39:20.128790] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:27:08.861 [2024-07-15 17:39:20.128856] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2931540 ] 00:27:09.121 [2024-07-15 17:39:20.223295] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:27:09.121 [2024-07-15 17:39:20.320681] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:27:09.121 [2024-07-15 17:39:20.320842] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:27:09.122 [2024-07-15 17:39:20.320985] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:09.693 [2024-07-15 17:39:20.785705] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:27:09.952 17:39:21 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:09.952 17:39:21 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:27:09.952 17:39:21 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:27:09.952 17:39:21 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:27:09.952 17:39:21 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:27:13.243 [2024-07-15 17:39:24.044145] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1e860c0 PMD being used: compress_qat 00:27:13.243 17:39:24 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:27:13.243 17:39:24 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:27:13.243 17:39:24 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:13.243 17:39:24 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:27:13.243 17:39:24 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:13.243 17:39:24 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:13.243 17:39:24 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:13.243 17:39:24 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:27:13.243 [ 00:27:13.243 { 00:27:13.243 "name": "Nvme0n1", 00:27:13.243 "aliases": [ 00:27:13.243 "2990ea04-2ba5-43a9-acd1-0bb52201c595" 00:27:13.243 ], 00:27:13.243 
"product_name": "NVMe disk", 00:27:13.243 "block_size": 512, 00:27:13.243 "num_blocks": 3907029168, 00:27:13.243 "uuid": "2990ea04-2ba5-43a9-acd1-0bb52201c595", 00:27:13.243 "assigned_rate_limits": { 00:27:13.243 "rw_ios_per_sec": 0, 00:27:13.243 "rw_mbytes_per_sec": 0, 00:27:13.243 "r_mbytes_per_sec": 0, 00:27:13.243 "w_mbytes_per_sec": 0 00:27:13.243 }, 00:27:13.243 "claimed": false, 00:27:13.243 "zoned": false, 00:27:13.243 "supported_io_types": { 00:27:13.243 "read": true, 00:27:13.243 "write": true, 00:27:13.243 "unmap": true, 00:27:13.243 "flush": true, 00:27:13.243 "reset": true, 00:27:13.243 "nvme_admin": true, 00:27:13.243 "nvme_io": true, 00:27:13.243 "nvme_io_md": false, 00:27:13.243 "write_zeroes": true, 00:27:13.243 "zcopy": false, 00:27:13.243 "get_zone_info": false, 00:27:13.243 "zone_management": false, 00:27:13.243 "zone_append": false, 00:27:13.243 "compare": false, 00:27:13.243 "compare_and_write": false, 00:27:13.243 "abort": true, 00:27:13.243 "seek_hole": false, 00:27:13.243 "seek_data": false, 00:27:13.243 "copy": false, 00:27:13.243 "nvme_iov_md": false 00:27:13.243 }, 00:27:13.243 "driver_specific": { 00:27:13.243 "nvme": [ 00:27:13.243 { 00:27:13.243 "pci_address": "0000:65:00.0", 00:27:13.243 "trid": { 00:27:13.243 "trtype": "PCIe", 00:27:13.243 "traddr": "0000:65:00.0" 00:27:13.243 }, 00:27:13.243 "ctrlr_data": { 00:27:13.243 "cntlid": 0, 00:27:13.243 "vendor_id": "0x8086", 00:27:13.243 "model_number": "INTEL SSDPE2KX020T8", 00:27:13.243 "serial_number": "PHLJ9512038S2P0BGN", 00:27:13.243 "firmware_revision": "VDV10184", 00:27:13.243 "oacs": { 00:27:13.243 "security": 0, 00:27:13.243 "format": 1, 00:27:13.243 "firmware": 1, 00:27:13.243 "ns_manage": 1 00:27:13.243 }, 00:27:13.243 "multi_ctrlr": false, 00:27:13.243 "ana_reporting": false 00:27:13.243 }, 00:27:13.243 "vs": { 00:27:13.243 "nvme_version": "1.2" 00:27:13.243 }, 00:27:13.243 "ns_data": { 00:27:13.243 "id": 1, 00:27:13.243 "can_share": false 00:27:13.243 } 00:27:13.243 } 00:27:13.243 ], 00:27:13.243 "mp_policy": "active_passive" 00:27:13.243 } 00:27:13.243 } 00:27:13.243 ] 00:27:13.243 17:39:24 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:27:13.243 17:39:24 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:27:13.505 [2024-07-15 17:39:24.721842] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1cd44b0 PMD being used: compress_qat 00:27:14.892 5326c685-81c8-47c8-a7ff-8d59764179a4 00:27:14.892 17:39:25 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:27:14.892 382037c8-5d58-4127-b1fc-00cfbde67bce 00:27:14.892 17:39:26 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:27:14.892 17:39:26 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:27:14.892 17:39:26 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:14.892 17:39:26 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:27:14.892 17:39:26 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:14.892 17:39:26 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:14.893 17:39:26 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:15.153 17:39:26 compress_compdev -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:27:15.153 [ 00:27:15.153 { 00:27:15.153 "name": "382037c8-5d58-4127-b1fc-00cfbde67bce", 00:27:15.153 "aliases": [ 00:27:15.153 "lvs0/lv0" 00:27:15.153 ], 00:27:15.153 "product_name": "Logical Volume", 00:27:15.153 "block_size": 512, 00:27:15.153 "num_blocks": 204800, 00:27:15.153 "uuid": "382037c8-5d58-4127-b1fc-00cfbde67bce", 00:27:15.153 "assigned_rate_limits": { 00:27:15.153 "rw_ios_per_sec": 0, 00:27:15.153 "rw_mbytes_per_sec": 0, 00:27:15.153 "r_mbytes_per_sec": 0, 00:27:15.153 "w_mbytes_per_sec": 0 00:27:15.153 }, 00:27:15.153 "claimed": false, 00:27:15.153 "zoned": false, 00:27:15.153 "supported_io_types": { 00:27:15.153 "read": true, 00:27:15.153 "write": true, 00:27:15.153 "unmap": true, 00:27:15.153 "flush": false, 00:27:15.153 "reset": true, 00:27:15.153 "nvme_admin": false, 00:27:15.153 "nvme_io": false, 00:27:15.153 "nvme_io_md": false, 00:27:15.153 "write_zeroes": true, 00:27:15.153 "zcopy": false, 00:27:15.153 "get_zone_info": false, 00:27:15.153 "zone_management": false, 00:27:15.153 "zone_append": false, 00:27:15.153 "compare": false, 00:27:15.153 "compare_and_write": false, 00:27:15.153 "abort": false, 00:27:15.153 "seek_hole": true, 00:27:15.153 "seek_data": true, 00:27:15.153 "copy": false, 00:27:15.153 "nvme_iov_md": false 00:27:15.153 }, 00:27:15.153 "driver_specific": { 00:27:15.153 "lvol": { 00:27:15.153 "lvol_store_uuid": "5326c685-81c8-47c8-a7ff-8d59764179a4", 00:27:15.153 "base_bdev": "Nvme0n1", 00:27:15.153 "thin_provision": true, 00:27:15.153 "num_allocated_clusters": 0, 00:27:15.153 "snapshot": false, 00:27:15.153 "clone": false, 00:27:15.153 "esnap_clone": false 00:27:15.153 } 00:27:15.153 } 00:27:15.153 } 00:27:15.153 ] 00:27:15.153 17:39:26 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:27:15.153 17:39:26 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:27:15.153 17:39:26 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:27:15.413 [2024-07-15 17:39:26.596202] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:27:15.413 COMP_lvs0/lv0 00:27:15.413 17:39:26 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:27:15.413 17:39:26 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:27:15.413 17:39:26 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:15.413 17:39:26 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:27:15.413 17:39:26 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:15.413 17:39:26 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:15.413 17:39:26 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:15.674 17:39:26 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:27:15.934 [ 00:27:15.934 { 00:27:15.934 "name": "COMP_lvs0/lv0", 00:27:15.934 "aliases": [ 00:27:15.934 "d4bff506-9a35-50b6-8f06-5874afee4119" 00:27:15.934 ], 00:27:15.934 "product_name": "compress", 00:27:15.934 "block_size": 512, 00:27:15.934 "num_blocks": 200704, 00:27:15.934 "uuid": 
"d4bff506-9a35-50b6-8f06-5874afee4119", 00:27:15.934 "assigned_rate_limits": { 00:27:15.934 "rw_ios_per_sec": 0, 00:27:15.934 "rw_mbytes_per_sec": 0, 00:27:15.934 "r_mbytes_per_sec": 0, 00:27:15.934 "w_mbytes_per_sec": 0 00:27:15.934 }, 00:27:15.934 "claimed": false, 00:27:15.934 "zoned": false, 00:27:15.934 "supported_io_types": { 00:27:15.934 "read": true, 00:27:15.934 "write": true, 00:27:15.934 "unmap": false, 00:27:15.934 "flush": false, 00:27:15.934 "reset": false, 00:27:15.934 "nvme_admin": false, 00:27:15.934 "nvme_io": false, 00:27:15.934 "nvme_io_md": false, 00:27:15.934 "write_zeroes": true, 00:27:15.934 "zcopy": false, 00:27:15.934 "get_zone_info": false, 00:27:15.934 "zone_management": false, 00:27:15.934 "zone_append": false, 00:27:15.934 "compare": false, 00:27:15.934 "compare_and_write": false, 00:27:15.934 "abort": false, 00:27:15.934 "seek_hole": false, 00:27:15.934 "seek_data": false, 00:27:15.934 "copy": false, 00:27:15.934 "nvme_iov_md": false 00:27:15.934 }, 00:27:15.934 "driver_specific": { 00:27:15.934 "compress": { 00:27:15.934 "name": "COMP_lvs0/lv0", 00:27:15.934 "base_bdev_name": "382037c8-5d58-4127-b1fc-00cfbde67bce" 00:27:15.934 } 00:27:15.934 } 00:27:15.934 } 00:27:15.934 ] 00:27:15.934 17:39:27 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:27:15.934 17:39:27 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:27:15.934 [2024-07-15 17:39:27.137938] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fc0801b1350 PMD being used: compress_qat 00:27:15.934 I/O targets: 00:27:15.934 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:27:15.934 00:27:15.934 00:27:15.934 CUnit - A unit testing framework for C - Version 2.1-3 00:27:15.934 http://cunit.sourceforge.net/ 00:27:15.934 00:27:15.934 00:27:15.934 Suite: bdevio tests on: COMP_lvs0/lv0 00:27:15.934 Test: blockdev write read block ...passed 00:27:15.934 Test: blockdev write zeroes read block ...passed 00:27:15.934 Test: blockdev write zeroes read no split ...passed 00:27:15.934 Test: blockdev write zeroes read split ...passed 00:27:16.194 Test: blockdev write zeroes read split partial ...passed 00:27:16.194 Test: blockdev reset ...[2024-07-15 17:39:27.272193] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:27:16.194 passed 00:27:16.194 Test: blockdev write read 8 blocks ...passed 00:27:16.194 Test: blockdev write read size > 128k ...passed 00:27:16.194 Test: blockdev write read invalid size ...passed 00:27:16.194 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:16.194 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:16.194 Test: blockdev write read max offset ...passed 00:27:16.194 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:16.194 Test: blockdev writev readv 8 blocks ...passed 00:27:16.194 Test: blockdev writev readv 30 x 1block ...passed 00:27:16.194 Test: blockdev writev readv block ...passed 00:27:16.194 Test: blockdev writev readv size > 128k ...passed 00:27:16.194 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:16.194 Test: blockdev comparev and writev ...passed 00:27:16.194 Test: blockdev nvme passthru rw ...passed 00:27:16.194 Test: blockdev nvme passthru vendor specific ...passed 00:27:16.194 Test: blockdev nvme admin passthru ...passed 00:27:16.194 Test: blockdev copy ...passed 00:27:16.194 00:27:16.194 Run Summary: Type Total Ran 
Passed Failed Inactive 00:27:16.194 suites 1 1 n/a 0 0 00:27:16.194 tests 23 23 23 0 0 00:27:16.194 asserts 130 130 130 0 n/a 00:27:16.194 00:27:16.194 Elapsed time = 0.359 seconds 00:27:16.194 0 00:27:16.194 17:39:27 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:27:16.194 17:39:27 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:27:16.454 17:39:27 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:27:16.713 17:39:27 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:27:16.713 17:39:27 compress_compdev -- compress/compress.sh@62 -- # killprocess 2931540 00:27:16.713 17:39:27 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2931540 ']' 00:27:16.713 17:39:27 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2931540 00:27:16.713 17:39:27 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:27:16.713 17:39:27 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:16.713 17:39:27 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2931540 00:27:16.713 17:39:27 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:16.713 17:39:27 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:16.713 17:39:27 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2931540' 00:27:16.713 killing process with pid 2931540 00:27:16.713 17:39:27 compress_compdev -- common/autotest_common.sh@967 -- # kill 2931540 00:27:16.713 17:39:27 compress_compdev -- common/autotest_common.sh@972 -- # wait 2931540 00:27:19.298 17:39:30 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:27:19.298 17:39:30 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:27:19.298 00:27:19.298 real 0m49.362s 00:27:19.298 user 1m51.483s 00:27:19.298 sys 0m4.391s 00:27:19.298 17:39:30 compress_compdev -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:19.298 17:39:30 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:27:19.298 ************************************ 00:27:19.298 END TEST compress_compdev 00:27:19.298 ************************************ 00:27:19.298 17:39:30 -- common/autotest_common.sh@1142 -- # return 0 00:27:19.298 17:39:30 -- spdk/autotest.sh@349 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:27:19.298 17:39:30 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:27:19.298 17:39:30 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:19.298 17:39:30 -- common/autotest_common.sh@10 -- # set +x 00:27:19.298 ************************************ 00:27:19.298 START TEST compress_isal 00:27:19.298 ************************************ 00:27:19.298 17:39:30 compress_isal -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:27:19.298 * Looking for test storage... 
00:27:19.298 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:27:19.298 17:39:30 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:27:19.298 17:39:30 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:27:19.298 17:39:30 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:27:19.298 17:39:30 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:27:19.298 17:39:30 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:27:19.298 17:39:30 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:27:19.298 17:39:30 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:27:19.298 17:39:30 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:27:19.298 17:39:30 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:27:19.298 17:39:30 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:27:19.298 17:39:30 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:27:19.298 17:39:30 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:27:19.298 17:39:30 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:27:19.298 17:39:30 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:27:19.298 17:39:30 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:27:19.298 17:39:30 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:27:19.298 17:39:30 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:27:19.298 17:39:30 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:27:19.298 17:39:30 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:27:19.298 17:39:30 compress_isal -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:27:19.298 17:39:30 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:27:19.299 17:39:30 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:27:19.299 17:39:30 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:19.299 17:39:30 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:19.299 17:39:30 compress_isal -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:19.299 17:39:30 compress_isal -- paths/export.sh@5 -- # export PATH 00:27:19.299 17:39:30 compress_isal -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:19.299 17:39:30 compress_isal -- nvmf/common.sh@47 -- # : 0 00:27:19.299 17:39:30 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:27:19.299 17:39:30 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:27:19.299 17:39:30 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:27:19.299 17:39:30 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:27:19.299 17:39:30 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:27:19.299 17:39:30 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:27:19.299 17:39:30 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:27:19.299 17:39:30 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:27:19.299 17:39:30 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:19.299 17:39:30 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:27:19.299 17:39:30 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:27:19.299 17:39:30 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:27:19.299 17:39:30 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:27:19.299 17:39:30 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=2933160 00:27:19.299 17:39:30 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:27:19.299 17:39:30 compress_isal -- compress/compress.sh@73 -- # waitforlisten 2933160 00:27:19.299 17:39:30 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:27:19.299 17:39:30 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2933160 ']' 00:27:19.299 17:39:30 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:19.299 17:39:30 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:19.299 17:39:30 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:19.299 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
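The run_bdevperf helper traced above follows the same shape for every pass in this suite: start bdevperf idle, build the compress stack over RPC, kick off the workload, then tear everything down. A minimal sketch of that flow under the paths shown in the trace (waitforlisten and killprocess are the autotest_common.sh helpers the xtrace names, so this is not standalone):

    # run from the SPDK tree root
    ./build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 &
    bdevperf_pid=$!
    waitforlisten $bdevperf_pid                         # block until /var/tmp/spdk.sock is up
    # create_vols: lvstore + lvol + compress bdev via rpc.py (sequence shown further down)
    ./examples/bdev/bdevperf/bdevperf.py perform_tests  # starts the 3-second verify run
    # destroy_vols, then:
    killprocess $bdevperf_pid

The -z flag appears to keep bdevperf idle until perform_tests arrives over the RPC socket, which is why the volume setup in the log only happens after the "Waiting for process to start up..." line.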
00:27:19.299 17:39:30 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:19.299 17:39:30 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:27:19.299 [2024-07-15 17:39:30.513408] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:27:19.299 [2024-07-15 17:39:30.513471] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2933160 ] 00:27:19.560 [2024-07-15 17:39:30.596127] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:19.560 [2024-07-15 17:39:30.697706] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:27:19.560 [2024-07-15 17:39:30.697721] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:27:20.526 17:39:31 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:20.526 17:39:31 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:27:20.526 17:39:31 compress_isal -- compress/compress.sh@74 -- # create_vols 00:27:20.526 17:39:31 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:27:20.526 17:39:31 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:27:23.824 17:39:34 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:27:23.824 17:39:34 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:27:23.824 17:39:34 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:23.824 17:39:34 compress_isal -- common/autotest_common.sh@899 -- # local i 00:27:23.824 17:39:34 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:23.824 17:39:34 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:23.824 17:39:34 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:24.084 17:39:35 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:27:24.345 [ 00:27:24.345 { 00:27:24.345 "name": "Nvme0n1", 00:27:24.345 "aliases": [ 00:27:24.345 "15bab9ce-6c4b-404e-8ec0-c8bf30dc53a5" 00:27:24.345 ], 00:27:24.345 "product_name": "NVMe disk", 00:27:24.345 "block_size": 512, 00:27:24.345 "num_blocks": 3907029168, 00:27:24.345 "uuid": "15bab9ce-6c4b-404e-8ec0-c8bf30dc53a5", 00:27:24.345 "assigned_rate_limits": { 00:27:24.345 "rw_ios_per_sec": 0, 00:27:24.345 "rw_mbytes_per_sec": 0, 00:27:24.345 "r_mbytes_per_sec": 0, 00:27:24.345 "w_mbytes_per_sec": 0 00:27:24.345 }, 00:27:24.345 "claimed": false, 00:27:24.345 "zoned": false, 00:27:24.345 "supported_io_types": { 00:27:24.345 "read": true, 00:27:24.345 "write": true, 00:27:24.345 "unmap": true, 00:27:24.345 "flush": true, 00:27:24.345 "reset": true, 00:27:24.345 "nvme_admin": true, 00:27:24.345 "nvme_io": true, 00:27:24.345 "nvme_io_md": false, 00:27:24.345 "write_zeroes": true, 00:27:24.345 "zcopy": false, 00:27:24.345 "get_zone_info": false, 00:27:24.345 "zone_management": false, 00:27:24.345 "zone_append": false, 00:27:24.345 "compare": false, 00:27:24.345 "compare_and_write": false, 00:27:24.345 "abort": true, 00:27:24.345 "seek_hole": false, 00:27:24.345 "seek_data": false, 00:27:24.345 "copy": false, 00:27:24.345 
"nvme_iov_md": false 00:27:24.345 }, 00:27:24.345 "driver_specific": { 00:27:24.345 "nvme": [ 00:27:24.345 { 00:27:24.345 "pci_address": "0000:65:00.0", 00:27:24.345 "trid": { 00:27:24.345 "trtype": "PCIe", 00:27:24.345 "traddr": "0000:65:00.0" 00:27:24.345 }, 00:27:24.345 "ctrlr_data": { 00:27:24.345 "cntlid": 0, 00:27:24.345 "vendor_id": "0x8086", 00:27:24.345 "model_number": "INTEL SSDPE2KX020T8", 00:27:24.345 "serial_number": "PHLJ9512038S2P0BGN", 00:27:24.345 "firmware_revision": "VDV10184", 00:27:24.345 "oacs": { 00:27:24.345 "security": 0, 00:27:24.345 "format": 1, 00:27:24.345 "firmware": 1, 00:27:24.345 "ns_manage": 1 00:27:24.345 }, 00:27:24.345 "multi_ctrlr": false, 00:27:24.345 "ana_reporting": false 00:27:24.345 }, 00:27:24.345 "vs": { 00:27:24.345 "nvme_version": "1.2" 00:27:24.345 }, 00:27:24.345 "ns_data": { 00:27:24.345 "id": 1, 00:27:24.345 "can_share": false 00:27:24.345 } 00:27:24.345 } 00:27:24.345 ], 00:27:24.345 "mp_policy": "active_passive" 00:27:24.345 } 00:27:24.345 } 00:27:24.345 ] 00:27:24.345 17:39:35 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:27:24.345 17:39:35 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:27:26.257 6783f114-6043-4753-b54a-004ecd3c77fc 00:27:26.257 17:39:37 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:27:26.257 3b750381-1ef2-4a86-bac9-a9d7f0633be0 00:27:26.257 17:39:37 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:27:26.257 17:39:37 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:27:26.257 17:39:37 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:26.257 17:39:37 compress_isal -- common/autotest_common.sh@899 -- # local i 00:27:26.257 17:39:37 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:26.257 17:39:37 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:26.257 17:39:37 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:26.519 17:39:37 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:27:26.519 [ 00:27:26.519 { 00:27:26.519 "name": "3b750381-1ef2-4a86-bac9-a9d7f0633be0", 00:27:26.519 "aliases": [ 00:27:26.519 "lvs0/lv0" 00:27:26.519 ], 00:27:26.519 "product_name": "Logical Volume", 00:27:26.519 "block_size": 512, 00:27:26.519 "num_blocks": 204800, 00:27:26.519 "uuid": "3b750381-1ef2-4a86-bac9-a9d7f0633be0", 00:27:26.519 "assigned_rate_limits": { 00:27:26.519 "rw_ios_per_sec": 0, 00:27:26.519 "rw_mbytes_per_sec": 0, 00:27:26.519 "r_mbytes_per_sec": 0, 00:27:26.519 "w_mbytes_per_sec": 0 00:27:26.519 }, 00:27:26.519 "claimed": false, 00:27:26.519 "zoned": false, 00:27:26.519 "supported_io_types": { 00:27:26.519 "read": true, 00:27:26.519 "write": true, 00:27:26.519 "unmap": true, 00:27:26.519 "flush": false, 00:27:26.519 "reset": true, 00:27:26.519 "nvme_admin": false, 00:27:26.519 "nvme_io": false, 00:27:26.519 "nvme_io_md": false, 00:27:26.519 "write_zeroes": true, 00:27:26.519 "zcopy": false, 00:27:26.519 "get_zone_info": false, 00:27:26.519 "zone_management": false, 00:27:26.519 "zone_append": false, 00:27:26.519 "compare": false, 00:27:26.519 "compare_and_write": false, 
00:27:26.519 "abort": false, 00:27:26.519 "seek_hole": true, 00:27:26.519 "seek_data": true, 00:27:26.519 "copy": false, 00:27:26.519 "nvme_iov_md": false 00:27:26.519 }, 00:27:26.519 "driver_specific": { 00:27:26.519 "lvol": { 00:27:26.519 "lvol_store_uuid": "6783f114-6043-4753-b54a-004ecd3c77fc", 00:27:26.519 "base_bdev": "Nvme0n1", 00:27:26.519 "thin_provision": true, 00:27:26.519 "num_allocated_clusters": 0, 00:27:26.519 "snapshot": false, 00:27:26.519 "clone": false, 00:27:26.519 "esnap_clone": false 00:27:26.519 } 00:27:26.519 } 00:27:26.519 } 00:27:26.519 ] 00:27:26.519 17:39:37 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:27:26.519 17:39:37 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:27:26.519 17:39:37 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:27:26.799 [2024-07-15 17:39:37.976678] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:27:26.799 COMP_lvs0/lv0 00:27:26.799 17:39:38 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:27:26.799 17:39:38 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:27:26.799 17:39:38 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:26.799 17:39:38 compress_isal -- common/autotest_common.sh@899 -- # local i 00:27:26.799 17:39:38 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:26.799 17:39:38 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:26.799 17:39:38 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:27.060 17:39:38 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:27:27.632 [ 00:27:27.632 { 00:27:27.632 "name": "COMP_lvs0/lv0", 00:27:27.632 "aliases": [ 00:27:27.632 "25c3befd-e9ac-5f0a-8619-965451021bc0" 00:27:27.632 ], 00:27:27.632 "product_name": "compress", 00:27:27.632 "block_size": 512, 00:27:27.632 "num_blocks": 200704, 00:27:27.632 "uuid": "25c3befd-e9ac-5f0a-8619-965451021bc0", 00:27:27.632 "assigned_rate_limits": { 00:27:27.632 "rw_ios_per_sec": 0, 00:27:27.632 "rw_mbytes_per_sec": 0, 00:27:27.632 "r_mbytes_per_sec": 0, 00:27:27.632 "w_mbytes_per_sec": 0 00:27:27.632 }, 00:27:27.632 "claimed": false, 00:27:27.632 "zoned": false, 00:27:27.632 "supported_io_types": { 00:27:27.632 "read": true, 00:27:27.632 "write": true, 00:27:27.632 "unmap": false, 00:27:27.632 "flush": false, 00:27:27.632 "reset": false, 00:27:27.632 "nvme_admin": false, 00:27:27.632 "nvme_io": false, 00:27:27.632 "nvme_io_md": false, 00:27:27.632 "write_zeroes": true, 00:27:27.632 "zcopy": false, 00:27:27.632 "get_zone_info": false, 00:27:27.632 "zone_management": false, 00:27:27.632 "zone_append": false, 00:27:27.632 "compare": false, 00:27:27.632 "compare_and_write": false, 00:27:27.632 "abort": false, 00:27:27.632 "seek_hole": false, 00:27:27.632 "seek_data": false, 00:27:27.632 "copy": false, 00:27:27.632 "nvme_iov_md": false 00:27:27.632 }, 00:27:27.632 "driver_specific": { 00:27:27.632 "compress": { 00:27:27.632 "name": "COMP_lvs0/lv0", 00:27:27.632 "base_bdev_name": "3b750381-1ef2-4a86-bac9-a9d7f0633be0" 00:27:27.632 } 00:27:27.632 } 00:27:27.632 } 00:27:27.632 ] 00:27:27.632 17:39:38 compress_isal -- 
common/autotest_common.sh@905 -- # return 0 00:27:27.632 17:39:38 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:27:27.892 Running I/O for 3 seconds... 00:27:31.188 00:27:31.188 Latency(us) 00:27:31.188 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:31.188 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:27:31.188 Verification LBA range: start 0x0 length 0x3100 00:27:31.188 COMP_lvs0/lv0 : 3.02 1095.54 4.28 0.00 0.00 29110.44 557.69 30449.03 00:27:31.188 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:27:31.188 Verification LBA range: start 0x3100 length 0x3100 00:27:31.188 COMP_lvs0/lv0 : 3.02 1104.54 4.31 0.00 0.00 28788.99 214.25 29037.49 00:27:31.188 =================================================================================================================== 00:27:31.188 Total : 2200.08 8.59 0.00 0.00 28948.97 214.25 30449.03 00:27:31.188 0 00:27:31.188 17:39:42 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:27:31.188 17:39:42 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:27:31.188 17:39:42 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:27:31.449 17:39:42 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:27:31.449 17:39:42 compress_isal -- compress/compress.sh@78 -- # killprocess 2933160 00:27:31.449 17:39:42 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2933160 ']' 00:27:31.449 17:39:42 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2933160 00:27:31.449 17:39:42 compress_isal -- common/autotest_common.sh@953 -- # uname 00:27:31.449 17:39:42 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:31.449 17:39:42 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2933160 00:27:31.449 17:39:42 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:27:31.449 17:39:42 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:27:31.449 17:39:42 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2933160' 00:27:31.449 killing process with pid 2933160 00:27:31.449 17:39:42 compress_isal -- common/autotest_common.sh@967 -- # kill 2933160 00:27:31.449 Received shutdown signal, test time was about 3.000000 seconds 00:27:31.449 00:27:31.449 Latency(us) 00:27:31.449 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:31.449 =================================================================================================================== 00:27:31.449 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:31.449 17:39:42 compress_isal -- common/autotest_common.sh@972 -- # wait 2933160 00:27:34.027 17:39:45 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:27:34.027 17:39:45 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:27:34.027 17:39:45 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=2935612 00:27:34.027 17:39:45 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:27:34.027 17:39:45 compress_isal -- compress/compress.sh@73 -- # waitforlisten 2935612 
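Each pass rebuilds the whole stack, so bdev_compress_create is exercised from scratch every time. Condensed from the rpc.py calls in this trace, the create_vols/destroy_vols sequence looks like this (rpc.py lives at scripts/rpc.py in the SPDK tree; Nvme0n1 itself appears to come from gen_nvme.sh fed into load_subsystem_config):

    scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
    scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100    # the thin-provisioned 100 MiB (204800 x 512 B) volume seen above
    scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512
    # ... run the workload against COMP_lvs0/lv0 ...
    scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0
    scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0

The first pass omitted -l and this second pass adds -l 512; judging by the bdev_get_bdevs dumps, -l sets the logical block size the compress bdev reports (512 B / 200704 blocks here versus 4 KiB / 25088 blocks in the -l 4096 pass further down).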
00:27:34.027 17:39:45 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:27:34.027 17:39:45 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2935612 ']' 00:27:34.027 17:39:45 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:34.027 17:39:45 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:34.027 17:39:45 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:34.027 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:34.027 17:39:45 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:34.028 17:39:45 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:27:34.028 [2024-07-15 17:39:45.118450] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:27:34.028 [2024-07-15 17:39:45.118597] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2935612 ] 00:27:34.028 [2024-07-15 17:39:45.253984] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:34.288 [2024-07-15 17:39:45.354680] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:27:34.288 [2024-07-15 17:39:45.354687] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:27:34.858 17:39:45 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:34.858 17:39:45 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:27:34.858 17:39:45 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:27:34.858 17:39:45 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:27:34.858 17:39:45 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:27:38.158 17:39:49 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:27:38.158 17:39:49 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:27:38.158 17:39:49 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:38.158 17:39:49 compress_isal -- common/autotest_common.sh@899 -- # local i 00:27:38.158 17:39:49 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:38.158 17:39:49 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:38.158 17:39:49 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:38.158 17:39:49 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:27:38.158 [ 00:27:38.158 { 00:27:38.158 "name": "Nvme0n1", 00:27:38.158 "aliases": [ 00:27:38.158 "6992dd3f-6bdc-4fc9-bc52-df1d3f1b8898" 00:27:38.158 ], 00:27:38.158 "product_name": "NVMe disk", 00:27:38.158 "block_size": 512, 00:27:38.158 "num_blocks": 3907029168, 00:27:38.158 "uuid": "6992dd3f-6bdc-4fc9-bc52-df1d3f1b8898", 00:27:38.158 "assigned_rate_limits": { 00:27:38.158 "rw_ios_per_sec": 0, 00:27:38.158 "rw_mbytes_per_sec": 0, 00:27:38.158 "r_mbytes_per_sec": 0, 00:27:38.158 
"w_mbytes_per_sec": 0 00:27:38.158 }, 00:27:38.158 "claimed": false, 00:27:38.158 "zoned": false, 00:27:38.158 "supported_io_types": { 00:27:38.158 "read": true, 00:27:38.158 "write": true, 00:27:38.158 "unmap": true, 00:27:38.158 "flush": true, 00:27:38.158 "reset": true, 00:27:38.158 "nvme_admin": true, 00:27:38.158 "nvme_io": true, 00:27:38.158 "nvme_io_md": false, 00:27:38.158 "write_zeroes": true, 00:27:38.158 "zcopy": false, 00:27:38.158 "get_zone_info": false, 00:27:38.158 "zone_management": false, 00:27:38.158 "zone_append": false, 00:27:38.158 "compare": false, 00:27:38.158 "compare_and_write": false, 00:27:38.158 "abort": true, 00:27:38.158 "seek_hole": false, 00:27:38.158 "seek_data": false, 00:27:38.158 "copy": false, 00:27:38.158 "nvme_iov_md": false 00:27:38.158 }, 00:27:38.158 "driver_specific": { 00:27:38.158 "nvme": [ 00:27:38.158 { 00:27:38.158 "pci_address": "0000:65:00.0", 00:27:38.158 "trid": { 00:27:38.158 "trtype": "PCIe", 00:27:38.158 "traddr": "0000:65:00.0" 00:27:38.158 }, 00:27:38.158 "ctrlr_data": { 00:27:38.158 "cntlid": 0, 00:27:38.158 "vendor_id": "0x8086", 00:27:38.158 "model_number": "INTEL SSDPE2KX020T8", 00:27:38.158 "serial_number": "PHLJ9512038S2P0BGN", 00:27:38.158 "firmware_revision": "VDV10184", 00:27:38.158 "oacs": { 00:27:38.158 "security": 0, 00:27:38.158 "format": 1, 00:27:38.158 "firmware": 1, 00:27:38.158 "ns_manage": 1 00:27:38.158 }, 00:27:38.158 "multi_ctrlr": false, 00:27:38.158 "ana_reporting": false 00:27:38.158 }, 00:27:38.158 "vs": { 00:27:38.158 "nvme_version": "1.2" 00:27:38.158 }, 00:27:38.158 "ns_data": { 00:27:38.158 "id": 1, 00:27:38.158 "can_share": false 00:27:38.158 } 00:27:38.158 } 00:27:38.158 ], 00:27:38.158 "mp_policy": "active_passive" 00:27:38.158 } 00:27:38.158 } 00:27:38.158 ] 00:27:38.158 17:39:49 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:27:38.158 17:39:49 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:27:39.540 2dabbbaf-20c3-4fd6-a009-26c65082ae06 00:27:39.540 17:39:50 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:27:39.800 b0bd7cae-e3ec-4437-b8a6-2fde84fd1a8d 00:27:39.800 17:39:50 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:27:39.800 17:39:50 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:27:39.800 17:39:50 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:39.800 17:39:50 compress_isal -- common/autotest_common.sh@899 -- # local i 00:27:39.800 17:39:50 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:39.800 17:39:50 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:39.800 17:39:50 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:39.800 17:39:51 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:27:40.059 [ 00:27:40.059 { 00:27:40.059 "name": "b0bd7cae-e3ec-4437-b8a6-2fde84fd1a8d", 00:27:40.059 "aliases": [ 00:27:40.059 "lvs0/lv0" 00:27:40.059 ], 00:27:40.059 "product_name": "Logical Volume", 00:27:40.059 "block_size": 512, 00:27:40.059 "num_blocks": 204800, 00:27:40.059 "uuid": "b0bd7cae-e3ec-4437-b8a6-2fde84fd1a8d", 00:27:40.059 
"assigned_rate_limits": { 00:27:40.059 "rw_ios_per_sec": 0, 00:27:40.059 "rw_mbytes_per_sec": 0, 00:27:40.059 "r_mbytes_per_sec": 0, 00:27:40.059 "w_mbytes_per_sec": 0 00:27:40.059 }, 00:27:40.059 "claimed": false, 00:27:40.059 "zoned": false, 00:27:40.059 "supported_io_types": { 00:27:40.059 "read": true, 00:27:40.059 "write": true, 00:27:40.059 "unmap": true, 00:27:40.059 "flush": false, 00:27:40.059 "reset": true, 00:27:40.059 "nvme_admin": false, 00:27:40.059 "nvme_io": false, 00:27:40.059 "nvme_io_md": false, 00:27:40.059 "write_zeroes": true, 00:27:40.059 "zcopy": false, 00:27:40.059 "get_zone_info": false, 00:27:40.059 "zone_management": false, 00:27:40.059 "zone_append": false, 00:27:40.059 "compare": false, 00:27:40.059 "compare_and_write": false, 00:27:40.059 "abort": false, 00:27:40.059 "seek_hole": true, 00:27:40.059 "seek_data": true, 00:27:40.059 "copy": false, 00:27:40.059 "nvme_iov_md": false 00:27:40.059 }, 00:27:40.059 "driver_specific": { 00:27:40.059 "lvol": { 00:27:40.059 "lvol_store_uuid": "2dabbbaf-20c3-4fd6-a009-26c65082ae06", 00:27:40.059 "base_bdev": "Nvme0n1", 00:27:40.059 "thin_provision": true, 00:27:40.059 "num_allocated_clusters": 0, 00:27:40.059 "snapshot": false, 00:27:40.059 "clone": false, 00:27:40.059 "esnap_clone": false 00:27:40.059 } 00:27:40.059 } 00:27:40.059 } 00:27:40.059 ] 00:27:40.059 17:39:51 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:27:40.059 17:39:51 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:27:40.060 17:39:51 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:27:40.319 [2024-07-15 17:39:51.464126] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:27:40.319 COMP_lvs0/lv0 00:27:40.319 17:39:51 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:27:40.319 17:39:51 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:27:40.319 17:39:51 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:40.319 17:39:51 compress_isal -- common/autotest_common.sh@899 -- # local i 00:27:40.319 17:39:51 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:40.319 17:39:51 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:40.319 17:39:51 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:40.579 17:39:51 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:27:40.579 [ 00:27:40.579 { 00:27:40.579 "name": "COMP_lvs0/lv0", 00:27:40.579 "aliases": [ 00:27:40.579 "10f75ddb-24c4-5444-bc49-45f242c5c88e" 00:27:40.579 ], 00:27:40.579 "product_name": "compress", 00:27:40.579 "block_size": 512, 00:27:40.579 "num_blocks": 200704, 00:27:40.579 "uuid": "10f75ddb-24c4-5444-bc49-45f242c5c88e", 00:27:40.579 "assigned_rate_limits": { 00:27:40.579 "rw_ios_per_sec": 0, 00:27:40.579 "rw_mbytes_per_sec": 0, 00:27:40.579 "r_mbytes_per_sec": 0, 00:27:40.579 "w_mbytes_per_sec": 0 00:27:40.579 }, 00:27:40.579 "claimed": false, 00:27:40.579 "zoned": false, 00:27:40.579 "supported_io_types": { 00:27:40.579 "read": true, 00:27:40.579 "write": true, 00:27:40.579 "unmap": false, 00:27:40.579 "flush": false, 00:27:40.579 "reset": false, 00:27:40.579 "nvme_admin": 
false, 00:27:40.579 "nvme_io": false, 00:27:40.579 "nvme_io_md": false, 00:27:40.579 "write_zeroes": true, 00:27:40.579 "zcopy": false, 00:27:40.579 "get_zone_info": false, 00:27:40.579 "zone_management": false, 00:27:40.579 "zone_append": false, 00:27:40.579 "compare": false, 00:27:40.579 "compare_and_write": false, 00:27:40.579 "abort": false, 00:27:40.579 "seek_hole": false, 00:27:40.579 "seek_data": false, 00:27:40.579 "copy": false, 00:27:40.579 "nvme_iov_md": false 00:27:40.579 }, 00:27:40.579 "driver_specific": { 00:27:40.579 "compress": { 00:27:40.579 "name": "COMP_lvs0/lv0", 00:27:40.579 "base_bdev_name": "b0bd7cae-e3ec-4437-b8a6-2fde84fd1a8d" 00:27:40.579 } 00:27:40.579 } 00:27:40.579 } 00:27:40.579 ] 00:27:40.579 17:39:51 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:27:40.579 17:39:51 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:27:40.839 Running I/O for 3 seconds... 00:27:44.172 00:27:44.172 Latency(us) 00:27:44.172 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:44.172 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:27:44.172 Verification LBA range: start 0x0 length 0x3100 00:27:44.172 COMP_lvs0/lv0 : 3.02 1092.22 4.27 0.00 0.00 29151.72 267.82 29642.44 00:27:44.172 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:27:44.172 Verification LBA range: start 0x3100 length 0x3100 00:27:44.172 COMP_lvs0/lv0 : 3.02 1102.44 4.31 0.00 0.00 28850.39 989.34 29844.09 00:27:44.172 =================================================================================================================== 00:27:44.172 Total : 2194.66 8.57 0.00 0.00 29000.30 267.82 29844.09 00:27:44.172 0 00:27:44.172 17:39:55 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:27:44.172 17:39:55 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:27:44.172 17:39:55 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:27:44.762 17:39:55 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:27:44.762 17:39:55 compress_isal -- compress/compress.sh@78 -- # killprocess 2935612 00:27:44.762 17:39:55 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2935612 ']' 00:27:44.762 17:39:55 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2935612 00:27:44.762 17:39:55 compress_isal -- common/autotest_common.sh@953 -- # uname 00:27:44.762 17:39:55 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:44.762 17:39:55 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2935612 00:27:44.762 17:39:55 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:27:44.762 17:39:55 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:27:44.762 17:39:55 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2935612' 00:27:44.762 killing process with pid 2935612 00:27:44.762 17:39:55 compress_isal -- common/autotest_common.sh@967 -- # kill 2935612 00:27:44.762 Received shutdown signal, test time was about 3.000000 seconds 00:27:44.762 00:27:44.762 Latency(us) 00:27:44.762 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:44.762 
=================================================================================================================== 00:27:44.762 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:44.762 17:39:55 compress_isal -- common/autotest_common.sh@972 -- # wait 2935612 00:27:47.306 17:39:58 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:27:47.306 17:39:58 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:27:47.306 17:39:58 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=2937610 00:27:47.306 17:39:58 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:27:47.306 17:39:58 compress_isal -- compress/compress.sh@73 -- # waitforlisten 2937610 00:27:47.306 17:39:58 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:27:47.306 17:39:58 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2937610 ']' 00:27:47.306 17:39:58 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:47.306 17:39:58 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:47.306 17:39:58 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:47.306 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:47.306 17:39:58 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:47.306 17:39:58 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:27:47.306 [2024-07-15 17:39:58.278737] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
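Every bdev created in this suite is gated on the same waitforbdev helper before it is used, which is what produces the repeated bdev_wait_for_examine / bdev_get_bdevs -t 2000 pairs in the trace. An approximate reconstruction from the xtrace (autotest_common.sh lines ~897-905; the real helper may carry extra retry or error handling not visible here):

    waitforbdev() {
        local bdev_name=$1
        local bdev_timeout=${2:-2000}   # ms; every call in this log takes the 2000 default
        scripts/rpc.py bdev_wait_for_examine
        scripts/rpc.py bdev_get_bdevs -b "$bdev_name" -t "$bdev_timeout"
    }
    waitforbdev COMP_lvs0/lv0

The -t argument lets bdev_get_bdevs wait up to that long for the named bdev to register, so the helper only returns once the compress vbdev has actually appeared.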
00:27:47.306 [2024-07-15 17:39:58.278879] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2937610 ] 00:27:47.306 [2024-07-15 17:39:58.413862] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:47.306 [2024-07-15 17:39:58.515917] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:27:47.306 [2024-07-15 17:39:58.516054] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:27:47.877 17:39:59 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:47.877 17:39:59 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:27:47.877 17:39:59 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:27:47.877 17:39:59 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:27:47.877 17:39:59 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:27:51.176 17:40:02 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:27:51.176 17:40:02 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:27:51.176 17:40:02 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:51.176 17:40:02 compress_isal -- common/autotest_common.sh@899 -- # local i 00:27:51.176 17:40:02 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:51.176 17:40:02 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:51.176 17:40:02 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:51.176 17:40:02 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:27:51.436 [ 00:27:51.436 { 00:27:51.436 "name": "Nvme0n1", 00:27:51.436 "aliases": [ 00:27:51.436 "5e61cf90-09a6-4803-b533-036f59944e05" 00:27:51.436 ], 00:27:51.436 "product_name": "NVMe disk", 00:27:51.436 "block_size": 512, 00:27:51.436 "num_blocks": 3907029168, 00:27:51.436 "uuid": "5e61cf90-09a6-4803-b533-036f59944e05", 00:27:51.436 "assigned_rate_limits": { 00:27:51.436 "rw_ios_per_sec": 0, 00:27:51.436 "rw_mbytes_per_sec": 0, 00:27:51.436 "r_mbytes_per_sec": 0, 00:27:51.436 "w_mbytes_per_sec": 0 00:27:51.436 }, 00:27:51.436 "claimed": false, 00:27:51.436 "zoned": false, 00:27:51.436 "supported_io_types": { 00:27:51.436 "read": true, 00:27:51.436 "write": true, 00:27:51.436 "unmap": true, 00:27:51.436 "flush": true, 00:27:51.436 "reset": true, 00:27:51.436 "nvme_admin": true, 00:27:51.436 "nvme_io": true, 00:27:51.436 "nvme_io_md": false, 00:27:51.436 "write_zeroes": true, 00:27:51.436 "zcopy": false, 00:27:51.436 "get_zone_info": false, 00:27:51.436 "zone_management": false, 00:27:51.436 "zone_append": false, 00:27:51.436 "compare": false, 00:27:51.436 "compare_and_write": false, 00:27:51.436 "abort": true, 00:27:51.436 "seek_hole": false, 00:27:51.436 "seek_data": false, 00:27:51.436 "copy": false, 00:27:51.436 "nvme_iov_md": false 00:27:51.436 }, 00:27:51.436 "driver_specific": { 00:27:51.436 "nvme": [ 00:27:51.436 { 00:27:51.436 "pci_address": "0000:65:00.0", 00:27:51.436 "trid": { 00:27:51.436 "trtype": "PCIe", 00:27:51.436 "traddr": "0000:65:00.0" 00:27:51.436 }, 00:27:51.436 
"ctrlr_data": { 00:27:51.436 "cntlid": 0, 00:27:51.436 "vendor_id": "0x8086", 00:27:51.436 "model_number": "INTEL SSDPE2KX020T8", 00:27:51.436 "serial_number": "PHLJ9512038S2P0BGN", 00:27:51.436 "firmware_revision": "VDV10184", 00:27:51.436 "oacs": { 00:27:51.436 "security": 0, 00:27:51.436 "format": 1, 00:27:51.436 "firmware": 1, 00:27:51.436 "ns_manage": 1 00:27:51.436 }, 00:27:51.436 "multi_ctrlr": false, 00:27:51.436 "ana_reporting": false 00:27:51.436 }, 00:27:51.436 "vs": { 00:27:51.436 "nvme_version": "1.2" 00:27:51.436 }, 00:27:51.436 "ns_data": { 00:27:51.436 "id": 1, 00:27:51.436 "can_share": false 00:27:51.436 } 00:27:51.436 } 00:27:51.436 ], 00:27:51.436 "mp_policy": "active_passive" 00:27:51.436 } 00:27:51.436 } 00:27:51.436 ] 00:27:51.436 17:40:02 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:27:51.436 17:40:02 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:27:52.817 d34821c5-ea0a-42d2-b4c6-77e541243ec1 00:27:52.817 17:40:03 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:27:52.817 a5fadf89-0ffc-44b1-a313-2e40fd3fcb2d 00:27:52.817 17:40:04 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:27:52.817 17:40:04 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:27:52.817 17:40:04 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:52.817 17:40:04 compress_isal -- common/autotest_common.sh@899 -- # local i 00:27:52.817 17:40:04 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:52.817 17:40:04 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:52.817 17:40:04 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:53.077 17:40:04 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:27:53.336 [ 00:27:53.336 { 00:27:53.336 "name": "a5fadf89-0ffc-44b1-a313-2e40fd3fcb2d", 00:27:53.336 "aliases": [ 00:27:53.336 "lvs0/lv0" 00:27:53.337 ], 00:27:53.337 "product_name": "Logical Volume", 00:27:53.337 "block_size": 512, 00:27:53.337 "num_blocks": 204800, 00:27:53.337 "uuid": "a5fadf89-0ffc-44b1-a313-2e40fd3fcb2d", 00:27:53.337 "assigned_rate_limits": { 00:27:53.337 "rw_ios_per_sec": 0, 00:27:53.337 "rw_mbytes_per_sec": 0, 00:27:53.337 "r_mbytes_per_sec": 0, 00:27:53.337 "w_mbytes_per_sec": 0 00:27:53.337 }, 00:27:53.337 "claimed": false, 00:27:53.337 "zoned": false, 00:27:53.337 "supported_io_types": { 00:27:53.337 "read": true, 00:27:53.337 "write": true, 00:27:53.337 "unmap": true, 00:27:53.337 "flush": false, 00:27:53.337 "reset": true, 00:27:53.337 "nvme_admin": false, 00:27:53.337 "nvme_io": false, 00:27:53.337 "nvme_io_md": false, 00:27:53.337 "write_zeroes": true, 00:27:53.337 "zcopy": false, 00:27:53.337 "get_zone_info": false, 00:27:53.337 "zone_management": false, 00:27:53.337 "zone_append": false, 00:27:53.337 "compare": false, 00:27:53.337 "compare_and_write": false, 00:27:53.337 "abort": false, 00:27:53.337 "seek_hole": true, 00:27:53.337 "seek_data": true, 00:27:53.337 "copy": false, 00:27:53.337 "nvme_iov_md": false 00:27:53.337 }, 00:27:53.337 "driver_specific": { 00:27:53.337 "lvol": { 00:27:53.337 "lvol_store_uuid": 
"d34821c5-ea0a-42d2-b4c6-77e541243ec1", 00:27:53.337 "base_bdev": "Nvme0n1", 00:27:53.337 "thin_provision": true, 00:27:53.337 "num_allocated_clusters": 0, 00:27:53.337 "snapshot": false, 00:27:53.337 "clone": false, 00:27:53.337 "esnap_clone": false 00:27:53.337 } 00:27:53.337 } 00:27:53.337 } 00:27:53.337 ] 00:27:53.337 17:40:04 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:27:53.337 17:40:04 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:27:53.337 17:40:04 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:27:53.597 [2024-07-15 17:40:04.713162] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:27:53.597 COMP_lvs0/lv0 00:27:53.597 17:40:04 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:27:53.597 17:40:04 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:27:53.597 17:40:04 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:53.597 17:40:04 compress_isal -- common/autotest_common.sh@899 -- # local i 00:27:53.597 17:40:04 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:53.597 17:40:04 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:53.597 17:40:04 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:53.859 17:40:04 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:27:53.859 [ 00:27:53.859 { 00:27:53.859 "name": "COMP_lvs0/lv0", 00:27:53.859 "aliases": [ 00:27:53.859 "1e63d3bf-a794-53fa-83af-a42c5c5c75d4" 00:27:53.859 ], 00:27:53.859 "product_name": "compress", 00:27:53.859 "block_size": 4096, 00:27:53.859 "num_blocks": 25088, 00:27:53.859 "uuid": "1e63d3bf-a794-53fa-83af-a42c5c5c75d4", 00:27:53.859 "assigned_rate_limits": { 00:27:53.859 "rw_ios_per_sec": 0, 00:27:53.859 "rw_mbytes_per_sec": 0, 00:27:53.859 "r_mbytes_per_sec": 0, 00:27:53.859 "w_mbytes_per_sec": 0 00:27:53.859 }, 00:27:53.859 "claimed": false, 00:27:53.859 "zoned": false, 00:27:53.859 "supported_io_types": { 00:27:53.859 "read": true, 00:27:53.859 "write": true, 00:27:53.859 "unmap": false, 00:27:53.859 "flush": false, 00:27:53.859 "reset": false, 00:27:53.859 "nvme_admin": false, 00:27:53.859 "nvme_io": false, 00:27:53.859 "nvme_io_md": false, 00:27:53.859 "write_zeroes": true, 00:27:53.859 "zcopy": false, 00:27:53.859 "get_zone_info": false, 00:27:53.859 "zone_management": false, 00:27:53.859 "zone_append": false, 00:27:53.859 "compare": false, 00:27:53.859 "compare_and_write": false, 00:27:53.859 "abort": false, 00:27:53.859 "seek_hole": false, 00:27:53.859 "seek_data": false, 00:27:53.859 "copy": false, 00:27:53.859 "nvme_iov_md": false 00:27:53.859 }, 00:27:53.859 "driver_specific": { 00:27:53.859 "compress": { 00:27:53.859 "name": "COMP_lvs0/lv0", 00:27:53.859 "base_bdev_name": "a5fadf89-0ffc-44b1-a313-2e40fd3fcb2d" 00:27:53.859 } 00:27:53.859 } 00:27:53.859 } 00:27:53.859 ] 00:27:53.859 17:40:05 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:27:53.859 17:40:05 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:27:54.120 Running I/O for 3 seconds... 
00:27:57.425 00:27:57.425 Latency(us) 00:27:57.425 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:57.425 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:27:57.425 Verification LBA range: start 0x0 length 0x3100 00:27:57.425 COMP_lvs0/lv0 : 3.02 1132.56 4.42 0.00 0.00 28152.81 181.96 28835.84 00:27:57.425 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:27:57.425 Verification LBA range: start 0x3100 length 0x3100 00:27:57.425 COMP_lvs0/lv0 : 3.02 1138.49 4.45 0.00 0.00 27945.89 313.50 30247.38 00:27:57.425 =================================================================================================================== 00:27:57.425 Total : 2271.05 8.87 0.00 0.00 28049.08 181.96 30247.38 00:27:57.425 0 00:27:57.425 17:40:08 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:27:57.425 17:40:08 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:27:57.425 17:40:08 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:27:57.686 17:40:08 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:27:57.686 17:40:08 compress_isal -- compress/compress.sh@78 -- # killprocess 2937610 00:27:57.686 17:40:08 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2937610 ']' 00:27:57.686 17:40:08 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2937610 00:27:57.686 17:40:08 compress_isal -- common/autotest_common.sh@953 -- # uname 00:27:57.686 17:40:08 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:57.686 17:40:08 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2937610 00:27:57.686 17:40:08 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:27:57.686 17:40:08 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:27:57.686 17:40:08 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2937610' 00:27:57.686 killing process with pid 2937610 00:27:57.686 17:40:08 compress_isal -- common/autotest_common.sh@967 -- # kill 2937610 00:27:57.686 Received shutdown signal, test time was about 3.000000 seconds 00:27:57.686 00:27:57.686 Latency(us) 00:27:57.686 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:57.686 =================================================================================================================== 00:27:57.686 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:57.686 17:40:08 compress_isal -- common/autotest_common.sh@972 -- # wait 2937610 00:28:00.235 17:40:11 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:28:00.235 17:40:11 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:28:00.235 17:40:11 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=2939659 00:28:00.235 17:40:11 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:00.235 17:40:11 compress_isal -- compress/compress.sh@57 -- # waitforlisten 2939659 00:28:00.235 17:40:11 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:28:00.235 17:40:11 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2939659 ']' 00:28:00.235 17:40:11 compress_isal -- 
common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:00.235 17:40:11 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:00.235 17:40:11 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:00.235 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:00.235 17:40:11 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:00.235 17:40:11 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:28:00.235 [2024-07-15 17:40:11.275082] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:28:00.235 [2024-07-15 17:40:11.275149] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2939659 ] 00:28:00.235 [2024-07-15 17:40:11.365650] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:28:00.235 [2024-07-15 17:40:11.462650] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:00.235 [2024-07-15 17:40:11.462809] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:28:00.235 [2024-07-15 17:40:11.462839] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:01.178 17:40:12 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:01.178 17:40:12 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:28:01.178 17:40:12 compress_isal -- compress/compress.sh@58 -- # create_vols 00:28:01.178 17:40:12 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:28:01.178 17:40:12 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:28:04.476 17:40:15 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:28:04.476 17:40:15 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:28:04.476 17:40:15 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:04.476 17:40:15 compress_isal -- common/autotest_common.sh@899 -- # local i 00:28:04.476 17:40:15 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:04.476 17:40:15 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:04.476 17:40:15 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:04.476 17:40:15 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:28:04.476 [ 00:28:04.476 { 00:28:04.476 "name": "Nvme0n1", 00:28:04.476 "aliases": [ 00:28:04.476 "2ed88d3a-19f2-433a-879c-7054050c4449" 00:28:04.476 ], 00:28:04.476 "product_name": "NVMe disk", 00:28:04.476 "block_size": 512, 00:28:04.476 "num_blocks": 3907029168, 00:28:04.476 "uuid": "2ed88d3a-19f2-433a-879c-7054050c4449", 00:28:04.476 "assigned_rate_limits": { 00:28:04.476 "rw_ios_per_sec": 0, 00:28:04.476 "rw_mbytes_per_sec": 0, 00:28:04.476 "r_mbytes_per_sec": 0, 00:28:04.476 "w_mbytes_per_sec": 0 00:28:04.476 }, 00:28:04.476 "claimed": false, 00:28:04.476 "zoned": false, 00:28:04.476 "supported_io_types": { 00:28:04.476 "read": true, 00:28:04.476 "write": true, 00:28:04.476 "unmap": 
true, 00:28:04.476 "flush": true, 00:28:04.476 "reset": true, 00:28:04.476 "nvme_admin": true, 00:28:04.476 "nvme_io": true, 00:28:04.476 "nvme_io_md": false, 00:28:04.476 "write_zeroes": true, 00:28:04.476 "zcopy": false, 00:28:04.476 "get_zone_info": false, 00:28:04.476 "zone_management": false, 00:28:04.476 "zone_append": false, 00:28:04.476 "compare": false, 00:28:04.476 "compare_and_write": false, 00:28:04.476 "abort": true, 00:28:04.476 "seek_hole": false, 00:28:04.476 "seek_data": false, 00:28:04.476 "copy": false, 00:28:04.476 "nvme_iov_md": false 00:28:04.476 }, 00:28:04.476 "driver_specific": { 00:28:04.476 "nvme": [ 00:28:04.476 { 00:28:04.476 "pci_address": "0000:65:00.0", 00:28:04.476 "trid": { 00:28:04.476 "trtype": "PCIe", 00:28:04.476 "traddr": "0000:65:00.0" 00:28:04.476 }, 00:28:04.476 "ctrlr_data": { 00:28:04.476 "cntlid": 0, 00:28:04.476 "vendor_id": "0x8086", 00:28:04.476 "model_number": "INTEL SSDPE2KX020T8", 00:28:04.476 "serial_number": "PHLJ9512038S2P0BGN", 00:28:04.476 "firmware_revision": "VDV10184", 00:28:04.476 "oacs": { 00:28:04.476 "security": 0, 00:28:04.476 "format": 1, 00:28:04.476 "firmware": 1, 00:28:04.476 "ns_manage": 1 00:28:04.476 }, 00:28:04.476 "multi_ctrlr": false, 00:28:04.476 "ana_reporting": false 00:28:04.476 }, 00:28:04.476 "vs": { 00:28:04.476 "nvme_version": "1.2" 00:28:04.476 }, 00:28:04.476 "ns_data": { 00:28:04.476 "id": 1, 00:28:04.476 "can_share": false 00:28:04.476 } 00:28:04.476 } 00:28:04.476 ], 00:28:04.476 "mp_policy": "active_passive" 00:28:04.476 } 00:28:04.476 } 00:28:04.476 ] 00:28:04.476 17:40:15 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:28:04.476 17:40:15 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:28:05.860 80da4375-8409-4485-bb59-5be8d4c3665a 00:28:05.860 17:40:16 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:28:05.860 8673cea0-d870-4071-a399-9d459818b630 00:28:05.860 17:40:17 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:28:05.860 17:40:17 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:28:05.860 17:40:17 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:05.860 17:40:17 compress_isal -- common/autotest_common.sh@899 -- # local i 00:28:05.860 17:40:17 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:05.860 17:40:17 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:05.860 17:40:17 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:06.120 17:40:17 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:28:06.380 [ 00:28:06.380 { 00:28:06.380 "name": "8673cea0-d870-4071-a399-9d459818b630", 00:28:06.380 "aliases": [ 00:28:06.380 "lvs0/lv0" 00:28:06.380 ], 00:28:06.380 "product_name": "Logical Volume", 00:28:06.380 "block_size": 512, 00:28:06.380 "num_blocks": 204800, 00:28:06.380 "uuid": "8673cea0-d870-4071-a399-9d459818b630", 00:28:06.380 "assigned_rate_limits": { 00:28:06.380 "rw_ios_per_sec": 0, 00:28:06.380 "rw_mbytes_per_sec": 0, 00:28:06.380 "r_mbytes_per_sec": 0, 00:28:06.380 "w_mbytes_per_sec": 0 00:28:06.380 }, 00:28:06.380 "claimed": false, 00:28:06.380 
"zoned": false, 00:28:06.380 "supported_io_types": { 00:28:06.380 "read": true, 00:28:06.380 "write": true, 00:28:06.380 "unmap": true, 00:28:06.380 "flush": false, 00:28:06.380 "reset": true, 00:28:06.380 "nvme_admin": false, 00:28:06.380 "nvme_io": false, 00:28:06.380 "nvme_io_md": false, 00:28:06.380 "write_zeroes": true, 00:28:06.380 "zcopy": false, 00:28:06.380 "get_zone_info": false, 00:28:06.380 "zone_management": false, 00:28:06.380 "zone_append": false, 00:28:06.380 "compare": false, 00:28:06.380 "compare_and_write": false, 00:28:06.380 "abort": false, 00:28:06.380 "seek_hole": true, 00:28:06.380 "seek_data": true, 00:28:06.380 "copy": false, 00:28:06.380 "nvme_iov_md": false 00:28:06.380 }, 00:28:06.380 "driver_specific": { 00:28:06.380 "lvol": { 00:28:06.380 "lvol_store_uuid": "80da4375-8409-4485-bb59-5be8d4c3665a", 00:28:06.380 "base_bdev": "Nvme0n1", 00:28:06.380 "thin_provision": true, 00:28:06.380 "num_allocated_clusters": 0, 00:28:06.380 "snapshot": false, 00:28:06.380 "clone": false, 00:28:06.380 "esnap_clone": false 00:28:06.380 } 00:28:06.380 } 00:28:06.380 } 00:28:06.380 ] 00:28:06.380 17:40:17 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:28:06.380 17:40:17 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:28:06.380 17:40:17 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:28:06.380 [2024-07-15 17:40:17.590702] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:28:06.380 COMP_lvs0/lv0 00:28:06.380 17:40:17 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:28:06.380 17:40:17 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:28:06.380 17:40:17 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:06.380 17:40:17 compress_isal -- common/autotest_common.sh@899 -- # local i 00:28:06.380 17:40:17 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:06.380 17:40:17 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:06.380 17:40:17 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:06.640 17:40:17 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:28:06.899 [ 00:28:06.899 { 00:28:06.899 "name": "COMP_lvs0/lv0", 00:28:06.899 "aliases": [ 00:28:06.899 "94590a30-ac3e-5006-9601-5d82fcbaaca7" 00:28:06.899 ], 00:28:06.899 "product_name": "compress", 00:28:06.899 "block_size": 512, 00:28:06.899 "num_blocks": 200704, 00:28:06.899 "uuid": "94590a30-ac3e-5006-9601-5d82fcbaaca7", 00:28:06.899 "assigned_rate_limits": { 00:28:06.899 "rw_ios_per_sec": 0, 00:28:06.899 "rw_mbytes_per_sec": 0, 00:28:06.899 "r_mbytes_per_sec": 0, 00:28:06.899 "w_mbytes_per_sec": 0 00:28:06.899 }, 00:28:06.899 "claimed": false, 00:28:06.899 "zoned": false, 00:28:06.899 "supported_io_types": { 00:28:06.899 "read": true, 00:28:06.899 "write": true, 00:28:06.899 "unmap": false, 00:28:06.899 "flush": false, 00:28:06.899 "reset": false, 00:28:06.899 "nvme_admin": false, 00:28:06.899 "nvme_io": false, 00:28:06.899 "nvme_io_md": false, 00:28:06.899 "write_zeroes": true, 00:28:06.899 "zcopy": false, 00:28:06.899 "get_zone_info": false, 00:28:06.899 "zone_management": false, 00:28:06.899 "zone_append": 
false, 00:28:06.899 "compare": false, 00:28:06.899 "compare_and_write": false, 00:28:06.899 "abort": false, 00:28:06.899 "seek_hole": false, 00:28:06.899 "seek_data": false, 00:28:06.899 "copy": false, 00:28:06.899 "nvme_iov_md": false 00:28:06.899 }, 00:28:06.899 "driver_specific": { 00:28:06.899 "compress": { 00:28:06.899 "name": "COMP_lvs0/lv0", 00:28:06.899 "base_bdev_name": "8673cea0-d870-4071-a399-9d459818b630" 00:28:06.899 } 00:28:06.899 } 00:28:06.899 } 00:28:06.899 ] 00:28:06.899 17:40:18 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:28:06.900 17:40:18 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:28:06.900 I/O targets: 00:28:06.900 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:28:06.900 00:28:06.900 00:28:06.900 CUnit - A unit testing framework for C - Version 2.1-3 00:28:06.900 http://cunit.sourceforge.net/ 00:28:06.900 00:28:06.900 00:28:06.900 Suite: bdevio tests on: COMP_lvs0/lv0 00:28:06.900 Test: blockdev write read block ...passed 00:28:06.900 Test: blockdev write zeroes read block ...passed 00:28:06.900 Test: blockdev write zeroes read no split ...passed 00:28:07.159 Test: blockdev write zeroes read split ...passed 00:28:07.159 Test: blockdev write zeroes read split partial ...passed 00:28:07.160 Test: blockdev reset ...[2024-07-15 17:40:18.270139] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:28:07.160 passed 00:28:07.160 Test: blockdev write read 8 blocks ...passed 00:28:07.160 Test: blockdev write read size > 128k ...passed 00:28:07.160 Test: blockdev write read invalid size ...passed 00:28:07.160 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:28:07.160 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:28:07.160 Test: blockdev write read max offset ...passed 00:28:07.160 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:28:07.160 Test: blockdev writev readv 8 blocks ...passed 00:28:07.160 Test: blockdev writev readv 30 x 1block ...passed 00:28:07.160 Test: blockdev writev readv block ...passed 00:28:07.160 Test: blockdev writev readv size > 128k ...passed 00:28:07.160 Test: blockdev writev readv size > 128k in two iovs ...passed 00:28:07.160 Test: blockdev comparev and writev ...passed 00:28:07.160 Test: blockdev nvme passthru rw ...passed 00:28:07.160 Test: blockdev nvme passthru vendor specific ...passed 00:28:07.160 Test: blockdev nvme admin passthru ...passed 00:28:07.160 Test: blockdev copy ...passed 00:28:07.160 00:28:07.160 Run Summary: Type Total Ran Passed Failed Inactive 00:28:07.160 suites 1 1 n/a 0 0 00:28:07.160 tests 23 23 23 0 0 00:28:07.160 asserts 130 130 130 0 n/a 00:28:07.160 00:28:07.160 Elapsed time = 0.405 seconds 00:28:07.160 0 00:28:07.160 17:40:18 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:28:07.160 17:40:18 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:28:07.463 17:40:18 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:28:07.463 17:40:18 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:28:07.463 17:40:18 compress_isal -- compress/compress.sh@62 -- # killprocess 2939659 00:28:07.463 17:40:18 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2939659 ']' 00:28:07.463 
17:40:18 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2939659 00:28:07.463 17:40:18 compress_isal -- common/autotest_common.sh@953 -- # uname 00:28:07.463 17:40:18 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:07.463 17:40:18 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2939659 00:28:07.723 17:40:18 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:07.723 17:40:18 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:07.723 17:40:18 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2939659' 00:28:07.723 killing process with pid 2939659 00:28:07.723 17:40:18 compress_isal -- common/autotest_common.sh@967 -- # kill 2939659 00:28:07.723 17:40:18 compress_isal -- common/autotest_common.sh@972 -- # wait 2939659 00:28:10.264 17:40:21 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:28:10.264 17:40:21 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:28:10.264 00:28:10.264 real 0m50.815s 00:28:10.264 user 1m56.549s 00:28:10.264 sys 0m3.632s 00:28:10.264 17:40:21 compress_isal -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:10.264 17:40:21 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:28:10.264 ************************************ 00:28:10.264 END TEST compress_isal 00:28:10.264 ************************************ 00:28:10.264 17:40:21 -- common/autotest_common.sh@1142 -- # return 0 00:28:10.264 17:40:21 -- spdk/autotest.sh@352 -- # '[' 0 -eq 1 ']' 00:28:10.264 17:40:21 -- spdk/autotest.sh@356 -- # '[' 1 -eq 1 ']' 00:28:10.264 17:40:21 -- spdk/autotest.sh@357 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:28:10.264 17:40:21 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:28:10.264 17:40:21 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:10.264 17:40:21 -- common/autotest_common.sh@10 -- # set +x 00:28:10.264 ************************************ 00:28:10.264 START TEST blockdev_crypto_aesni 00:28:10.264 ************************************ 00:28:10.264 17:40:21 blockdev_crypto_aesni -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:28:10.264 * Looking for test storage... 
00:28:10.264 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # uname -s 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # test_type=crypto_aesni 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # crypto_device= 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@684 -- # dek= 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # env_ctx= 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == bdev ]] 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == crypto_* ]] 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2941494 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 2941494 00:28:10.264 17:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:28:10.264 17:40:21 blockdev_crypto_aesni -- common/autotest_common.sh@829 -- # '[' -z 2941494 ']' 00:28:10.264 17:40:21 blockdev_crypto_aesni -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:10.264 17:40:21 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:10.264 17:40:21 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX 
domain socket /var/tmp/spdk.sock...' 00:28:10.264 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:10.264 17:40:21 blockdev_crypto_aesni -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:10.264 17:40:21 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:10.264 [2024-07-15 17:40:21.386696] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:28:10.264 [2024-07-15 17:40:21.386770] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2941494 ] 00:28:10.264 [2024-07-15 17:40:21.479119] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:10.264 [2024-07-15 17:40:21.549437] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:11.205 17:40:22 blockdev_crypto_aesni -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:11.205 17:40:22 blockdev_crypto_aesni -- common/autotest_common.sh@862 -- # return 0 00:28:11.205 17:40:22 blockdev_crypto_aesni -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:28:11.205 17:40:22 blockdev_crypto_aesni -- bdev/blockdev.sh@705 -- # setup_crypto_aesni_conf 00:28:11.205 17:40:22 blockdev_crypto_aesni -- bdev/blockdev.sh@146 -- # rpc_cmd 00:28:11.205 17:40:22 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:11.205 17:40:22 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:11.205 [2024-07-15 17:40:22.235397] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:28:11.205 [2024-07-15 17:40:22.243427] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:11.205 [2024-07-15 17:40:22.251444] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:11.205 [2024-07-15 17:40:22.300504] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:28:13.747 true 00:28:13.747 true 00:28:13.747 true 00:28:13.747 true 00:28:13.747 Malloc0 00:28:13.747 Malloc1 00:28:13.747 Malloc2 00:28:13.747 Malloc3 00:28:13.747 [2024-07-15 17:40:24.574752] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:28:13.747 crypto_ram 00:28:13.747 [2024-07-15 17:40:24.582768] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:28:13.747 crypto_ram2 00:28:13.747 [2024-07-15 17:40:24.590786] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:28:13.747 crypto_ram3 00:28:13.747 [2024-07-15 17:40:24.598805] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:28:13.747 crypto_ram4 00:28:13.747 17:40:24 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:13.747 17:40:24 blockdev_crypto_aesni -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:28:13.747 17:40:24 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:13.747 17:40:24 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:13.747 17:40:24 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:13.747 17:40:24 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # cat 00:28:13.747 17:40:24 blockdev_crypto_aesni 
-- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:28:13.747 17:40:24 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:13.747 17:40:24 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:13.747 17:40:24 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:13.747 17:40:24 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:28:13.747 17:40:24 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:13.747 17:40:24 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:13.747 17:40:24 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:13.747 17:40:24 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:28:13.747 17:40:24 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:13.747 17:40:24 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:13.747 17:40:24 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:13.747 17:40:24 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:28:13.747 17:40:24 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:28:13.747 17:40:24 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:28:13.747 17:40:24 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:13.747 17:40:24 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:13.747 17:40:24 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:13.747 17:40:24 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:28:13.747 17:40:24 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # jq -r .name 00:28:13.748 17:40:24 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "4ccd353f-901d-55a8-9757-eae9412c9b0f"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "4ccd353f-901d-55a8-9757-eae9412c9b0f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "e75eadcb-bbb4-562a-be3a-df290740e004"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "e75eadcb-bbb4-562a-be3a-df290740e004",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' 
"r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "f559dfe7-89b0-5d7d-91f9-459fd16c3cb2"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "f559dfe7-89b0-5d7d-91f9-459fd16c3cb2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "13c076e8-efc3-503d-81ff-470dde760cc9"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "13c076e8-efc3-503d-81ff-470dde760cc9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:28:13.748 17:40:24 
blockdev_crypto_aesni -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:28:13.748 17:40:24 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:28:13.748 17:40:24 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:28:13.748 17:40:24 blockdev_crypto_aesni -- bdev/blockdev.sh@754 -- # killprocess 2941494 00:28:13.748 17:40:24 blockdev_crypto_aesni -- common/autotest_common.sh@948 -- # '[' -z 2941494 ']' 00:28:13.748 17:40:24 blockdev_crypto_aesni -- common/autotest_common.sh@952 -- # kill -0 2941494 00:28:13.748 17:40:24 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # uname 00:28:13.748 17:40:24 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:13.748 17:40:24 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2941494 00:28:13.748 17:40:24 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:13.748 17:40:24 blockdev_crypto_aesni -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:13.748 17:40:24 blockdev_crypto_aesni -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2941494' 00:28:13.748 killing process with pid 2941494 00:28:13.748 17:40:24 blockdev_crypto_aesni -- common/autotest_common.sh@967 -- # kill 2941494 00:28:13.748 17:40:24 blockdev_crypto_aesni -- common/autotest_common.sh@972 -- # wait 2941494 00:28:14.009 17:40:25 blockdev_crypto_aesni -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:28:14.009 17:40:25 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:28:14.009 17:40:25 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:28:14.009 17:40:25 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:14.009 17:40:25 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:14.009 ************************************ 00:28:14.009 START TEST bdev_hello_world 00:28:14.009 ************************************ 00:28:14.009 17:40:25 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:28:14.009 [2024-07-15 17:40:25.261123] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:28:14.009 [2024-07-15 17:40:25.261172] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2942169 ] 00:28:14.269 [2024-07-15 17:40:25.347349] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:14.269 [2024-07-15 17:40:25.410948] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:14.269 [2024-07-15 17:40:25.431944] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:28:14.269 [2024-07-15 17:40:25.439968] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:14.269 [2024-07-15 17:40:25.447985] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:14.269 [2024-07-15 17:40:25.535517] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:28:16.810 [2024-07-15 17:40:27.699642] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:28:16.810 [2024-07-15 17:40:27.699698] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:16.810 [2024-07-15 17:40:27.699706] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:16.810 [2024-07-15 17:40:27.707659] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:28:16.810 [2024-07-15 17:40:27.707670] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:16.810 [2024-07-15 17:40:27.707676] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:16.810 [2024-07-15 17:40:27.715679] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:28:16.810 [2024-07-15 17:40:27.715689] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:28:16.810 [2024-07-15 17:40:27.715695] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:16.810 [2024-07-15 17:40:27.723701] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:28:16.810 [2024-07-15 17:40:27.723714] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:28:16.810 [2024-07-15 17:40:27.723720] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:16.810 [2024-07-15 17:40:27.785074] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:28:16.810 [2024-07-15 17:40:27.785105] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:28:16.810 [2024-07-15 17:40:27.785115] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:28:16.810 [2024-07-15 17:40:27.786151] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:28:16.810 [2024-07-15 17:40:27.786205] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:28:16.810 [2024-07-15 17:40:27.786214] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:28:16.810 [2024-07-15 17:40:27.786246] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
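The hello_world pass above opens crypto_ram through the generated JSON config, writes a buffer, reads it back through the crypto_aesni_mb path, and succeeds once "Hello World!" comes back. The invocation, as blockdev.sh runs it in this log:

  # bdev_hello_world against the top crypto vbdev (bdev.json was generated earlier by the test).
  SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
  $SPDK_DIR/build/examples/hello_bdev --json $SPDK_DIR/test/bdev/bdev.json -b crypto_ram ''
  # Expected NOTICE sequence: open bdev, open io channel, write, read, "Hello World!", stop app.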
00:28:16.810 00:28:16.810 [2024-07-15 17:40:27.786256] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:28:16.810 00:28:16.810 real 0m2.808s 00:28:16.810 user 0m2.547s 00:28:16.810 sys 0m0.228s 00:28:16.810 17:40:28 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:16.810 17:40:28 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:28:16.810 ************************************ 00:28:16.810 END TEST bdev_hello_world 00:28:16.810 ************************************ 00:28:16.810 17:40:28 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:28:16.810 17:40:28 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:28:16.810 17:40:28 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:28:16.810 17:40:28 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:16.810 17:40:28 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:16.810 ************************************ 00:28:16.810 START TEST bdev_bounds 00:28:16.810 ************************************ 00:28:16.810 17:40:28 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:28:16.810 17:40:28 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2942521 00:28:16.810 17:40:28 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:28:16.810 17:40:28 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2942521' 00:28:16.810 Process bdevio pid: 2942521 00:28:16.810 17:40:28 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:28:16.810 17:40:28 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 2942521 00:28:16.810 17:40:28 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2942521 ']' 00:28:16.810 17:40:28 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:16.810 17:40:28 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:16.810 17:40:28 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:16.810 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:16.810 17:40:28 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:16.810 17:40:28 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:28:17.071 [2024-07-15 17:40:28.146298] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
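bdev_bounds reuses the bdevio binary from the compress test earlier, this time against the JSON config carrying all four crypto vbdevs; once the app is listening on /var/tmp/spdk.sock, tests.py drives the suites over RPC. A rough by-hand sketch built from the two commands in this log (the backgrounding and kill lines are assumed glue; the script itself uses waitforlisten and killprocess):

  # bdev_bounds: start bdevio waiting for RPC, then trigger the test suites.
  SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk

  $SPDK_DIR/test/bdev/bdevio/bdevio -w -s 0 --json $SPDK_DIR/test/bdev/bdev.json '' &   # -w: wait for the perform_tests RPC, as in the compress bdevio run above
  BDEVIO_PID=$!
  # wait for /var/tmp/spdk.sock to come up, then:
  $SPDK_DIR/test/bdev/bdevio/tests.py perform_tests
  kill $BDEVIO_PID   # assumed manual cleanup; not how the script tears it down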
00:28:17.071 [2024-07-15 17:40:28.146348] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2942521 ] 00:28:17.071 [2024-07-15 17:40:28.236534] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:28:17.071 [2024-07-15 17:40:28.306719] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:17.071 [2024-07-15 17:40:28.306928] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:28:17.071 [2024-07-15 17:40:28.307022] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:17.071 [2024-07-15 17:40:28.328070] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:28:17.071 [2024-07-15 17:40:28.336096] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:17.071 [2024-07-15 17:40:28.344117] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:17.331 [2024-07-15 17:40:28.430450] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:28:19.877 [2024-07-15 17:40:30.595253] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:28:19.877 [2024-07-15 17:40:30.595311] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:19.877 [2024-07-15 17:40:30.595320] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:19.877 [2024-07-15 17:40:30.603272] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:28:19.877 [2024-07-15 17:40:30.603283] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:19.877 [2024-07-15 17:40:30.603288] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:19.877 [2024-07-15 17:40:30.611293] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:28:19.877 [2024-07-15 17:40:30.611303] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:28:19.877 [2024-07-15 17:40:30.611308] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:19.877 [2024-07-15 17:40:30.619314] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:28:19.877 [2024-07-15 17:40:30.619324] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:28:19.877 [2024-07-15 17:40:30.619330] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:19.877 17:40:30 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:19.877 17:40:30 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:28:19.877 17:40:30 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:28:19.877 I/O targets: 00:28:19.877 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:28:19.877 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:28:19.877 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:28:19.877 crypto_ram4: 8192 blocks of 4096 bytes 
(32 MiB) 00:28:19.877 00:28:19.877 00:28:19.877 CUnit - A unit testing framework for C - Version 2.1-3 00:28:19.877 http://cunit.sourceforge.net/ 00:28:19.877 00:28:19.877 00:28:19.877 Suite: bdevio tests on: crypto_ram4 00:28:19.877 Test: blockdev write read block ...passed 00:28:19.877 Test: blockdev write zeroes read block ...passed 00:28:19.877 Test: blockdev write zeroes read no split ...passed 00:28:19.877 Test: blockdev write zeroes read split ...passed 00:28:19.877 Test: blockdev write zeroes read split partial ...passed 00:28:19.877 Test: blockdev reset ...passed 00:28:19.877 Test: blockdev write read 8 blocks ...passed 00:28:19.877 Test: blockdev write read size > 128k ...passed 00:28:19.877 Test: blockdev write read invalid size ...passed 00:28:19.877 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:28:19.877 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:28:19.877 Test: blockdev write read max offset ...passed 00:28:19.877 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:28:19.877 Test: blockdev writev readv 8 blocks ...passed 00:28:19.877 Test: blockdev writev readv 30 x 1block ...passed 00:28:19.877 Test: blockdev writev readv block ...passed 00:28:19.877 Test: blockdev writev readv size > 128k ...passed 00:28:19.877 Test: blockdev writev readv size > 128k in two iovs ...passed 00:28:19.877 Test: blockdev comparev and writev ...passed 00:28:19.877 Test: blockdev nvme passthru rw ...passed 00:28:19.877 Test: blockdev nvme passthru vendor specific ...passed 00:28:19.877 Test: blockdev nvme admin passthru ...passed 00:28:19.877 Test: blockdev copy ...passed 00:28:19.877 Suite: bdevio tests on: crypto_ram3 00:28:19.877 Test: blockdev write read block ...passed 00:28:19.877 Test: blockdev write zeroes read block ...passed 00:28:19.877 Test: blockdev write zeroes read no split ...passed 00:28:19.877 Test: blockdev write zeroes read split ...passed 00:28:19.877 Test: blockdev write zeroes read split partial ...passed 00:28:19.877 Test: blockdev reset ...passed 00:28:19.877 Test: blockdev write read 8 blocks ...passed 00:28:19.877 Test: blockdev write read size > 128k ...passed 00:28:19.877 Test: blockdev write read invalid size ...passed 00:28:19.877 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:28:19.877 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:28:19.877 Test: blockdev write read max offset ...passed 00:28:19.877 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:28:19.877 Test: blockdev writev readv 8 blocks ...passed 00:28:19.877 Test: blockdev writev readv 30 x 1block ...passed 00:28:19.877 Test: blockdev writev readv block ...passed 00:28:19.877 Test: blockdev writev readv size > 128k ...passed 00:28:19.877 Test: blockdev writev readv size > 128k in two iovs ...passed 00:28:19.877 Test: blockdev comparev and writev ...passed 00:28:19.877 Test: blockdev nvme passthru rw ...passed 00:28:19.877 Test: blockdev nvme passthru vendor specific ...passed 00:28:19.877 Test: blockdev nvme admin passthru ...passed 00:28:19.877 Test: blockdev copy ...passed 00:28:19.877 Suite: bdevio tests on: crypto_ram2 00:28:19.877 Test: blockdev write read block ...passed 00:28:19.877 Test: blockdev write zeroes read block ...passed 00:28:19.877 Test: blockdev write zeroes read no split ...passed 00:28:19.877 Test: blockdev write zeroes read split ...passed 00:28:20.136 Test: blockdev write zeroes read split partial ...passed 
00:28:20.136 Test: blockdev reset ...passed 00:28:20.136 Test: blockdev write read 8 blocks ...passed 00:28:20.136 Test: blockdev write read size > 128k ...passed 00:28:20.136 Test: blockdev write read invalid size ...passed 00:28:20.136 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:28:20.136 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:28:20.136 Test: blockdev write read max offset ...passed 00:28:20.136 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:28:20.136 Test: blockdev writev readv 8 blocks ...passed 00:28:20.136 Test: blockdev writev readv 30 x 1block ...passed 00:28:20.136 Test: blockdev writev readv block ...passed 00:28:20.136 Test: blockdev writev readv size > 128k ...passed 00:28:20.136 Test: blockdev writev readv size > 128k in two iovs ...passed 00:28:20.136 Test: blockdev comparev and writev ...passed 00:28:20.136 Test: blockdev nvme passthru rw ...passed 00:28:20.136 Test: blockdev nvme passthru vendor specific ...passed 00:28:20.136 Test: blockdev nvme admin passthru ...passed 00:28:20.136 Test: blockdev copy ...passed 00:28:20.136 Suite: bdevio tests on: crypto_ram 00:28:20.136 Test: blockdev write read block ...passed 00:28:20.136 Test: blockdev write zeroes read block ...passed 00:28:20.136 Test: blockdev write zeroes read no split ...passed 00:28:20.396 Test: blockdev write zeroes read split ...passed 00:28:20.657 Test: blockdev write zeroes read split partial ...passed 00:28:20.657 Test: blockdev reset ...passed 00:28:20.657 Test: blockdev write read 8 blocks ...passed 00:28:20.657 Test: blockdev write read size > 128k ...passed 00:28:20.657 Test: blockdev write read invalid size ...passed 00:28:20.657 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:28:20.657 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:28:20.657 Test: blockdev write read max offset ...passed 00:28:20.657 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:28:20.657 Test: blockdev writev readv 8 blocks ...passed 00:28:20.657 Test: blockdev writev readv 30 x 1block ...passed 00:28:20.657 Test: blockdev writev readv block ...passed 00:28:20.657 Test: blockdev writev readv size > 128k ...passed 00:28:20.657 Test: blockdev writev readv size > 128k in two iovs ...passed 00:28:20.657 Test: blockdev comparev and writev ...passed 00:28:20.657 Test: blockdev nvme passthru rw ...passed 00:28:20.657 Test: blockdev nvme passthru vendor specific ...passed 00:28:20.657 Test: blockdev nvme admin passthru ...passed 00:28:20.657 Test: blockdev copy ...passed 00:28:20.657 00:28:20.657 Run Summary: Type Total Ran Passed Failed Inactive 00:28:20.657 suites 4 4 n/a 0 0 00:28:20.657 tests 92 92 92 0 0 00:28:20.657 asserts 520 520 520 0 n/a 00:28:20.657 00:28:20.657 Elapsed time = 1.871 seconds 00:28:20.657 0 00:28:20.657 17:40:31 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2942521 00:28:20.657 17:40:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2942521 ']' 00:28:20.657 17:40:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2942521 00:28:20.657 17:40:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:28:20.657 17:40:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:20.657 17:40:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o 
comm= 2942521 00:28:20.657 17:40:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:20.657 17:40:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:20.657 17:40:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2942521' 00:28:20.657 killing process with pid 2942521 00:28:20.657 17:40:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2942521 00:28:20.657 17:40:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@972 -- # wait 2942521 00:28:20.918 17:40:32 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:28:20.918 00:28:20.918 real 0m3.968s 00:28:20.918 user 0m10.751s 00:28:20.918 sys 0m0.394s 00:28:20.918 17:40:32 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:20.918 17:40:32 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:28:20.918 ************************************ 00:28:20.918 END TEST bdev_bounds 00:28:20.918 ************************************ 00:28:20.918 17:40:32 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:28:20.918 17:40:32 blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:28:20.918 17:40:32 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:28:20.918 17:40:32 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:20.918 17:40:32 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:20.918 ************************************ 00:28:20.918 START TEST bdev_nbd 00:28:20.918 ************************************ 00:28:20.918 17:40:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:28:20.918 17:40:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:28:20.918 17:40:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:28:20.918 17:40:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:20.918 17:40:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:28:20.918 17:40:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:28:20.918 17:40:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:28:20.918 17:40:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:28:20.918 17:40:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:28:20.918 17:40:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:28:20.919 17:40:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:28:20.919 17:40:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@312 
-- # bdev_num=4 00:28:20.919 17:40:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:28:20.919 17:40:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:28:20.919 17:40:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:28:20.919 17:40:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:28:20.919 17:40:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2943182 00:28:20.919 17:40:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:28:20.919 17:40:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 2943182 /var/tmp/spdk-nbd.sock 00:28:20.919 17:40:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:28:20.919 17:40:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2943182 ']' 00:28:20.919 17:40:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:28:20.919 17:40:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:20.919 17:40:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:28:20.919 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:28:20.919 17:40:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:20.919 17:40:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:28:20.919 [2024-07-15 17:40:32.195817] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:28:20.919 [2024-07-15 17:40:32.195863] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:21.179 [2024-07-15 17:40:32.284803] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:21.179 [2024-07-15 17:40:32.352048] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:21.179 [2024-07-15 17:40:32.373047] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:28:21.179 [2024-07-15 17:40:32.381068] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:21.179 [2024-07-15 17:40:32.389086] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:21.438 [2024-07-15 17:40:32.487959] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:28:23.979 [2024-07-15 17:40:34.664501] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:28:23.979 [2024-07-15 17:40:34.664553] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:23.979 [2024-07-15 17:40:34.664561] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:23.979 [2024-07-15 17:40:34.672520] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:28:23.979 [2024-07-15 17:40:34.672532] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:23.979 [2024-07-15 17:40:34.672538] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:23.979 [2024-07-15 17:40:34.680539] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:28:23.979 [2024-07-15 17:40:34.680549] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:28:23.979 [2024-07-15 17:40:34.680555] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:23.979 [2024-07-15 17:40:34.688558] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:28:23.979 [2024-07-15 17:40:34.688568] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:28:23.979 [2024-07-15 17:40:34.688573] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:23.979 17:40:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:23.979 1+0 records in 00:28:23.979 1+0 records out 00:28:23.979 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000264846 s, 15.5 MB/s 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:28:23.979 
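Annotation: the waitfornbd helper being traced here gates on two conditions before trusting a freshly attached device: the nbd name must show up in /proc/partitions, and a single 4 KiB direct-I/O read must complete and produce a non-empty file. A condensed sketch of that check (retry bound and dd flags taken from the trace; sleep interval and error handling are simplifications):

# Hedged sketch of the readiness check seen in the trace.
wait_for_nbd() {
    local name=$1 i
    for ((i = 1; i <= 20; i++)); do
        grep -q -w "$name" /proc/partitions && break
        sleep 0.1
    done
    # Prove the node actually serves I/O with a single direct 4 KiB read.
    dd if="/dev/$name" of=/tmp/nbdtest bs=4096 count=1 iflag=direct
    [ "$(stat -c %s /tmp/nbdtest)" -ne 0 ]
}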
17:40:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:23.979 1+0 records in 00:28:23.979 1+0 records out 00:28:23.979 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274745 s, 14.9 MB/s 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:28:23.979 17:40:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:28:24.251 17:40:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:28:24.251 17:40:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:28:24.251 17:40:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:28:24.251 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:28:24.251 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:28:24.251 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:24.251 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:24.251 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:28:24.251 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:28:24.251 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 
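Annotation: the per-probe throughput dd prints is just the 4 KiB transfer divided by the elapsed wall-clock time, in decimal units; for the nbd1 probe above, 4096 B / 0.000274745 s ≈ 14.9 MB/s, matching dd's "MB/s" figure (not MiB/s). The number only characterizes a single cached read, so it varies probe to probe.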
00:28:24.251 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:24.251 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:24.251 1+0 records in 00:28:24.251 1+0 records out 00:28:24.251 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000253731 s, 16.1 MB/s 00:28:24.251 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:24.251 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:28:24.251 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:24.251 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:24.251 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:28:24.251 17:40:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:28:24.251 17:40:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:28:24.251 17:40:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:28:24.516 17:40:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:28:24.516 17:40:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:28:24.516 17:40:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:28:24.516 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:28:24.516 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:28:24.516 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:24.516 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:24.516 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:28:24.516 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:28:24.516 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:24.516 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:24.516 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:24.516 1+0 records in 00:28:24.516 1+0 records out 00:28:24.516 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000244525 s, 16.8 MB/s 00:28:24.516 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:24.516 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:28:24.516 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:24.516 17:40:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:24.516 17:40:35 
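Annotation: nbd_start_disk is exercised in two forms in this test. In this pass (nbd_start_disks_without_nbd_idx) the RPC is given only the bdev name and echoes back whichever /dev/nbdX node SPDK allocated; the later data-verify pass pins each bdev to an explicit node. Both invocations, shown as hedged examples against the same socket:

# Let SPDK pick the next free nbd node and print its path:
./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram
# Or pin the bdev to a specific node:
./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0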
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:28:24.516 17:40:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:28:24.516 17:40:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:28:24.516 17:40:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:28:25.085 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:28:25.085 { 00:28:25.085 "nbd_device": "/dev/nbd0", 00:28:25.085 "bdev_name": "crypto_ram" 00:28:25.085 }, 00:28:25.085 { 00:28:25.085 "nbd_device": "/dev/nbd1", 00:28:25.085 "bdev_name": "crypto_ram2" 00:28:25.085 }, 00:28:25.085 { 00:28:25.085 "nbd_device": "/dev/nbd2", 00:28:25.085 "bdev_name": "crypto_ram3" 00:28:25.085 }, 00:28:25.085 { 00:28:25.085 "nbd_device": "/dev/nbd3", 00:28:25.085 "bdev_name": "crypto_ram4" 00:28:25.085 } 00:28:25.085 ]' 00:28:25.085 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:28:25.085 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:28:25.085 { 00:28:25.085 "nbd_device": "/dev/nbd0", 00:28:25.085 "bdev_name": "crypto_ram" 00:28:25.085 }, 00:28:25.085 { 00:28:25.085 "nbd_device": "/dev/nbd1", 00:28:25.085 "bdev_name": "crypto_ram2" 00:28:25.085 }, 00:28:25.085 { 00:28:25.085 "nbd_device": "/dev/nbd2", 00:28:25.085 "bdev_name": "crypto_ram3" 00:28:25.085 }, 00:28:25.085 { 00:28:25.085 "nbd_device": "/dev/nbd3", 00:28:25.085 "bdev_name": "crypto_ram4" 00:28:25.085 } 00:28:25.085 ]' 00:28:25.085 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:28:25.085 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:28:25.085 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:25.085 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:28:25.085 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:25.086 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:28:25.086 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:25.086 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:28:25.344 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:25.344 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:25.344 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:25.344 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:25.344 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:25.344 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:25.344 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:25.344 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 
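Annotation: the start/stop verification relies on nbd_get_disks returning a JSON array of {nbd_device, bdev_name} pairs; the helper flattens it to a device list with jq and later counts /dev/nbd matches to confirm everything detached. A compact equivalent of that query (the `|| true` mirrors the trace, keeping a zero-match grep from failing the script):

disks=$(./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks | jq -r '.[] | .nbd_device')
count=$(echo "$disks" | grep -c /dev/nbd || true)
echo "attached nbd devices: $count"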
00:28:25.344 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:25.344 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:28:25.604 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:25.604 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:25.604 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:25.604 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:25.604 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:25.604 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:25.604 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:25.604 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:25.604 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:25.604 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:28:25.864 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:28:25.864 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:28:25.864 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:28:25.864 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:25.864 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:25.864 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:28:25.864 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:25.864 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:25.864 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:25.864 17:40:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:28:25.864 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:28:25.864 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:28:25.864 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:28:25.864 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:25.864 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:25.864 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:28:25.864 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:25.864 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:25.864 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:28:25.864 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 
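Annotation: detaching follows the mirror-image pattern traced here: nbd_stop_disk per device, then waitfornbd_exit polls /proc/partitions until the node disappears (bounded at 20 attempts, as in the attach path). A hedged sketch:

for dev in /dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3; do
    ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk "$dev"
    name=$(basename "$dev")
    # Bounded wait for the kernel to drop the partition entry.
    for ((i = 1; i <= 20; i++)); do
        grep -q -w "$name" /proc/partitions || break
        sleep 0.1
    done
done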
00:28:25.864 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:28:26.124 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:28:26.383 /dev/nbd0 00:28:26.383 17:40:37 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:26.383 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:26.383 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:28:26.383 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:28:26.383 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:26.383 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:26.383 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:28:26.383 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:28:26.383 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:26.383 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:26.383 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:26.383 1+0 records in 00:28:26.383 1+0 records out 00:28:26.383 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000247914 s, 16.5 MB/s 00:28:26.383 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:26.383 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:28:26.383 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:26.383 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:26.383 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:28:26.383 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:26.383 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:28:26.383 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:28:26.642 /dev/nbd1 00:28:26.642 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:28:26.642 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:28:26.642 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:28:26.642 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:28:26.642 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:26.642 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:26.642 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:28:26.642 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:28:26.642 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:26.642 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:26.642 17:40:37 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:26.642 1+0 records in 00:28:26.642 1+0 records out 00:28:26.642 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000294752 s, 13.9 MB/s 00:28:26.642 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:26.642 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:28:26.642 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:26.642 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:26.642 17:40:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:28:26.642 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:26.642 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:28:26.642 17:40:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:28:26.901 /dev/nbd10 00:28:26.901 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:28:26.901 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:28:26.901 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:28:26.901 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:28:26.901 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:26.901 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:26.901 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:28:26.901 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:28:26.901 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:26.901 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:26.901 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:26.901 1+0 records in 00:28:26.901 1+0 records out 00:28:26.901 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000287765 s, 14.2 MB/s 00:28:26.901 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:26.901 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:28:26.901 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:26.901 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:26.901 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:28:26.901 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:26.901 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # 
(( i < 4 )) 00:28:26.901 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:28:27.160 /dev/nbd11 00:28:27.160 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:28:27.160 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:28:27.160 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:28:27.160 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:28:27.160 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:27.160 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:27.160 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:28:27.160 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:28:27.160 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:27.160 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:27.160 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:27.160 1+0 records in 00:28:27.160 1+0 records out 00:28:27.160 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000254165 s, 16.1 MB/s 00:28:27.160 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:27.160 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:28:27.160 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:27.160 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:27.160 17:40:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:28:27.160 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:27.160 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:28:27.160 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:28:27.160 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:27.160 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:28:27.728 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:28:27.728 { 00:28:27.728 "nbd_device": "/dev/nbd0", 00:28:27.728 "bdev_name": "crypto_ram" 00:28:27.728 }, 00:28:27.728 { 00:28:27.728 "nbd_device": "/dev/nbd1", 00:28:27.728 "bdev_name": "crypto_ram2" 00:28:27.728 }, 00:28:27.728 { 00:28:27.728 "nbd_device": "/dev/nbd10", 00:28:27.728 "bdev_name": "crypto_ram3" 00:28:27.728 }, 00:28:27.728 { 00:28:27.728 "nbd_device": "/dev/nbd11", 00:28:27.728 "bdev_name": "crypto_ram4" 00:28:27.728 } 00:28:27.728 ]' 00:28:27.728 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 
00:28:27.728 { 00:28:27.728 "nbd_device": "/dev/nbd0", 00:28:27.728 "bdev_name": "crypto_ram" 00:28:27.728 }, 00:28:27.728 { 00:28:27.728 "nbd_device": "/dev/nbd1", 00:28:27.728 "bdev_name": "crypto_ram2" 00:28:27.728 }, 00:28:27.728 { 00:28:27.728 "nbd_device": "/dev/nbd10", 00:28:27.728 "bdev_name": "crypto_ram3" 00:28:27.728 }, 00:28:27.728 { 00:28:27.728 "nbd_device": "/dev/nbd11", 00:28:27.728 "bdev_name": "crypto_ram4" 00:28:27.728 } 00:28:27.728 ]' 00:28:27.728 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:28:27.728 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:28:27.728 /dev/nbd1 00:28:27.728 /dev/nbd10 00:28:27.728 /dev/nbd11' 00:28:27.728 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:28:27.728 /dev/nbd1 00:28:27.728 /dev/nbd10 00:28:27.728 /dev/nbd11' 00:28:27.728 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:28:27.728 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:28:27.728 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:28:27.728 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:28:27.728 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:28:27.728 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:28:27.728 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:28:27.728 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:28:27.728 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:28:27.728 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:28:27.728 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:28:27.728 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:28:27.728 256+0 records in 00:28:27.728 256+0 records out 00:28:27.728 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0125452 s, 83.6 MB/s 00:28:27.729 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:28:27.729 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:28:27.729 256+0 records in 00:28:27.729 256+0 records out 00:28:27.729 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0413241 s, 25.4 MB/s 00:28:27.729 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:28:27.729 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:28:27.729 256+0 records in 00:28:27.729 256+0 records out 00:28:27.729 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.04444 s, 23.6 MB/s 00:28:27.729 17:40:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:28:27.729 17:40:38 blockdev_crypto_aesni.bdev_nbd 
-- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:28:27.729 256+0 records in 00:28:27.729 256+0 records out 00:28:27.729 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0413287 s, 25.4 MB/s 00:28:27.729 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:28:27.729 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:28:27.988 256+0 records in 00:28:27.988 256+0 records out 00:28:27.988 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0349248 s, 30.0 MB/s 00:28:27.988 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:28:27.988 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:28:27.988 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:28:27.988 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:28:27.988 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:28:27.988 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:28:27.988 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:28:27.988 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:28:27.988 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:28:27.988 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:28:27.988 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:28:27.988 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:28:27.988 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:28:27.988 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:28:27.988 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:28:27.988 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:28:27.988 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:28:27.988 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:27.988 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:28:27.988 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:27.988 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:28:27.988 
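Annotation: the nbd_dd_data_verify pass above is a plain write-then-compare: 1 MiB of random data is staged in a temp file, written onto every nbd node with direct I/O, and then cmp re-reads the first 1 MiB of each device against the same file before the temp file is removed. Condensed, with block sizes and flags taken from the trace and the temp path treated as disposable:

randfile=/tmp/nbdrandtest
dd if=/dev/urandom of="$randfile" bs=4096 count=256
for dev in /dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11; do
    dd if="$randfile" of="$dev" bs=4096 count=256 oflag=direct
done
for dev in /dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11; do
    cmp -b -n 1M "$randfile" "$dev"   # any byte mismatch fails the test
done
rm "$randfile"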
17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:27.988 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:28:28.247 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:28.247 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:28.247 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:28.247 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:28.247 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:28.247 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:28.247 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:28.247 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:28.247 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:28.247 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:28:28.247 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:28.247 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:28.247 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:28.247 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:28.247 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:28.247 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:28.247 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:28.247 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:28.247 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:28.247 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:28:28.524 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:28:28.524 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:28:28.524 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:28:28.524 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:28.524 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:28.524 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:28:28.524 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:28.524 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:28.524 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:28.524 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:28:28.842 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:28:28.842 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:28:28.842 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:28:28.842 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:28.842 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:28.842 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:28:28.842 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:28.842 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:28.842 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:28:28.842 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:28.842 17:40:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:28:28.842 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:28:28.842 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:28:28.842 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:28:29.106 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:28:29.106 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:28:29.106 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:28:29.106 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:28:29.106 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:28:29.106 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:28:29.106 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:28:29.106 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:28:29.106 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:28:29.106 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:28:29.106 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:29.106 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:28:29.106 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:28:29.106 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:28:29.106 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:28:29.106 malloc_lvol_verify 00:28:29.106 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:28:29.365 4affc75c-62fc-4026-8e6b-6419ebf981aa 00:28:29.365 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:28:29.624 7862f252-d7f6-4f4c-ab41-5ce2f335456b 00:28:29.624 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:28:29.624 /dev/nbd0 00:28:29.883 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:28:29.883 mke2fs 1.46.5 (30-Dec-2021) 00:28:29.883 Discarding device blocks: 0/4096 done 00:28:29.883 Creating filesystem with 4096 1k blocks and 1024 inodes 00:28:29.883 00:28:29.883 Allocating group tables: 0/1 done 00:28:29.883 Writing inode tables: 0/1 done 00:28:29.883 Creating journal (1024 blocks): done 00:28:29.883 Writing superblocks and filesystem accounting information: 0/1 done 00:28:29.883 00:28:29.883 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:28:29.883 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:28:29.883 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:29.883 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:28:29.883 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:29.883 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:28:29.883 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:29.883 17:40:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:28:29.883 17:40:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:29.883 17:40:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:29.883 17:40:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:29.883 17:40:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:29.883 17:40:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:29.884 17:40:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:29.884 17:40:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:29.884 17:40:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:29.884 17:40:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:28:29.884 17:40:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:28:29.884 17:40:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2943182 00:28:29.884 17:40:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2943182 ']' 00:28:29.884 17:40:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 2943182 00:28:29.884 17:40:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:28:29.884 17:40:41 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:29.884 17:40:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2943182 00:28:30.144 17:40:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:30.144 17:40:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:30.144 17:40:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2943182' 00:28:30.144 killing process with pid 2943182 00:28:30.144 17:40:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2943182 00:28:30.144 17:40:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@972 -- # wait 2943182 00:28:30.404 17:40:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:28:30.404 00:28:30.404 real 0m9.395s 00:28:30.404 user 0m13.150s 00:28:30.404 sys 0m2.636s 00:28:30.404 17:40:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:30.404 17:40:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:28:30.404 ************************************ 00:28:30.404 END TEST bdev_nbd 00:28:30.404 ************************************ 00:28:30.404 17:40:41 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:28:30.404 17:40:41 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:28:30.404 17:40:41 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = nvme ']' 00:28:30.404 17:40:41 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = gpt ']' 00:28:30.404 17:40:41 blockdev_crypto_aesni -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:28:30.404 17:40:41 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:28:30.404 17:40:41 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:30.404 17:40:41 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:30.405 ************************************ 00:28:30.405 START TEST bdev_fio 00:28:30.405 ************************************ 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:28:30.405 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- 
common/autotest_common.sh@1281 -- # local workload=verify 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram4]' 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram4 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:30.405 17:40:41 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:28:30.668 ************************************ 00:28:30.668 START TEST bdev_fio_rw_verify 00:28:30.668 ************************************ 00:28:30.668 17:40:41 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:30.668 17:40:41 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:30.668 17:40:41 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:28:30.668 17:40:41 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:28:30.668 17:40:41 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:28:30.668 17:40:41 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:30.668 17:40:41 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:28:30.668 17:40:41 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:28:30.668 17:40:41 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:28:30.668 17:40:41 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:30.668 17:40:41 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:28:30.668 17:40:41 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:28:30.668 17:40:41 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:28:30.668 17:40:41 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:28:30.668 17:40:41 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:28:30.668 17:40:41 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:30.668 17:40:41 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:28:30.668 17:40:41 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:28:30.668 17:40:41 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:28:30.668 17:40:41 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:28:30.668 17:40:41 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:28:30.668 17:40:41 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:30.929 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:30.929 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:30.929 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:30.929 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:30.929 fio-3.35 00:28:30.929 Starting 4 threads 00:28:45.824 00:28:45.824 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2945624: Mon Jul 15 17:40:54 2024 00:28:45.824 read: IOPS=33.0k, BW=129MiB/s (135MB/s)(1291MiB/10001msec) 00:28:45.824 slat (usec): min=14, max=383, avg=39.24, stdev=26.87 00:28:45.824 clat (usec): min=8, max=2106, avg=210.18, stdev=149.36 00:28:45.824 lat (usec): min=24, max=2261, avg=249.42, stdev=164.85 00:28:45.824 clat percentiles (usec): 00:28:45.824 | 50.000th=[ 169], 99.000th=[ 685], 99.900th=[ 840], 99.990th=[ 988], 00:28:45.824 | 99.999th=[ 1876] 00:28:45.824 write: IOPS=36.3k, BW=142MiB/s (149MB/s)(1380MiB/9730msec); 0 zone resets 00:28:45.824 slat (usec): min=15, max=760, avg=49.49, stdev=27.21 00:28:45.824 clat (usec): min=13, max=2523, avg=276.80, stdev=188.51 00:28:45.824 lat (usec): min=37, max=2784, avg=326.29, stdev=205.21 00:28:45.824 clat percentiles (usec): 00:28:45.824 | 50.000th=[ 235], 99.000th=[ 840], 99.900th=[ 1057], 99.990th=[ 1418], 00:28:45.824 | 99.999th=[ 2245] 00:28:45.824 bw ( KiB/s): min=114688, max=156553, per=97.40%, avg=141465.68, stdev=2597.47, samples=76 00:28:45.824 iops : min=28672, max=39138, avg=35366.37, stdev=649.35, samples=76 00:28:45.824 lat (usec) : 10=0.01%, 20=0.01%, 50=3.78%, 100=14.67%, 250=45.72% 00:28:45.824 lat (usec) : 500=25.57%, 750=8.56%, 1000=1.59% 00:28:45.824 lat (msec) : 2=0.09%, 4=0.01% 00:28:45.824 cpu : usr=99.73%, sys=0.00%, ctx=85, majf=0, minf=259 00:28:45.824 IO depths : 1=10.5%, 2=23.6%, 4=52.7%, 8=13.3%, 16=0.0%, 32=0.0%, >=64=0.0% 00:28:45.824 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:28:45.824 complete : 0=0.0%, 4=88.4%, 8=11.6%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:28:45.824 issued rwts: 
total=330526,353307,0,0 short=0,0,0,0 dropped=0,0,0,0 00:28:45.824 latency : target=0, window=0, percentile=100.00%, depth=8 00:28:45.824 00:28:45.824 Run status group 0 (all jobs): 00:28:45.824 READ: bw=129MiB/s (135MB/s), 129MiB/s-129MiB/s (135MB/s-135MB/s), io=1291MiB (1354MB), run=10001-10001msec 00:28:45.824 WRITE: bw=142MiB/s (149MB/s), 142MiB/s-142MiB/s (149MB/s-149MB/s), io=1380MiB (1447MB), run=9730-9730msec 00:28:45.824 00:28:45.824 real 0m13.317s 00:28:45.824 user 0m49.336s 00:28:45.824 sys 0m0.391s 00:28:45.824 17:40:55 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:45.824 17:40:55 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:28:45.824 ************************************ 00:28:45.824 END TEST bdev_fio_rw_verify 00:28:45.824 ************************************ 00:28:45.824 17:40:55 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:28:45.824 17:40:55 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:28:45.824 17:40:55 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:45.824 17:40:55 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:28:45.824 17:40:55 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:45.824 17:40:55 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:28:45.824 17:40:55 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:28:45.824 17:40:55 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:28:45.824 17:40:55 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:28:45.824 17:40:55 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:28:45.824 17:40:55 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:28:45.824 17:40:55 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:28:45.824 17:40:55 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:45.824 17:40:55 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:28:45.824 17:40:55 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:28:45.824 17:40:55 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:28:45.824 17:40:55 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:28:45.824 17:40:55 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:28:45.825 17:40:55 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "4ccd353f-901d-55a8-9757-eae9412c9b0f"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "4ccd353f-901d-55a8-9757-eae9412c9b0f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "e75eadcb-bbb4-562a-be3a-df290740e004"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "e75eadcb-bbb4-562a-be3a-df290740e004",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "f559dfe7-89b0-5d7d-91f9-459fd16c3cb2"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "f559dfe7-89b0-5d7d-91f9-459fd16c3cb2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' 
"name": "crypto_ram4",' ' "aliases": [' ' "13c076e8-efc3-503d-81ff-470dde760cc9"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "13c076e8-efc3-503d-81ff-470dde760cc9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:28:45.825 17:40:55 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:28:45.825 crypto_ram2 00:28:45.825 crypto_ram3 00:28:45.825 crypto_ram4 ]] 00:28:45.825 17:40:55 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:28:45.825 17:40:55 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "4ccd353f-901d-55a8-9757-eae9412c9b0f"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "4ccd353f-901d-55a8-9757-eae9412c9b0f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "e75eadcb-bbb4-562a-be3a-df290740e004"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "e75eadcb-bbb4-562a-be3a-df290740e004",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": 
true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "f559dfe7-89b0-5d7d-91f9-459fd16c3cb2"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "f559dfe7-89b0-5d7d-91f9-459fd16c3cb2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "13c076e8-efc3-503d-81ff-470dde760cc9"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "13c076e8-efc3-503d-81ff-470dde760cc9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:28:45.825 17:40:55 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:28:45.825 17:40:55 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:28:45.825 17:40:55 
blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:28:45.825 17:40:55 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:28:45.825 17:40:55 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:28:45.825 17:40:55 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:28:45.825 17:40:55 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:28:45.825 17:40:55 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:28:45.825 17:40:55 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:28:45.825 17:40:55 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:28:45.825 17:40:55 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram4]' 00:28:45.825 17:40:55 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram4 00:28:45.825 17:40:55 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:45.825 17:40:55 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:28:45.825 17:40:55 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:45.825 17:40:55 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:28:45.825 ************************************ 00:28:45.825 START TEST bdev_fio_trim 00:28:45.825 ************************************ 00:28:45.825 17:40:55 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:45.825 17:40:55 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:45.826 17:40:55 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:28:45.826 17:40:55 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:28:45.826 17:40:55 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:28:45.826 17:40:55 
blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:45.826 17:40:55 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:28:45.826 17:40:55 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:28:45.826 17:40:55 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:28:45.826 17:40:55 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:45.826 17:40:55 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:28:45.826 17:40:55 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:28:45.826 17:40:55 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:28:45.826 17:40:55 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:28:45.826 17:40:55 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:28:45.826 17:40:55 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:45.826 17:40:55 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:28:45.826 17:40:55 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:28:45.826 17:40:55 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:28:45.826 17:40:55 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:28:45.826 17:40:55 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:28:45.826 17:40:55 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:45.826 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:45.826 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:45.826 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:45.826 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:45.826 fio-3.35 00:28:45.826 Starting 4 threads 00:28:58.056 00:28:58.056 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2947950: Mon Jul 15 17:41:08 2024 00:28:58.056 write: IOPS=58.1k, BW=227MiB/s (238MB/s)(2269MiB/10001msec); 0 zone resets 00:28:58.056 slat (usec): min=14, max=561, avg=42.23, stdev=30.32 00:28:58.056 clat (usec): min=31, max=1028, avg=195.65, stdev=146.29 
00:28:58.056 lat (usec): min=46, max=1234, avg=237.88, stdev=167.42 00:28:58.056 clat percentiles (usec): 00:28:58.056 | 50.000th=[ 141], 99.000th=[ 619], 99.900th=[ 717], 99.990th=[ 824], 00:28:58.056 | 99.999th=[ 1004] 00:28:58.056 bw ( KiB/s): min=224281, max=236528, per=100.00%, avg=232412.68, stdev=692.01, samples=76 00:28:58.056 iops : min=56070, max=59132, avg=58103.47, stdev=173.03, samples=76 00:28:58.056 trim: IOPS=58.1k, BW=227MiB/s (238MB/s)(2269MiB/10001msec); 0 zone resets 00:28:58.056 slat (usec): min=4, max=750, avg= 8.28, stdev= 4.22 00:28:58.056 clat (usec): min=46, max=1013, avg=169.19, stdev=79.00 00:28:58.056 lat (usec): min=51, max=1061, avg=177.47, stdev=79.94 00:28:58.056 clat percentiles (usec): 00:28:58.056 | 50.000th=[ 153], 99.000th=[ 416], 99.900th=[ 498], 99.990th=[ 586], 00:28:58.056 | 99.999th=[ 783] 00:28:58.056 bw ( KiB/s): min=224297, max=236528, per=100.00%, avg=232415.63, stdev=691.23, samples=76 00:28:58.056 iops : min=56074, max=59132, avg=58103.89, stdev=172.82, samples=76 00:28:58.056 lat (usec) : 50=3.34%, 100=20.81%, 250=55.46%, 500=17.66%, 750=2.71% 00:28:58.056 lat (usec) : 1000=0.02% 00:28:58.056 lat (msec) : 2=0.01% 00:28:58.056 cpu : usr=99.74%, sys=0.00%, ctx=50, majf=0, minf=102 00:28:58.056 IO depths : 1=8.5%, 2=22.3%, 4=55.3%, 8=13.8%, 16=0.0%, 32=0.0%, >=64=0.0% 00:28:58.056 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:28:58.056 complete : 0=0.0%, 4=87.9%, 8=12.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:28:58.056 issued rwts: total=0,580812,580812,0 short=0,0,0,0 dropped=0,0,0,0 00:28:58.056 latency : target=0, window=0, percentile=100.00%, depth=8 00:28:58.056 00:28:58.056 Run status group 0 (all jobs): 00:28:58.056 WRITE: bw=227MiB/s (238MB/s), 227MiB/s-227MiB/s (238MB/s-238MB/s), io=2269MiB (2379MB), run=10001-10001msec 00:28:58.056 TRIM: bw=227MiB/s (238MB/s), 227MiB/s-227MiB/s (238MB/s-238MB/s), io=2269MiB (2379MB), run=10001-10001msec 00:28:58.056 00:28:58.056 real 0m13.277s 00:28:58.056 user 0m53.100s 00:28:58.056 sys 0m0.359s 00:28:58.056 17:41:08 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:58.056 17:41:08 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:28:58.056 ************************************ 00:28:58.056 END TEST bdev_fio_trim 00:28:58.056 ************************************ 00:28:58.056 17:41:08 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:28:58.056 17:41:08 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:28:58.056 17:41:08 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:58.056 17:41:08 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:28:58.056 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:58.056 17:41:08 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:28:58.056 00:28:58.056 real 0m26.944s 00:28:58.056 user 1m42.629s 00:28:58.056 sys 0m0.926s 00:28:58.056 17:41:08 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:58.056 17:41:08 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:28:58.057 ************************************ 00:28:58.057 END TEST bdev_fio 00:28:58.057 ************************************ 00:28:58.057 17:41:08 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 
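Note: the bdev_fio rw-verify and trim runs traced above both go through the SPDK fio bdev plugin. A minimal shell sketch of that invocation, assembled only from the flags visible in the xtrace lines (SPDK_DIR is shorthand introduced here for brevity; it assumes fio and the SPDK fio plugin are already built in this workspace):

  # Minimal sketch of the fio_bdev call traced above; all flags are taken
  # from the xtrace output, only SPDK_DIR is introduced to shorten paths.
  SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
  LD_PRELOAD=$SPDK_DIR/build/fio/spdk_bdev \
  /usr/src/fio/fio \
      --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
      --verify_state_save=0 \
      --spdk_json_conf=$SPDK_DIR/test/bdev/bdev.json \
      --aux-path=$SPDK_DIR/../output \
      $SPDK_DIR/test/bdev/bdev.fio

As echoed by blockdev.sh above, the generated bdev.fio carries one [job_crypto_ram*] section per crypto bdev (filename=crypto_ram, crypto_ram2, ...), which fio reports as rw=randwrite for the verify pass and rw=trimwrite for the trim pass.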
00:28:58.057 17:41:08 blockdev_crypto_aesni -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:28:58.057 17:41:08 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:28:58.057 17:41:08 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:28:58.057 17:41:08 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:58.057 17:41:08 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:58.057 ************************************ 00:28:58.057 START TEST bdev_verify 00:28:58.057 ************************************ 00:28:58.057 17:41:08 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:28:58.057 [2024-07-15 17:41:08.685756] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:28:58.057 [2024-07-15 17:41:08.685814] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2949786 ] 00:28:58.057 [2024-07-15 17:41:08.777017] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:28:58.057 [2024-07-15 17:41:08.847634] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:58.057 [2024-07-15 17:41:08.847639] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:58.057 [2024-07-15 17:41:08.868761] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:28:58.057 [2024-07-15 17:41:08.876802] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:58.057 [2024-07-15 17:41:08.884813] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:58.057 [2024-07-15 17:41:08.970753] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:28:59.973 [2024-07-15 17:41:11.247731] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:28:59.973 [2024-07-15 17:41:11.247823] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:59.973 [2024-07-15 17:41:11.247833] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:59.973 [2024-07-15 17:41:11.255741] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:28:59.973 [2024-07-15 17:41:11.255754] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:59.973 [2024-07-15 17:41:11.255760] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:59.973 [2024-07-15 17:41:11.263763] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:28:59.973 [2024-07-15 17:41:11.263773] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:28:59.973 [2024-07-15 17:41:11.263779] vbdev_crypto.c: 
617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:00.235 [2024-07-15 17:41:11.271786] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:29:00.235 [2024-07-15 17:41:11.271799] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:29:00.235 [2024-07-15 17:41:11.271805] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:00.235 Running I/O for 5 seconds... 00:29:05.522 00:29:05.522 Latency(us) 00:29:05.522 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:05.522 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:29:05.522 Verification LBA range: start 0x0 length 0x1000 00:29:05.522 crypto_ram : 5.05 590.23 2.31 0.00 0.00 216043.64 1613.19 127442.31 00:29:05.522 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:29:05.522 Verification LBA range: start 0x1000 length 0x1000 00:29:05.522 crypto_ram : 5.06 489.41 1.91 0.00 0.00 260230.13 2419.79 151640.22 00:29:05.522 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:29:05.522 Verification LBA range: start 0x0 length 0x1000 00:29:05.522 crypto_ram2 : 5.05 593.22 2.32 0.00 0.00 214555.26 2003.89 121796.14 00:29:05.522 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:29:05.522 Verification LBA range: start 0x1000 length 0x1000 00:29:05.522 crypto_ram2 : 5.07 493.95 1.93 0.00 0.00 257479.60 2697.06 145187.45 00:29:05.522 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:29:05.522 Verification LBA range: start 0x0 length 0x1000 00:29:05.522 crypto_ram3 : 5.04 4643.96 18.14 0.00 0.00 27345.71 2545.82 23391.31 00:29:05.522 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:29:05.522 Verification LBA range: start 0x1000 length 0x1000 00:29:05.522 crypto_ram3 : 5.05 3829.13 14.96 0.00 0.00 33108.85 6906.49 26416.05 00:29:05.522 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:29:05.522 Verification LBA range: start 0x0 length 0x1000 00:29:05.522 crypto_ram4 : 5.04 4643.08 18.14 0.00 0.00 27296.41 2760.07 22786.36 00:29:05.522 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:29:05.522 Verification LBA range: start 0x1000 length 0x1000 00:29:05.522 crypto_ram4 : 5.05 3846.04 15.02 0.00 0.00 32915.72 2129.92 26214.40 00:29:05.522 =================================================================================================================== 00:29:05.522 Total : 19129.02 74.72 0.00 0.00 53183.70 1613.19 151640.22 00:29:05.522 00:29:05.522 real 0m8.068s 00:29:05.522 user 0m15.423s 00:29:05.522 sys 0m0.348s 00:29:05.522 17:41:16 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:05.522 17:41:16 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:29:05.522 ************************************ 00:29:05.522 END TEST bdev_verify 00:29:05.522 ************************************ 00:29:05.522 17:41:16 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:29:05.522 17:41:16 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 
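Note: both bdevperf passes that follow reuse the same bdev.json as the fio runs. A shell sketch of the two verify invocations, taken directly from the run_test lines above (the trailing '' in run_test is its empty extra-args slot; SPDK_DIR is shorthand only):

  # bdevperf verify passes as traced above; SPDK_DIR is introduced for brevity.
  SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
  # 4 KiB verify pass (TEST bdev_verify)
  $SPDK_DIR/build/examples/bdevperf --json $SPDK_DIR/test/bdev/bdev.json \
      -q 128 -o 4096 -w verify -t 5 -C -m 0x3
  # 64 KiB verify pass (TEST bdev_verify_big_io)
  $SPDK_DIR/build/examples/bdevperf --json $SPDK_DIR/test/bdev/bdev.json \
      -q 128 -o 65536 -w verify -t 5 -C -m 0x3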
00:29:05.522 17:41:16 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:29:05.522 17:41:16 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:05.522 17:41:16 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:29:05.522 ************************************ 00:29:05.522 START TEST bdev_verify_big_io 00:29:05.522 ************************************ 00:29:05.522 17:41:16 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:29:05.783 [2024-07-15 17:41:16.825222] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:29:05.783 [2024-07-15 17:41:16.825270] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2951022 ] 00:29:05.783 [2024-07-15 17:41:16.912870] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:05.783 [2024-07-15 17:41:16.990294] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:05.783 [2024-07-15 17:41:16.990300] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:05.783 [2024-07-15 17:41:17.011492] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:29:05.783 [2024-07-15 17:41:17.019516] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:29:05.783 [2024-07-15 17:41:17.027539] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:29:06.042 [2024-07-15 17:41:17.114299] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:29:08.579 [2024-07-15 17:41:19.275541] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:29:08.579 [2024-07-15 17:41:19.275605] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:08.579 [2024-07-15 17:41:19.275613] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:08.579 [2024-07-15 17:41:19.283556] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:29:08.579 [2024-07-15 17:41:19.283567] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:08.579 [2024-07-15 17:41:19.283572] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:08.579 [2024-07-15 17:41:19.291578] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:29:08.579 [2024-07-15 17:41:19.291588] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:29:08.579 [2024-07-15 17:41:19.291594] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:08.579 [2024-07-15 17:41:19.299595] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:29:08.579 [2024-07-15 17:41:19.299606] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:29:08.579 [2024-07-15 17:41:19.299612] 
vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:08.579 Running I/O for 5 seconds... 00:29:09.150 [2024-07-15 17:41:20.196824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.197231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.197365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.197408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.197445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.197737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.197749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.199454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.199520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.199556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.199606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.200254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.200295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.200333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.200370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.200854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.200865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.202432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.202479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.202520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.202561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.203107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.203147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.150 [2024-07-15 17:41:20.203194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.203230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.203671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.203681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.204926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.204969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.205007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.205044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.205614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.205653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.205690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.205732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.206146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.206156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.207272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.207319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.207359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.207395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.207934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.207991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.208027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.208067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.208548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.208558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.150 [2024-07-15 17:41:20.209642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.209687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.209729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.209769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.210259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.210298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.210334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.210371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.210791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.210802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.212026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.212080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.212115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.212153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.212671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.212715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.212752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.212787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.213048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.213057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.213915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.213958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.213994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.214030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.150 [2024-07-15 17:41:20.214420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.214459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.214495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.214531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.214795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.214806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.215911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.215957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.215994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.216029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.216474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.216516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.216553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.216589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.216903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.216914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.217837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.217881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.217917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.217953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.218332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.218371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.218406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.218442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.150 [2024-07-15 17:41:20.218715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.218725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.222016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.222062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.222098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.223905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.223950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.223986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.225729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.225774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.225810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.227363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.227408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.227444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.229222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.229266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.229302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.231054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.231099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.231135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.232935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.232980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.233017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.234672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.150 [2024-07-15 17:41:20.234721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.234758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.236457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.236501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.236538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.238328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.238373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.238408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.240080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.240124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.240160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.242104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.242148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.242184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.243878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.243923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.243959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.245745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.245790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.245826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.247367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.247411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.247448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.249026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.150 [2024-07-15 17:41:20.249075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.249111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.150 [2024-07-15 17:41:20.250922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.250967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.251005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.252750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.252795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.252832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.255161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.255207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.255242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.256869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.256914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.256950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.259284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.259330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.259368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.259673] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.259730] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.259775] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.259818] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.260967] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.262376] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.263998] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:29:09.151 [2024-07-15 17:41:20.265619] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.266002] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.266828] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.267209] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.268900] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.271356] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.272983] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.274616] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.276253] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.276976] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.277829] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.279431] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.281058] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.283845] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.285536] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.287168] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.287594] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.289846] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.291562] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.293211] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.294844] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.297651] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.299052] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.299430] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:29:09.151 [2024-07-15 17:41:20.300510] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.302528] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.304157] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.305457] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.307229] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.308674] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.309064] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.310794] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.312598] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.314747] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.316099] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.317724] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.319339] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.321828] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.323446] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.325065] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.326693] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.328680] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.330333] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.332059] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.333879] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.337442] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.339417] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.341356] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:29:09.151 [2024-07-15 17:41:20.342848] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.344886] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.346519] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.347300] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.347679] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.350521] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.351778] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.353401] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.355038] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.357042] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.357438] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.358305] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.359896] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.362754] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.364480] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.366149] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.367777] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.368634] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.370458] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.372227] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.373933] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.376771] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.378405] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.379821] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:29:09.151 [2024-07-15 17:41:20.380201] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.382148] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.383773] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.385399] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.386822] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:09.151 [2024-07-15 17:41:20.387936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.389875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.390248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.390722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.401030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.402044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.403646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.413616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.415221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.416846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.428233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.429875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.431490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.443327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.151 [2024-07-15 17:41:20.445192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.446564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.456296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.458122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.459822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.439 [2024-07-15 17:41:20.468783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.470540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.472355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.481654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.483275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.484796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.495234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.496926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.497299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.504511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.504896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.505268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.509630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.510017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.510390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.514603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.515000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.515373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.519901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.520282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.520655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.524852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.525243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.525615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.439 [2024-07-15 17:41:20.530289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.530671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.531046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.535135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.535517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.535891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.540520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.540905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.541282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.545206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.545591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.545967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.552304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.552685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.553323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.560985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.562842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.563507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.574858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.575240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.575639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.585729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.586637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.586668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.439 [2024-07-15 17:41:20.588273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.590502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.591683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.593635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.595517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.596780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.596824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.596861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.596903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.597488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.597527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.597563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.597598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.598734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.598777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.598817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.598853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.599284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.599323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.599360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.599396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.600559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.600602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.600638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.439 [2024-07-15 17:41:20.600674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.601247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.601288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.601324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.601360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.602626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.602674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.602716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.602752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.603101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.603140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.603175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.603211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.604384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.604427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.604463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.604499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.604854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.604893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.604929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.604965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.606167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.606213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.439 [2024-07-15 17:41:20.606250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.439 [2024-07-15 17:41:20.606286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.606631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.606685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.606725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.606761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.607821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.607864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.607916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.607951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.608302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.608341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.608376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.608412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.610027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.610070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.610112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.610150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.610516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.610554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.610590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.610630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.611696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.611744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.611780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.440 [2024-07-15 17:41:20.611815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.612174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.612214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.612251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.613942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.615185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.615239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.615275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.616126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.616164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.616473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.616582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.616620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.616656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.616692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.616704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.617001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.617999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.618041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.618077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.618113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.618404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.618496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.618534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.440 [2024-07-15 17:41:20.618570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.618606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.618939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.619929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.619971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.620007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.620055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.620571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.620724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.620763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.620800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.620840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.621136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.622027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.622069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.622105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.622141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.622488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.622579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.622619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.622654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.622690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.622984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.623897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.440 [2024-07-15 17:41:20.623939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.623976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.624013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.624490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.624581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.624618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.624666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.624701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.625146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.625968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.626010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.626046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.626082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.626341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.626432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.626470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.626506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.626546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.626985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.627858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.627909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.627945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.627984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.628244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.440 [2024-07-15 17:41:20.628335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.628372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.628409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.628445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.628872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.629925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.629972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.630008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.630044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.630306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.630396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.630436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.630472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.630508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.630770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.631658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.631699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.631742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.631779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.632071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.632162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.632198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.632234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.632271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.440 [2024-07-15 17:41:20.632532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.633591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.633633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.633669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.633706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.633970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.634061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.634100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.634136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.634172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.634459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.635274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.635317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.635353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.635393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.635652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.635747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.635785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.635821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.635857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.636172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.637050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.637092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.440 [2024-07-15 17:41:20.637128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.440 [2024-07-15 17:41:20.637164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:09.440 [... the same *ERROR* line from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeats continuously from 17:41:20.637 through 17:41:21.016 ...]
00:29:09.978 [2024-07-15 17:41:21.016869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:09.978 [2024-07-15 17:41:21.017213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.978 [2024-07-15 17:41:21.017302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.978 [2024-07-15 17:41:21.017339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.017375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.017411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.017759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.018652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.018694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.018734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.018772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.019031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.019122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.019175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.019214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.019250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.019521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.020394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.020438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.020475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.020518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.020782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.020874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.020915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.020952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.979 [2024-07-15 17:41:21.022167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.022636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.025853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.025898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.025935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.025971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.026232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.026341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.026378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.026415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.026451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.026802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.027608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.027650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.027687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.027727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.027989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.029177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.029219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.029255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.029290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.029655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.030554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.030599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.979 [2024-07-15 17:41:21.030635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.030671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.030980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.031072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.031110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.031150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.031186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.031445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.032343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.032385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.032421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.032456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.032721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.032812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.032851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.032890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.032927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.033234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.034852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.034895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.034932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.034970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.035256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.035365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.979 [2024-07-15 17:41:21.035403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.035439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.035475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.035796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.036474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.036515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.036551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.036587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.036867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.036960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.036999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.037039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.037074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.037414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.038253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.038299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.038336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.038371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.038774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.038863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.038901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.038939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.038975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.039234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.979 [2024-07-15 17:41:21.040137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.040179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.979 [2024-07-15 17:41:21.040224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.040260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.040520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.040619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.040656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.040696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.040743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.041001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.041821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.041864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.041900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.041936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.042229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.042319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.042369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.042406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.042449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.042861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.043804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.043854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.043890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.043926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.980 [2024-07-15 17:41:21.044262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.044351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.044388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.044424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.044462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.044725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.045734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.045780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.045816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.045852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.046161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.046249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.046287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.046324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.046366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.046624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.047666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.047715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.047753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.047789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.048049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.048139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.048177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.048216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.980 [2024-07-15 17:41:21.048256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.048565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.049375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.049418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.049454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.049489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.049837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.049928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.049966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.050002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.050039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.050341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.051205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.051247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.051284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.051319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.051706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.051803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.051841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.051877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.980 [2024-07-15 17:41:21.051914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.052171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.053042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.053088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.981 [2024-07-15 17:41:21.053125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.053161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.053419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.053513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.053550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.053586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.053622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.053890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.054682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.054730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.054766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.054802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.055093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.055186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.055238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.055278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.055314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.055708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.056651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.056693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.056734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.056770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.057109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.057201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.981 [2024-07-15 17:41:21.057239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.057275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.057314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.057572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.058554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.058597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.058633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.058669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.058970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.059065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.059102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.059138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.059177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.059438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.060491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.060537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.060574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.060610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.060881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.060975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.061013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.061050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.061086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.061390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.981 [2024-07-15 17:41:21.062205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.062247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.062283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.062319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.062651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.062749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.062788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.062828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.062864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.063167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.064028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.064070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.064106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.064142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.981 [2024-07-15 17:41:21.064500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.064592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.064630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.064666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.064702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.064966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.065828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.065873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.065909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.067249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.982 [2024-07-15 17:41:21.067510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.067617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.067656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.067714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.067751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.068011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.068826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.069205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.069675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.071290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.071552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.071654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.071693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.071734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.071932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.074350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.075953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.077795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.078342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.078723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.078863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.080528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.082142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.083760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.982 [2024-07-15 17:41:21.084023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.086688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.088546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.089874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.090261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.090731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.092434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.094052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.095899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.097365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.097627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.100465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.101059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.101433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.102971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.103279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.104976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.105895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.107483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.109199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.109462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.111437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.112575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.113624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.982 [2024-07-15 17:41:21.114656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.115115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.115802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.117244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.118980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.120085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.120411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.121745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.122126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.982 [2024-07-15 17:41:21.122505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.983 [2024-07-15 17:41:21.122882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.983 [2024-07-15 17:41:21.123223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.983 [2024-07-15 17:41:21.123679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.983 [2024-07-15 17:41:21.124064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.983 [2024-07-15 17:41:21.124439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.983 [2024-07-15 17:41:21.124817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.983 [2024-07-15 17:41:21.125359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.983 [2024-07-15 17:41:21.126836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.983 [2024-07-15 17:41:21.127218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.983 [2024-07-15 17:41:21.127594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.983 [2024-07-15 17:41:21.127971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.983 [2024-07-15 17:41:21.128403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.983 [2024-07-15 17:41:21.128867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:09.983 [2024-07-15 17:41:21.129245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:09.983 [2024-07-15 17:41:21.129622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:09.983 [... identical *ERROR* line from accel_dpdk_cryptodev.c:468 (accel_dpdk_cryptodev_task_alloc_resources: "Failed to get src_mbufs!") repeated several hundred times between 17:41:21.129622 and 17:41:21.460555 ...]
00:29:10.257 [2024-07-15 17:41:21.460555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:10.257 [2024-07-15 17:41:21.460874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.460970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.461008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.461045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.461082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.461340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.462418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.462462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.462499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.462534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.462988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.463080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.463118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.463159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.463211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.463750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.464875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.464919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.464970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.465007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.465476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.465568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.465618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.465657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:10.257 [2024-07-15 17:41:21.465693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.466158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.467130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.467176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.467214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.467249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.467628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.467730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.467768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.467804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.467840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.468237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.469283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.469328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.469364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.469400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.469800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.469892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.469929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.257 [2024-07-15 17:41:21.469965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.470001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.470316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.471653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.471698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:10.258 [2024-07-15 17:41:21.471742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.471780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.472076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.472185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.472222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.472262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.472298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.472848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.474751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.474797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.474835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.475206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.475641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.475768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.475811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.475859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.475895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.476362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.477661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.477755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.477795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.477833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.477869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.478354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:10.258 [2024-07-15 17:41:21.479563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.479950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.480325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.480700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.481174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.481296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.481347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.481383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.481794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.483313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.483696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.484078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.484454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.484907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.485027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.485401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.485781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.486157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.486563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.488343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.488732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.489799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.490835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.491101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:10.258 [2024-07-15 17:41:21.491838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.492217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.492589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.494478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.494995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.496250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.497790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.498329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.500199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.500703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.501181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.502094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.503268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.504691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.505108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.507960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.508556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.510075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.510451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.510860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.512149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.513020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.514761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.515136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:10.258 [2024-07-15 17:41:21.515676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.517747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.518732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.519110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.519482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.258 [2024-07-15 17:41:21.519751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.259 [2024-07-15 17:41:21.520373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.259 [2024-07-15 17:41:21.521903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.259 [2024-07-15 17:41:21.522281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.259 [2024-07-15 17:41:21.522654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.259 [2024-07-15 17:41:21.522919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.259 [2024-07-15 17:41:21.524657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.521 [2024-07-15 17:41:21.604460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.521 [2024-07-15 17:41:21.614609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.521 [2024-07-15 17:41:21.614658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.521 [2024-07-15 17:41:21.616418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.521 [2024-07-15 17:41:21.616456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.521 [2024-07-15 17:41:21.616778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.521 [2024-07-15 17:41:21.617258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.521 [2024-07-15 17:41:21.619523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.521 [2024-07-15 17:41:21.621284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.521 [2024-07-15 17:41:21.623034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.521 [2024-07-15 17:41:21.624459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.521 [2024-07-15 17:41:21.624848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:10.521 [2024-07-15 17:41:21.626255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.521 [2024-07-15 17:41:21.626291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.521 [2024-07-15 17:41:21.628015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.629764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.631419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.631830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.631838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.631846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.634984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.636428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.638181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.639930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.641716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.643131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.644852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.646604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.646826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.646835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.646842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.648065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.649960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.651660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.653409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.654573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:10.522 [2024-07-15 17:41:21.656230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.657660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.659402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.659619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.659627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.659634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.660722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.661069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.662888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.664606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.666622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.667491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.669136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.670544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.670765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.670774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.670785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.671757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.672104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.672443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.674334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.676371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.678122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.679038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:10.522 [2024-07-15 17:41:21.680522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.680802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.680810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.680817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.683278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.683625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.683968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.684308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.685991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.687748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.689491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.690585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.690806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.690814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.690821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.693227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.695018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.695358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.695697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.697695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.699114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.700857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.702597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.702823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:10.522 [2024-07-15 17:41:21.702831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.702838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.705519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.707477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.709428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.709772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.710628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.712328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.713765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.715499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.715720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.715728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.715735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.717879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.719644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.721421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.723244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.724041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.724384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.726206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.727683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.727903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.727911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.727918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:10.522 [2024-07-15 17:41:21.729429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.730857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.732596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.734348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.734956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.522 [2024-07-15 17:41:21.735304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.735643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.737406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.737686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.737694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.737701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.740122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.740880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.742312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.744062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.746075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.746494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.746834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.747183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.747399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.747408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.747415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.750039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.751953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:10.523 [2024-07-15 17:41:21.752543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.754081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.756087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.757856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.758422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.758762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.759203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.759211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.759218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.761675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.763507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.765338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.766019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.767937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.769674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.771404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.771959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.772505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.772515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.772524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.774692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.776467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.778221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.779980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:10.523 [2024-07-15 17:41:21.782187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.783960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.785700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.787450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.787834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.787842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.787849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.789721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.791157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.792891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.794644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.795390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.796999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.798942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.800864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.801087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.801103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.801110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.802517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.803720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.805155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.806888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.808975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:10.523 [2024-07-15 17:41:21.809685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:10.523 [2024-07-15 17:41:21.811121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:10.523 [... identical *ERROR* line repeated continuously, differing only in timestamp, from 17:41:21.811121 through 17:41:22.183248; duplicate entries omitted ...]
00:29:11.054 [2024-07-15 17:41:22.183248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:11.054 [2024-07-15 17:41:22.183523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.054 [2024-07-15 17:41:22.184519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.054 [2024-07-15 17:41:22.184557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.054 [2024-07-15 17:41:22.184899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.054 [2024-07-15 17:41:22.184930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.054 [2024-07-15 17:41:22.185225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.054 [2024-07-15 17:41:22.185234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.054 [2024-07-15 17:41:22.185327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.054 [2024-07-15 17:41:22.186750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.054 [2024-07-15 17:41:22.186780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.054 [2024-07-15 17:41:22.188367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.054 [2024-07-15 17:41:22.188587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.054 [2024-07-15 17:41:22.190780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.054 [2024-07-15 17:41:22.190817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.054 [2024-07-15 17:41:22.192250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.054 [2024-07-15 17:41:22.192281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.054 [2024-07-15 17:41:22.192553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.054 [2024-07-15 17:41:22.192561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.054 [2024-07-15 17:41:22.192655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.054 [2024-07-15 17:41:22.194389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.054 [2024-07-15 17:41:22.194420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.054 [2024-07-15 17:41:22.195416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.054 [2024-07-15 17:41:22.195850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.054 [2024-07-15 17:41:22.197942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:11.054 [2024-07-15 17:41:22.197979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.054 [2024-07-15 17:41:22.199543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.054 [2024-07-15 17:41:22.199573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.054 [2024-07-15 17:41:22.199786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.199794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.199903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.201845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.201877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.202852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.203063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.205532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.205569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.206585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.206628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.207114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.207124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.207232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.207575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.207605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.209583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.209854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.212443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.212481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.213453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:11.055 [2024-07-15 17:41:22.213483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.213744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.213752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.213858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.215329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.215360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.217086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.217299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.218787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.218824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.220681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.220718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.220975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.220984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.221077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.222508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.222539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.224276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.224565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.226984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.227021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.228922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.228953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.229167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:11.055 [2024-07-15 17:41:22.229176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.229271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.229610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.229641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.229983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.230261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.232436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.232473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.234035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.234066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.234351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.234360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.234467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.235897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.235927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.237665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.237880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.238890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.238929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.240812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.240841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.241052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.241060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.241156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:11.055 [2024-07-15 17:41:22.241825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.241857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.243427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.243638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.245999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.246037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.246382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.246413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.246761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.246769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.246887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.247847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.247879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.249163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.249375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.251845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.251882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.253401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.253432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.253643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.253651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.253750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.254438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.254469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:11.055 [2024-07-15 17:41:22.254810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.255282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.257907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.257944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.258953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.258984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.259285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.055 [2024-07-15 17:41:22.259294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.259390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.260862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.260893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.262605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.262823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.264313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.264350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.265728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.265758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.266029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.266038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.266133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.267374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.267405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.268550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.268849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:11.056 [2024-07-15 17:41:22.269992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.270029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.270368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.270399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.270611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.270620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.270741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.272037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.272068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.272760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.273024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.274035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.274072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.274410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.274441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.274758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.274767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.274863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.276636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.276669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.278544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.278891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.281068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.281105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:11.056 [2024-07-15 17:41:22.281444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.281475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.281884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.281892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.281997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.282336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.282367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.283664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.283939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.285273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.285620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.285652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.285994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.286095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.287873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.287904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.289339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.289654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.290730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.290767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.292609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.292647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.292886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.292984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:11.056 [2024-07-15 17:41:22.293629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.293661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.295449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.295663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.296719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.296755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.297093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.297127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.297502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.297602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.297947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.297978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.298316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.298772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.301634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.301672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.302013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.302044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.302053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.302289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.302298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.302397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.302739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.302770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:11.056 [2024-07-15 17:41:22.303404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.303617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.304471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.304820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.304851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.305190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.305400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.305409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.305518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.306023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.306055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.056 [2024-07-15 17:41:22.306392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.306892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.307856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.307893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.307922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.307951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.308240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.308250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.308334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.308366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.308396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.308426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.308774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:11.057 [2024-07-15 17:41:22.309622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.309657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.309687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.309721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.310061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.310070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.310151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.310182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.310213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.310242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.310451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.311526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.311561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.311603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.311632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.312148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.312157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.312290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.312322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.312352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.312386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.312818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.313689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.313728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:11.057 [2024-07-15 17:41:22.313758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.313787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.314203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.314211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.314290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.314320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.314349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.314379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.314836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.315831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.315867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.315898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.315927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.316256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.316265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.316343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.316375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.316405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.316434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.316777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.317998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.318036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.318066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.318095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:11.057 [2024-07-15 17:41:22.318477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.318486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.318559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.318589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.318618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.318647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.319028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.319764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.319799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.319828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.319867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.320216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.320225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.320299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.320334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.320363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.320392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.320809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.322379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.322417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.057 [2024-07-15 17:41:22.322447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.058 [2024-07-15 17:41:22.322478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.058 [2024-07-15 17:41:22.322820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.058 [2024-07-15 17:41:22.322837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:11.058 [2024-07-15 17:41:22.322935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... the same accel_dpdk_cryptodev.c:468 accel_dpdk_cryptodev_task_alloc_resources "Failed to get src_mbufs!" error repeats for every log entry from 17:41:22.322935 through 17:41:22.796366 ...]
00:29:11.588 [2024-07-15 17:41:22.796366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:11.588 [2024-07-15 17:41:22.796412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.796449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.796485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.796879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.796968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.797985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.798023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.799786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.800280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.800290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.801115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.801161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.801199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.801234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.801546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.801632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.803126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.803164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.804226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.804493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.804504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.807903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.807949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.809624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:11.588 [2024-07-15 17:41:22.809662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.809964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.810073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.810843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.810882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.811986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.812261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.812271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.814705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.814754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.815342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.815379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.815640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.815760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.816670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.816713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.816751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.817172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.817182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.819482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.819526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.820101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.820140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.820402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:11.588 [2024-07-15 17:41:22.822395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.822436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.823729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.823767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.824177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.824186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.588 [2024-07-15 17:41:22.828982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.829028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.830774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.830811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.831149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.833188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.833229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.834035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.834072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.834400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.834411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.839109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.839155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.840030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.840067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.840345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.841658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.841699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:11.589 [2024-07-15 17:41:22.842334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.842371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.842762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.842772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.848185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.848231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.849086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.849123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.849384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.849841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.849888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.851621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.851658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.851923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.851941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.855619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.855665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.856744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.856782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.857084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.858897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.858938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.859686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.859727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:11.589 [2024-07-15 17:41:22.859989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.859999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.864101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.864553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.864591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.866391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.866652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.867144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.867185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.868913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.868951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.869260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.869270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.874130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.876097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.589 [2024-07-15 17:41:22.878007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.878703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.878971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.879082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.880315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.880359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.882257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.882775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.882784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:11.851 [2024-07-15 17:41:22.888382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.890122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.890160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.890196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.890581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.890703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.892292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.892330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.893485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.893841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.893851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.898576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.898661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.898771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.900460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.900499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.901069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.901332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.901343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.904541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.906339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.908104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.909436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.909854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:11.851 [2024-07-15 17:41:22.909977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.911904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.911943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.912315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.912578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.912588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.918743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.919167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.920884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.921296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.921558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.921665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.923230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.923269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.924519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.924785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.924796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.928161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.930003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.931096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.932627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.932657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.932923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.932933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:11.851 [2024-07-15 17:41:22.933038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.934045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.934084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.934121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.934560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.934570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.939111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.940979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.942837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.942876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.942905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.943194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.943205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.943213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.943903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.943945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.945658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.945697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.946085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.851 [2024-07-15 17:41:22.946096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.949396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.949483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.949493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.949501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:11.852 [2024-07-15 17:41:22.950745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.950787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.951490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.951538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.951804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.951815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.955005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.956744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.958081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.959799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.960358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.960374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.960384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.962373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.962414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.962792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.962830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.963091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.963102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.964991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.966742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.967617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.969340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.969780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:11.852 [2024-07-15 17:41:22.969790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.969799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.971141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.971182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.972231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.972269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.972586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.972596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.973792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.975419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.975930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.977752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.978016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.978028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.978036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.979441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.979482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.981366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.981403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.981665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.981676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.987145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.988746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.990592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:11.852 [2024-07-15 17:41:22.990905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.990915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.992912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.992953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.994789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.994831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.995091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.995102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:22.999127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.000748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.002345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.004184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.004448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.004458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.006105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.006187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.006197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.011229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.012941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.013716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.015320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.015596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.015606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.015658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:11.852 [2024-07-15 17:41:23.017512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.019087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.020672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.020978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.020989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.025483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.026389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.027844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.029430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.029735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.029746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.031672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.032628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.034225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.035828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.036091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.036101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.039964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.041578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.043181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.045019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.045314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.045324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.047243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:11.852 [2024-07-15 17:41:23.048989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.050647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.052487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.052884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.052895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.057672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.059423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.061329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.062534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.062852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.062862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.064536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.066386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.067477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.069082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.069477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.069488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.075261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.076364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.078232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.080183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.080448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.080459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:11.852 [2024-07-15 17:41:23.082416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:11.852 [2024-07-15 17:41:23.083705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[the same accel_dpdk_cryptodev.c:468 accel_dpdk_cryptodev_task_alloc_resources "*ERROR*: Failed to get src_mbufs!" entry repeats several hundred times; log timestamps run from 2024-07-15 17:41:23.084490 through 17:41:23.421328 while the Jenkins console timestamps advance from 00:29:11.852 to 00:29:12.382]
00:29:12.382 [2024-07-15 17:41:23.421448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:12.382 [2024-07-15 17:41:23.422935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.422974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.424735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.425139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.425148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.429826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.429873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.430943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.430988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.431249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.431260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.431370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.433251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.433304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.433678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.434173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.434185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.438914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.438961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.439941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.439980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.440274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.440283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.440395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:12.382 [2024-07-15 17:41:23.442002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.442041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.443935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.444345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.444356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.449526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.449573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.451412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.451451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.451868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.451879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.452015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.453609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.453649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.455456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.455726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.455737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.461146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.461192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.461572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.461609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.462006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.462016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.462133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:12.382 [2024-07-15 17:41:23.463742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.463781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.465386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.465649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.465659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.469487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.471055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.471094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.471477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.471933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.471944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.472064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.473666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.473705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.473750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.474114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.474124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.478806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.480641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.482285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.482659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.483162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.483172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.484941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:12.382 [2024-07-15 17:41:23.484982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.486577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.382 [2024-07-15 17:41:23.486616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.486882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.486892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.492774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.493228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.493267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.493304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.493788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.493801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.495827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.495869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.497767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.497805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.498067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.498078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.504345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.504434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.504445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.504894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.504945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.505550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.505587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:12.383 [2024-07-15 17:41:23.505918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.505928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.510014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.511803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.513489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.515342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.515870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.515881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.516335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.516376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.518288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.518327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.518587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.518598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.524058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.525894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.526748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.527122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.527387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.527397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.528723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.528765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.529818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.529859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:12.383 [2024-07-15 17:41:23.530123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.530140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.534416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.535318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.536838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.538805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.539324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.539334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.539792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.539846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.539882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.541504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.541775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.541786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.545422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.546061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.547503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.549397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.549744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.549754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.549881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.551145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.551183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.552983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:12.383 [2024-07-15 17:41:23.553357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.553367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.555865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.555911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.555948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.556219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.556230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.556325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.557833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.557886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.559718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.559981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.559991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.564961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.566607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.568216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.570060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.570399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.570409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.570520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.572306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.572347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.572722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.573207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:12.383 [2024-07-15 17:41:23.573219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.577592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.578191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.579683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.580842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.383 [2024-07-15 17:41:23.581106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.581116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.581224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.582331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.582369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.583583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.583855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.583866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.589357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.591105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.591480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.591856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.592137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.592147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.592262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.593768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.593807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.594450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.594721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:12.384 [2024-07-15 17:41:23.594731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.599749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.600517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.602036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.603553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.604038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.604050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.604172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.604557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.604595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.606100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.606362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.606373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.610421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.612190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.614131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.615052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.615360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.615370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.615491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.616657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.617036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.617765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.618070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:12.384 [2024-07-15 17:41:23.618080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.622656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.623391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.624902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.626425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.626728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.626738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.628328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.629326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.629701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.630601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.630937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.630947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.635417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.636294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.637800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.639180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.639477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.639487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.641061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.641913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.642287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.643359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.643673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:12.384 [2024-07-15 17:41:23.643683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.647988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.649441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.650600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.651750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.652092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.652103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.653688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.654338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.654719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.656000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.656306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.656317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.660346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.661668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.663173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.664124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.664387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.664398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.666116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.666528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.666910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.668444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.668769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:12.384 [2024-07-15 17:41:23.668779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.672819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.673933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.674896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.676813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.677094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.384 [2024-07-15 17:41:23.677104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.677627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.678010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.679501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.681119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.681487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.681496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.686483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.686869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.688799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.689172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.689659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.689669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.690126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.690502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.690882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.691254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.691674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:12.647 [2024-07-15 17:41:23.691685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.693827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.694208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.694584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.694962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.695436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.695446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.695907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.696283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.696656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.697037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.697435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.697447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.701704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.702223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.702597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.704036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.704365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.704375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.705353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.707287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.707325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.709297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.647 [2024-07-15 17:41:23.709727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:12.647 [2024-07-15 17:41:23.709738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:12.916 [2024-07-15 17:41:23.980774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:12.916 [2024-07-15 17:41:23.980785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.981945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.981990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.982422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.982461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.982739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.982756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.982870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.984198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.984237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.984276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.984549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.984559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.985564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.985947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.985986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.987660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.987969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.987979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.988717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.988759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.990265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.990303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.990565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:12.916 [2024-07-15 17:41:23.990575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.992209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.994030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.995950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.996788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.997144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.997154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.998712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.998756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.999127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.999165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.999531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:23.999542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.002219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.003799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.003838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.003875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.004277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.004287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.004744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.004785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.006669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.006707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.007057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:12.916 [2024-07-15 17:41:24.007067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.008250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.008333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.008343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.009077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.009118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.010623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.010661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.011003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.011014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.011985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.012583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.012962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.014683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.015038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.015048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.016008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.016050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.017558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.017603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.017868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.017882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.020329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.021124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:12.916 [2024-07-15 17:41:24.022729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.023216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.023604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.023614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.025382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.916 [2024-07-15 17:41:24.025423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.025459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.025834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.026098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.026108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.028454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.028839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.029223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.029596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.030136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.030147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.030293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.030667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.030718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.031091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.031483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.031492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.033431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.033819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:12.917 [2024-07-15 17:41:24.034193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.034577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.034843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.034858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.034964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.036870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.036908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.037943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.038246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.038256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.039684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.039732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.039768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.040021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.040030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.040136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.041640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.041678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.043001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.043300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.043309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.044312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.044687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.045657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:12.917 [2024-07-15 17:41:24.047170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.047468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.047478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.047607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.049303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.049341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.051030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.051375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.051384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.053826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.054815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.056708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.057085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.057480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.057490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.057608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.058958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.058997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.059531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.059798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.059818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.062194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.062888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.064655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:12.917 [2024-07-15 17:41:24.066339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.066634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.066644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.066772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.067178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.067216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.067588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.067859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.067870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.069718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.071341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.072936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.074785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.075147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.075157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.075274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.075650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.076878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.078489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.078859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.078869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.081517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.083255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.085160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:12.917 [2024-07-15 17:41:24.085534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.086001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.086011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.087690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.089308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.091161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.092147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.092448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.917 [2024-07-15 17:41:24.092458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.093626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.094191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.095806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.097389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.097652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.097663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.098862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.100342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.101499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.101888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.102308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.102319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.104448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.105724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.106807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:12.918 [2024-07-15 17:41:24.107181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.107580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.107589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.108924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.110595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.112030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.113303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.113660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.113670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.116801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.118759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.119883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.121155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.121417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.121427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.121873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.122426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.123931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.125881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.126281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.126290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.127538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.127997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.129622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:12.918 [2024-07-15 17:41:24.131487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.131856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.131866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.133204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.134760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.135137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.135623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.135889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.135899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.138168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.139832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.140205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.140585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.140851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.140862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.142636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.143550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.144830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.146421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.146878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.146889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.151011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.152722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.153097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:12.918 [2024-07-15 17:41:24.153469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.153739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.153750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.155497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.156387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.157641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.159294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.159752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.159762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.162122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.162967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.164484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.166355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.166726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.166736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.167210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.169039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.169078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.170658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.171071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.171081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.172245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.172625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.174530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:12.918 [2024-07-15 17:41:24.175980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.176308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.176318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.176447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.177816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.177855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.179465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.179921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.179931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.184247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.185516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.186776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.918 [2024-07-15 17:41:24.187161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.187579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.187589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.187707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.188880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.188919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.190377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.190643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.190654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.192075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.192952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.194564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:12.919 [2024-07-15 17:41:24.196420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.196755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.196765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.196912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.198770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.198808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.200753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.201015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.201025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.203566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.205071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.205858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.207717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.207980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.207991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.208101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.208474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.208512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.208901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.209166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:12.919 [2024-07-15 17:41:24.209176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.182 [2024-07-15 17:41:24.211576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.182 [2024-07-15 17:41:24.212225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.182 [2024-07-15 17:41:24.212600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:13.182 [2024-07-15 17:41:24.214213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:13.186 [... the same *ERROR*: Failed to get src_mbufs! message from accel_dpdk_cryptodev.c:468 (accel_dpdk_cryptodev_task_alloc_resources) repeats several hundred more times, with timestamps running from 17:41:24.214 through 17:41:24.424 ...]
00:29:13.186 [2024-07-15 17:41:24.424827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:13.186 [2024-07-15 17:41:24.424864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.425327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.425336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.426185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.426226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.427836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.427873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.428202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.428212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.430652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.430779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.430789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.432708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.432751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.433927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.433965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.434429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.434442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.435412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.437032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.438885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.440096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.440441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.440451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:13.186 [2024-07-15 17:41:24.442156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.442197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.442234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.444052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.444493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.444503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.447104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.448965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.449968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.451574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.451856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.451867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.451982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.453453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.453491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.453865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.454338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.454349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.456901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.457762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.459219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.460680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.461136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.461149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:13.186 [2024-07-15 17:41:24.461280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.461851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.461889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.463371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.463635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.463645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.464617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.465587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.465966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.466988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.467296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.467307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.467415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.468823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.468862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.470655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.471078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.471088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.472271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.472313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.472349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.472553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.472563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.472648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:13.186 [2024-07-15 17:41:24.473918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.473957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.475239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.475502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.186 [2024-07-15 17:41:24.475513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.476535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.476922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.477972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.479228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.479490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.479500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.479603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.481238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.481277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.482653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.483006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.483016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.485269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.487053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.488298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.489551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.489818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.489829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.489935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:13.449 [2024-07-15 17:41:24.490321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.490358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.491288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.491593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.491603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.493669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.494859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.495232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.495996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.496296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.496306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.496430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.498127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.499675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.500939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.501263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.501273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.504858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.506659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.507554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.508830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.509093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.509104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.509544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:13.449 [2024-07-15 17:41:24.509924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.511694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.513428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.513781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.513791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.514950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.515329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.517304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.518837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.519216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.519226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.520856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.522807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.523181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.523555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.523822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.523833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.525777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.527277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.529165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.529538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.529985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.529995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.531889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:13.449 [2024-07-15 17:41:24.533282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.534180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.535681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.449 [2024-07-15 17:41:24.535946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.535956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.538092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.539610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.540661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.542473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.542813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.542823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.543407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.543788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.545382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.546991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.547333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.547343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.550090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.552008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.552382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.552758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.553060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.553070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.554721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:13.450 [2024-07-15 17:41:24.556570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.557224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.558836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.559099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.559109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.560419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.562120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.563630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.564283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.564544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.564555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.566579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.566960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.567333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.569297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.569558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.569568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.571433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.571817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.572853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.574464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.574729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.574739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.575805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:13.450 [2024-07-15 17:41:24.577752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.579680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.581565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.582022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.582033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.584445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.586272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.588122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.589154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.589463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.589474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.591371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.593214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.593253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.594095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.594531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.594542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.597104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.598944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.600199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.602140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.602411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.602421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.602538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:13.450 [2024-07-15 17:41:24.604430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.604470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.606179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.606627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.606637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.609298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.610912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.611775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.613382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.613663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.613673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.613783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.615170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.615209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.615589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.616036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.450 [2024-07-15 17:41:24.616046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.617728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.619000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.620578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.620956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.621479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.621492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.621609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:13.451 [2024-07-15 17:41:24.623311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.623352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.625319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.625741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.625752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.626983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.627365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.629174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.630823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.631231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.631242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.631360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.632639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.632678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.633988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.634500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.634513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.637232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.638056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.639650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.641264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.641531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.641542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.641653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:13.451 [2024-07-15 17:41:24.642034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.642504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.642543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.642824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.642835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.645002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.646615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.647371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.647749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.648187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.648197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.649891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.649933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.651121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.651160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.651420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.651431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.652844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.653768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.653807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.654768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.655222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.655232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.656796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:13.451 [2024-07-15 17:41:24.656837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.657220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.657258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.657689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.657702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.658725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.659105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.659144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.659516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.659958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.659969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.661879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.661920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.662292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.662330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.662676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.662686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.663578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.664393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.664434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.665498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.665983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.665994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:13.451 [2024-07-15 17:41:24.666449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:13.451 [2024-07-15 17:41:24.666489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:13.451 [... the same src_mbufs allocation error repeats several hundred times between 17:41:24.666 and 17:41:24.830; duplicate log lines omitted ...]
00:29:13.717 [2024-07-15 17:41:24.832083] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
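The repeated errors above come from accel_dpdk_cryptodev_task_alloc_resources (accel_dpdk_cryptodev.c:468/476) failing to obtain source/destination mbufs for crypto operations while the queued 64 KiB verify workload is in flight; the verify jobs in the summary below still complete with zero failures. A minimal sketch of how this stage could be re-run locally, assuming the same bdev.json used by the write_zeroes stage later in this log; the -t 5 run time is an assumption (the summary only shows runtimes of roughly 5.6-5.9 s):

    # Hypothetical local re-run of the big-IO verify stage (a sketch, not the
    # exact autotest invocation). --json, -q, -o and -w mirror the job
    # parameters visible in the summary below; -t 5 is assumed.
    cd /var/jenkins/workspace/crypto-phy-autotest/spdk
    ./build/examples/bdevperf \
        --json test/bdev/bdev.json \
        -q 128 -o 65536 -w verify -t 5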
00:29:14.288
00:29:14.288 Latency(us)
00:29:14.288 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:29:14.288 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:29:14.288 Verification LBA range: start 0x0 length 0x100
00:29:14.288 crypto_ram : 5.74 44.59 2.79 0.00 0.00 2780260.82 65737.65 2503676.85
00:29:14.288 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:29:14.288 Verification LBA range: start 0x100 length 0x100
00:29:14.288 crypto_ram : 5.90 43.42 2.71 0.00 0.00 2865909.76 55655.19 2865032.27
00:29:14.288 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:29:14.288 Verification LBA range: start 0x0 length 0x100
00:29:14.288 crypto_ram2 : 5.74 44.59 2.79 0.00 0.00 2674387.10 65334.35 2503676.85
00:29:14.288 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:29:14.288 Verification LBA range: start 0x100 length 0x100
00:29:14.288 crypto_ram2 : 5.90 43.41 2.71 0.00 0.00 2748074.14 55251.89 2865032.27
00:29:14.288 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:29:14.288 Verification LBA range: start 0x0 length 0x100
00:29:14.288 crypto_ram3 : 5.60 299.69 18.73 0.00 0.00 381297.14 42749.64 516222.03
00:29:14.288 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:29:14.288 Verification LBA range: start 0x100 length 0x100
00:29:14.288 crypto_ram3 : 5.64 253.33 15.83 0.00 0.00 445270.57 8973.39 509769.26
00:29:14.288 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:29:14.288 Verification LBA range: start 0x0 length 0x100
00:29:14.288 crypto_ram4 : 5.66 314.01 19.63 0.00 0.00 355108.70 6326.74 519448.42
00:29:14.288 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:29:14.288 Verification LBA range: start 0x100 length 0x100
00:29:14.288 crypto_ram4 : 5.76 273.62 17.10 0.00 0.00 401758.06 3780.92 467826.22
00:29:14.288 ===================================================================================================================
00:29:14.288 Total : 1316.67 82.29 0.00 0.00 718130.98 3780.92 2865032.27
00:29:14.288
00:29:14.288 real 0m8.779s
00:29:14.288 user 0m16.908s
00:29:14.288 sys 0m0.296s
00:29:14.288 17:41:25 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:29:14.288 17:41:25 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:29:14.288 ************************************
00:29:14.288 END TEST bdev_verify_big_io
00:29:14.288 ************************************
00:29:14.548 17:41:25 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:29:14.548 17:41:25 blockdev_crypto_aesni -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:29:14.548 17:41:25 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:29:14.548 17:41:25 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:29:14.548 17:41:25 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:29:14.548 ************************************
00:29:14.548 START TEST bdev_write_zeroes
00:29:14.548 ************************************
00:29:14.548 17:41:25 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:29:14.548 [2024-07-15 17:41:25.692722] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization...
00:29:14.548 [2024-07-15 17:41:25.692777] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2952552 ]
00:29:14.548 [2024-07-15 17:41:25.780753] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:29:14.808 [2024-07-15 17:41:25.857753] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:29:14.808 [2024-07-15 17:41:25.878762] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:29:14.808 [2024-07-15 17:41:25.886786] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:29:14.808 [2024-07-15 17:41:25.894804] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:29:14.808 [2024-07-15 17:41:25.979480] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:29:17.345 [2024-07-15 17:41:28.145448] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:29:17.345 [2024-07-15 17:41:28.145504] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:29:17.345 [2024-07-15 17:41:28.145512] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:29:17.345 [2024-07-15 17:41:28.153464] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:29:17.345 [2024-07-15 17:41:28.153475] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:29:17.345 [2024-07-15 17:41:28.153481] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:29:17.345 [2024-07-15 17:41:28.161484] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:29:17.345 [2024-07-15 17:41:28.161494] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:29:17.345 [2024-07-15 17:41:28.161500] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:29:17.345 [2024-07-15 17:41:28.169504] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:29:17.345 [2024-07-15 17:41:28.169514] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:29:17.345 [2024-07-15 17:41:28.169519] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:29:17.345 Running I/O for 1 seconds...
00:29:18.287
00:29:18.287 Latency(us)
00:29:18.287 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:29:18.287 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:29:18.287 crypto_ram : 1.02 2332.64 9.11 0.00 0.00 54475.07 4663.14 64931.05
00:29:18.287 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:29:18.287 crypto_ram2 : 1.02 2345.86 9.16 0.00 0.00 53956.56 4637.93 60091.47
00:29:18.287 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:29:18.287 crypto_ram3 : 1.02 18050.24 70.51 0.00 0.00 6996.46 2142.52 8922.98
00:29:18.287 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:29:18.287 crypto_ram4 : 1.02 18087.41 70.65 0.00 0.00 6963.98 2117.32 7309.78
00:29:18.287 ===================================================================================================================
00:29:18.287 Total : 40816.15 159.44 0.00 0.00 12414.18 2117.32 64931.05
00:29:18.287
00:29:18.287 real 0m3.874s
00:29:18.287 user 0m3.606s
00:29:18.287 sys 0m0.230s
00:29:18.287 17:41:29 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:29:18.287 17:41:29 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:29:18.287 ************************************
00:29:18.287 END TEST bdev_write_zeroes
00:29:18.287 ************************************
00:29:18.287 17:41:29 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:29:18.287 17:41:29 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:29:18.287 17:41:29 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:29:18.287 17:41:29 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:29:18.287 17:41:29 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:29:18.287 ************************************
00:29:18.287 START TEST bdev_json_nonenclosed
00:29:18.287 ************************************
00:29:18.287 17:41:29 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:29:18.548 [2024-07-15 17:41:29.643562] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization...
00:29:18.548 [2024-07-15 17:41:29.643624] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2953183 ]
00:29:18.548 [2024-07-15 17:41:29.731799] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:29:18.548 [2024-07-15 17:41:29.808776] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:29:18.548 [2024-07-15 17:41:29.808837] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:29:18.548 [2024-07-15 17:41:29.808849] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:29:18.548 [2024-07-15 17:41:29.808856] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:29:18.808 00:29:18.808 real 0m0.285s 00:29:18.808 user 0m0.173s 00:29:18.808 sys 0m0.111s 00:29:18.808 17:41:29 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:29:18.808 17:41:29 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:18.808 17:41:29 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:29:18.808 ************************************ 00:29:18.808 END TEST bdev_json_nonenclosed 00:29:18.808 ************************************ 00:29:18.808 17:41:29 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:29:18.808 17:41:29 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # true 00:29:18.808 17:41:29 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:29:18.808 17:41:29 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:29:18.808 17:41:29 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:18.808 17:41:29 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:29:18.808 ************************************ 00:29:18.808 START TEST bdev_json_nonarray 00:29:18.808 ************************************ 00:29:18.808 17:41:29 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:29:18.808 [2024-07-15 17:41:29.998987] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:29:18.808 [2024-07-15 17:41:29.999032] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2953207 ] 00:29:18.808 [2024-07-15 17:41:30.087878] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:19.069 [2024-07-15 17:41:30.162132] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:19.069 [2024-07-15 17:41:30.162197] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
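The bdev_json_nonenclosed and bdev_json_nonarray checks here feed bdevperf deliberately malformed configs: one fixture is not wrapped in a top-level JSON object, the other has a "subsystems" member that is not an array, and in both cases json_config_prepare_ctx is expected to reject the file and the app to stop with a non-zero code. For contrast, a configuration that passes this check has roughly the shape below; the actual nonenclosed.json and nonarray.json fixtures are not reproduced in this log, so this is only an illustrative sketch:

# A well-formed (if empty) bdevperf JSON config: a top-level object whose
# "subsystems" member is an array of subsystem entries.
cat > /tmp/valid_bdev_config.json <<'EOF'
{
  "subsystems": [
    {
      "subsystem": "bdev",
      "config": []
    }
  ]
}
EOF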
00:29:19.069 [2024-07-15 17:41:30.162209] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:29:19.069 [2024-07-15 17:41:30.162216] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:29:19.069 00:29:19.069 real 0m0.274s 00:29:19.069 user 0m0.167s 00:29:19.069 sys 0m0.105s 00:29:19.069 17:41:30 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:29:19.069 17:41:30 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:19.069 17:41:30 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:29:19.069 ************************************ 00:29:19.069 END TEST bdev_json_nonarray 00:29:19.069 ************************************ 00:29:19.069 17:41:30 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:29:19.069 17:41:30 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # true 00:29:19.069 17:41:30 blockdev_crypto_aesni -- bdev/blockdev.sh@787 -- # [[ crypto_aesni == bdev ]] 00:29:19.069 17:41:30 blockdev_crypto_aesni -- bdev/blockdev.sh@794 -- # [[ crypto_aesni == gpt ]] 00:29:19.069 17:41:30 blockdev_crypto_aesni -- bdev/blockdev.sh@798 -- # [[ crypto_aesni == crypto_sw ]] 00:29:19.069 17:41:30 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:29:19.069 17:41:30 blockdev_crypto_aesni -- bdev/blockdev.sh@811 -- # cleanup 00:29:19.069 17:41:30 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:29:19.069 17:41:30 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:29:19.069 17:41:30 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]] 00:29:19.069 17:41:30 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]] 00:29:19.069 17:41:30 blockdev_crypto_aesni -- bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]] 00:29:19.069 17:41:30 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]] 00:29:19.069 00:29:19.069 real 1m9.066s 00:29:19.069 user 2m49.648s 00:29:19.069 sys 0m6.302s 00:29:19.069 17:41:30 blockdev_crypto_aesni -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:19.069 17:41:30 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:29:19.069 ************************************ 00:29:19.069 END TEST blockdev_crypto_aesni 00:29:19.069 ************************************ 00:29:19.069 17:41:30 -- common/autotest_common.sh@1142 -- # return 0 00:29:19.069 17:41:30 -- spdk/autotest.sh@358 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:29:19.069 17:41:30 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:29:19.069 17:41:30 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:19.069 17:41:30 -- common/autotest_common.sh@10 -- # set +x 00:29:19.069 ************************************ 00:29:19.069 START TEST blockdev_crypto_sw 00:29:19.069 ************************************ 00:29:19.069 17:41:30 blockdev_crypto_sw -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:29:19.329 * Looking for test storage... 
00:29:19.330 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # uname -s 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # test_type=crypto_sw 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # crypto_device= 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # dek= 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # env_ctx= 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == bdev ]] 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == crypto_* ]] 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2953295 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 2953295 00:29:19.330 17:41:30 blockdev_crypto_sw -- common/autotest_common.sh@829 -- # '[' -z 2953295 ']' 00:29:19.330 17:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:29:19.330 17:41:30 blockdev_crypto_sw -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:19.330 17:41:30 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:19.330 17:41:30 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:29:19.330 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:19.330 17:41:30 blockdev_crypto_sw -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:19.330 17:41:30 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:19.330 [2024-07-15 17:41:30.522894] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:29:19.330 [2024-07-15 17:41:30.522946] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2953295 ] 00:29:19.330 [2024-07-15 17:41:30.611549] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:19.590 [2024-07-15 17:41:30.675588] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:20.161 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:20.161 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@862 -- # return 0 00:29:20.161 17:41:31 blockdev_crypto_sw -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:29:20.161 17:41:31 blockdev_crypto_sw -- bdev/blockdev.sh@711 -- # setup_crypto_sw_conf 00:29:20.161 17:41:31 blockdev_crypto_sw -- bdev/blockdev.sh@193 -- # rpc_cmd 00:29:20.161 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:20.161 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:20.421 Malloc0 00:29:20.421 Malloc1 00:29:20.421 true 00:29:20.421 true 00:29:20.421 true 00:29:20.422 [2024-07-15 17:41:31.543416] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:29:20.422 crypto_ram 00:29:20.422 [2024-07-15 17:41:31.551439] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:29:20.422 crypto_ram2 00:29:20.422 [2024-07-15 17:41:31.559462] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:29:20.422 crypto_ram3 00:29:20.422 [ 00:29:20.422 { 00:29:20.422 "name": "Malloc1", 00:29:20.422 "aliases": [ 00:29:20.422 "fc66d312-bc85-49df-ba37-4c618bcbdb9a" 00:29:20.422 ], 00:29:20.422 "product_name": "Malloc disk", 00:29:20.422 "block_size": 4096, 00:29:20.422 "num_blocks": 4096, 00:29:20.422 "uuid": "fc66d312-bc85-49df-ba37-4c618bcbdb9a", 00:29:20.422 "assigned_rate_limits": { 00:29:20.422 "rw_ios_per_sec": 0, 00:29:20.422 "rw_mbytes_per_sec": 0, 00:29:20.422 "r_mbytes_per_sec": 0, 00:29:20.422 "w_mbytes_per_sec": 0 00:29:20.422 }, 00:29:20.422 "claimed": true, 00:29:20.422 "claim_type": "exclusive_write", 00:29:20.422 "zoned": false, 00:29:20.422 "supported_io_types": { 00:29:20.422 "read": true, 00:29:20.422 "write": true, 00:29:20.422 "unmap": true, 00:29:20.422 "flush": true, 00:29:20.422 "reset": true, 00:29:20.422 "nvme_admin": false, 00:29:20.422 "nvme_io": false, 00:29:20.422 "nvme_io_md": false, 00:29:20.422 "write_zeroes": true, 00:29:20.422 "zcopy": true, 00:29:20.422 "get_zone_info": false, 00:29:20.422 "zone_management": false, 00:29:20.422 "zone_append": false, 00:29:20.422 "compare": false, 00:29:20.422 "compare_and_write": false, 00:29:20.422 "abort": true, 00:29:20.422 "seek_hole": false, 00:29:20.422 "seek_data": false, 00:29:20.422 "copy": true, 00:29:20.422 "nvme_iov_md": false 00:29:20.422 }, 00:29:20.422 "memory_domains": [ 00:29:20.422 { 00:29:20.422 "dma_device_id": "system", 00:29:20.422 "dma_device_type": 1 00:29:20.422 }, 00:29:20.422 { 
00:29:20.422 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:20.422 "dma_device_type": 2 00:29:20.422 } 00:29:20.422 ], 00:29:20.422 "driver_specific": {} 00:29:20.422 } 00:29:20.422 ] 00:29:20.422 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:20.422 17:41:31 blockdev_crypto_sw -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:29:20.422 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:20.422 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:20.422 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:20.422 17:41:31 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # cat 00:29:20.422 17:41:31 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:29:20.422 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:20.422 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:20.422 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:20.422 17:41:31 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:29:20.422 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:20.422 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:20.422 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:20.422 17:41:31 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:29:20.422 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:20.422 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:20.422 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:20.422 17:41:31 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:29:20.422 17:41:31 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:29:20.422 17:41:31 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:29:20.422 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:20.422 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:20.422 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:20.422 17:41:31 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:29:20.422 17:41:31 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # jq -r .name 00:29:20.422 17:41:31 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "b2148289-6b20-548a-b917-52836aaf72e4"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "b2148289-6b20-548a-b917-52836aaf72e4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' 
"memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "6df59e30-0345-5c41-b840-d93a6c34586f"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "6df59e30-0345-5c41-b840-d93a6c34586f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:29:20.682 17:41:31 blockdev_crypto_sw -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:29:20.682 17:41:31 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:29:20.682 17:41:31 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:29:20.683 17:41:31 blockdev_crypto_sw -- bdev/blockdev.sh@754 -- # killprocess 2953295 00:29:20.683 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@948 -- # '[' -z 2953295 ']' 00:29:20.683 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@952 -- # kill -0 2953295 00:29:20.683 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # uname 00:29:20.683 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:20.683 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2953295 00:29:20.683 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:20.683 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:20.683 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2953295' 00:29:20.683 killing process with pid 2953295 00:29:20.683 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@967 -- # kill 2953295 00:29:20.683 17:41:31 blockdev_crypto_sw -- common/autotest_common.sh@972 -- # wait 2953295 00:29:20.943 17:41:32 blockdev_crypto_sw -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:29:20.943 17:41:32 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:29:20.943 
17:41:32 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:29:20.943 17:41:32 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:20.943 17:41:32 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:20.943 ************************************ 00:29:20.943 START TEST bdev_hello_world 00:29:20.943 ************************************ 00:29:20.943 17:41:32 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:29:20.943 [2024-07-15 17:41:32.102835] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:29:20.943 [2024-07-15 17:41:32.102882] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2953599 ] 00:29:20.943 [2024-07-15 17:41:32.192591] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:21.204 [2024-07-15 17:41:32.263693] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:21.204 [2024-07-15 17:41:32.408220] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:29:21.204 [2024-07-15 17:41:32.408273] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:21.204 [2024-07-15 17:41:32.408281] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:21.204 [2024-07-15 17:41:32.416235] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:29:21.204 [2024-07-15 17:41:32.416246] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:21.204 [2024-07-15 17:41:32.416252] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:21.204 [2024-07-15 17:41:32.424256] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:29:21.204 [2024-07-15 17:41:32.424266] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:29:21.204 [2024-07-15 17:41:32.424278] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:21.204 [2024-07-15 17:41:32.461155] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:29:21.204 [2024-07-15 17:41:32.461178] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:29:21.204 [2024-07-15 17:41:32.461188] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:29:21.204 [2024-07-15 17:41:32.462475] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:29:21.204 [2024-07-15 17:41:32.462531] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:29:21.204 [2024-07-15 17:41:32.462539] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:29:21.204 [2024-07-15 17:41:32.462563] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
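The hello_bdev run above opens the crypto_ram vbdev declared in bdev.json, writes a "Hello World!" buffer through the crypto stack, reads it back and prints the string. Stripped of the harness, the invocation is simply the following; the checkout path is an assumption, the flags are the ones shown in the log:

SPDK_DIR=./spdk   # assumption: local SPDK checkout
# hello_bdev opens the bdev named by -b from the JSON config, writes a
# "Hello World!" buffer, reads it back and prints it, as the NOTICE lines show.
"$SPDK_DIR/build/examples/hello_bdev" \
    --json "$SPDK_DIR/test/bdev/bdev.json" \
    -b crypto_ram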
00:29:21.204 00:29:21.204 [2024-07-15 17:41:32.462572] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:29:21.465 00:29:21.465 real 0m0.536s 00:29:21.465 user 0m0.362s 00:29:21.465 sys 0m0.155s 00:29:21.465 17:41:32 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:21.465 17:41:32 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:29:21.465 ************************************ 00:29:21.465 END TEST bdev_hello_world 00:29:21.465 ************************************ 00:29:21.465 17:41:32 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:29:21.465 17:41:32 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:29:21.465 17:41:32 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:29:21.465 17:41:32 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:21.465 17:41:32 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:21.465 ************************************ 00:29:21.465 START TEST bdev_bounds 00:29:21.465 ************************************ 00:29:21.465 17:41:32 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:29:21.465 17:41:32 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2953773 00:29:21.465 17:41:32 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:29:21.465 17:41:32 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2953773' 00:29:21.465 Process bdevio pid: 2953773 00:29:21.465 17:41:32 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:29:21.465 17:41:32 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 2953773 00:29:21.465 17:41:32 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2953773 ']' 00:29:21.465 17:41:32 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:21.465 17:41:32 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:21.465 17:41:32 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:21.465 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:21.465 17:41:32 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:21.465 17:41:32 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:29:21.465 [2024-07-15 17:41:32.720922] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
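bdev_bounds drives the same bdev.json through the bdevio app: bdevio is started in wait-for-tests mode and the boundary I/O tests are then triggered over its RPC socket with tests.py perform_tests, as the entries just below show. A rough standalone equivalent, with the checkout path as an assumption and the harness's waitforlisten step reduced to a comment:

SPDK_DIR=./spdk   # assumption: local SPDK checkout
# Start bdevio with the same flags the harness uses (-w: wait for the RPC
# trigger, -s 0: no pre-reserved memory); the test script waits for the RPC
# socket via waitforlisten before the next step, which is omitted here.
"$SPDK_DIR/test/bdev/bdevio/bdevio" -w -s 0 \
    --json "$SPDK_DIR/test/bdev/bdev.json" &
BDEVIO_PID=$!
# Run the blockdev read/write boundary tests over RPC, then stop bdevio.
"$SPDK_DIR/test/bdev/bdevio/tests.py" perform_tests
kill "$BDEVIO_PID"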
00:29:21.465 [2024-07-15 17:41:32.720977] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2953773 ] 00:29:21.726 [2024-07-15 17:41:32.812327] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:29:21.726 [2024-07-15 17:41:32.891453] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:21.726 [2024-07-15 17:41:32.891596] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:21.726 [2024-07-15 17:41:32.891597] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:21.986 [2024-07-15 17:41:33.030900] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:29:21.986 [2024-07-15 17:41:33.030953] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:21.986 [2024-07-15 17:41:33.030961] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:21.986 [2024-07-15 17:41:33.038919] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:29:21.986 [2024-07-15 17:41:33.038930] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:21.986 [2024-07-15 17:41:33.038935] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:21.986 [2024-07-15 17:41:33.046941] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:29:21.986 [2024-07-15 17:41:33.046950] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:29:21.986 [2024-07-15 17:41:33.046955] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:22.556 17:41:33 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:22.556 17:41:33 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:29:22.556 17:41:33 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:29:22.556 I/O targets: 00:29:22.556 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:29:22.556 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:29:22.556 00:29:22.556 00:29:22.556 CUnit - A unit testing framework for C - Version 2.1-3 00:29:22.556 http://cunit.sourceforge.net/ 00:29:22.556 00:29:22.556 00:29:22.556 Suite: bdevio tests on: crypto_ram3 00:29:22.556 Test: blockdev write read block ...passed 00:29:22.556 Test: blockdev write zeroes read block ...passed 00:29:22.556 Test: blockdev write zeroes read no split ...passed 00:29:22.556 Test: blockdev write zeroes read split ...passed 00:29:22.556 Test: blockdev write zeroes read split partial ...passed 00:29:22.556 Test: blockdev reset ...passed 00:29:22.556 Test: blockdev write read 8 blocks ...passed 00:29:22.556 Test: blockdev write read size > 128k ...passed 00:29:22.556 Test: blockdev write read invalid size ...passed 00:29:22.556 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:29:22.556 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:29:22.556 Test: blockdev write read max offset ...passed 00:29:22.556 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:29:22.556 Test: blockdev writev readv 8 blocks 
...passed 00:29:22.556 Test: blockdev writev readv 30 x 1block ...passed 00:29:22.556 Test: blockdev writev readv block ...passed 00:29:22.556 Test: blockdev writev readv size > 128k ...passed 00:29:22.556 Test: blockdev writev readv size > 128k in two iovs ...passed 00:29:22.556 Test: blockdev comparev and writev ...passed 00:29:22.556 Test: blockdev nvme passthru rw ...passed 00:29:22.556 Test: blockdev nvme passthru vendor specific ...passed 00:29:22.556 Test: blockdev nvme admin passthru ...passed 00:29:22.556 Test: blockdev copy ...passed 00:29:22.556 Suite: bdevio tests on: crypto_ram 00:29:22.556 Test: blockdev write read block ...passed 00:29:22.556 Test: blockdev write zeroes read block ...passed 00:29:22.556 Test: blockdev write zeroes read no split ...passed 00:29:22.556 Test: blockdev write zeroes read split ...passed 00:29:22.556 Test: blockdev write zeroes read split partial ...passed 00:29:22.556 Test: blockdev reset ...passed 00:29:22.556 Test: blockdev write read 8 blocks ...passed 00:29:22.556 Test: blockdev write read size > 128k ...passed 00:29:22.556 Test: blockdev write read invalid size ...passed 00:29:22.556 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:29:22.556 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:29:22.556 Test: blockdev write read max offset ...passed 00:29:22.556 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:29:22.556 Test: blockdev writev readv 8 blocks ...passed 00:29:22.556 Test: blockdev writev readv 30 x 1block ...passed 00:29:22.556 Test: blockdev writev readv block ...passed 00:29:22.556 Test: blockdev writev readv size > 128k ...passed 00:29:22.556 Test: blockdev writev readv size > 128k in two iovs ...passed 00:29:22.556 Test: blockdev comparev and writev ...passed 00:29:22.556 Test: blockdev nvme passthru rw ...passed 00:29:22.556 Test: blockdev nvme passthru vendor specific ...passed 00:29:22.556 Test: blockdev nvme admin passthru ...passed 00:29:22.556 Test: blockdev copy ...passed 00:29:22.556 00:29:22.556 Run Summary: Type Total Ran Passed Failed Inactive 00:29:22.556 suites 2 2 n/a 0 0 00:29:22.556 tests 46 46 46 0 0 00:29:22.556 asserts 260 260 260 0 n/a 00:29:22.556 00:29:22.556 Elapsed time = 0.149 seconds 00:29:22.556 0 00:29:22.556 17:41:33 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2953773 00:29:22.556 17:41:33 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2953773 ']' 00:29:22.556 17:41:33 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2953773 00:29:22.556 17:41:33 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:29:22.556 17:41:33 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:22.556 17:41:33 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2953773 00:29:22.556 17:41:33 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:22.556 17:41:33 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:22.556 17:41:33 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2953773' 00:29:22.556 killing process with pid 2953773 00:29:22.556 17:41:33 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2953773 00:29:22.556 17:41:33 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@972 
-- # wait 2953773 00:29:22.839 17:41:33 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:29:22.839 00:29:22.839 real 0m1.273s 00:29:22.839 user 0m3.448s 00:29:22.839 sys 0m0.268s 00:29:22.839 17:41:33 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:22.839 17:41:33 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:29:22.839 ************************************ 00:29:22.839 END TEST bdev_bounds 00:29:22.839 ************************************ 00:29:22.839 17:41:33 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:29:22.839 17:41:33 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:29:22.839 17:41:33 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:29:22.839 17:41:33 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:22.839 17:41:33 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:22.839 ************************************ 00:29:22.839 START TEST bdev_nbd 00:29:22.839 ************************************ 00:29:22.840 17:41:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:29:22.840 17:41:34 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:29:22.840 17:41:34 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:29:22.840 17:41:34 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:22.840 17:41:34 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:29:22.840 17:41:34 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:29:22.840 17:41:34 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:29:22.840 17:41:34 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=2 00:29:22.840 17:41:34 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:29:22.840 17:41:34 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:29:22.840 17:41:34 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:29:22.840 17:41:34 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=2 00:29:22.840 17:41:34 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:22.840 17:41:34 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:29:22.840 17:41:34 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:29:22.840 17:41:34 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:29:22.840 17:41:34 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2953962 00:29:22.840 17:41:34 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:29:22.840 17:41:34 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 
2953962 /var/tmp/spdk-nbd.sock 00:29:22.840 17:41:34 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:29:22.840 17:41:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2953962 ']' 00:29:22.840 17:41:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:29:22.840 17:41:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:22.840 17:41:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:29:22.840 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:29:22.840 17:41:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:22.840 17:41:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:29:22.840 [2024-07-15 17:41:34.082574] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:29:22.840 [2024-07-15 17:41:34.082639] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:29:23.101 [2024-07-15 17:41:34.174804] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:23.101 [2024-07-15 17:41:34.242208] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:23.101 [2024-07-15 17:41:34.382705] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:29:23.101 [2024-07-15 17:41:34.382760] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:23.101 [2024-07-15 17:41:34.382768] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:23.101 [2024-07-15 17:41:34.390726] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:29:23.101 [2024-07-15 17:41:34.390739] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:23.101 [2024-07-15 17:41:34.390744] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:23.101 [2024-07-15 17:41:34.398747] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:29:23.101 [2024-07-15 17:41:34.398757] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:29:23.101 [2024-07-15 17:41:34.398763] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:24.075 17:41:35 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:24.075 17:41:35 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:29:24.075 17:41:35 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:29:24.075 17:41:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:24.075 17:41:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:29:24.075 17:41:35 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:29:24.075 17:41:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:29:24.075 17:41:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:24.075 17:41:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:29:24.075 17:41:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:29:24.075 17:41:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:29:24.075 17:41:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:29:24.075 17:41:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:29:24.075 17:41:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:29:24.075 17:41:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:29:24.645 17:41:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:29:24.645 17:41:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:29:24.645 17:41:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:29:24.645 17:41:35 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:29:24.645 17:41:35 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:29:24.645 17:41:35 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:24.645 17:41:35 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:24.645 17:41:35 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:29:24.645 17:41:35 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:29:24.645 17:41:35 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:24.645 17:41:35 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:24.645 17:41:35 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:24.645 1+0 records in 00:29:24.645 1+0 records out 00:29:24.645 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00025444 s, 16.1 MB/s 00:29:24.645 17:41:35 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:24.645 17:41:35 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:29:24.645 17:41:35 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:24.645 17:41:35 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:24.645 17:41:35 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:29:24.645 17:41:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:29:24.645 17:41:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:29:24.645 17:41:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:29:24.905 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:29:24.905 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:29:24.905 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:29:24.905 17:41:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:29:24.905 17:41:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:29:24.905 17:41:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:24.905 17:41:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:24.905 17:41:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:29:24.905 17:41:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:29:24.905 17:41:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:24.905 17:41:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:24.905 17:41:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:24.905 1+0 records in 00:29:24.905 1+0 records out 00:29:24.905 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000279817 s, 14.6 MB/s 00:29:24.905 17:41:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:24.905 17:41:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:29:24.905 17:41:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:24.905 17:41:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:24.905 17:41:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:29:24.905 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:29:24.905 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:29:24.905 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:25.166 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:29:25.166 { 00:29:25.166 "nbd_device": "/dev/nbd0", 00:29:25.166 "bdev_name": "crypto_ram" 00:29:25.166 }, 00:29:25.166 { 00:29:25.166 "nbd_device": "/dev/nbd1", 00:29:25.166 "bdev_name": "crypto_ram3" 00:29:25.166 } 00:29:25.166 ]' 00:29:25.166 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:29:25.166 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:29:25.166 { 00:29:25.166 "nbd_device": "/dev/nbd0", 00:29:25.166 "bdev_name": "crypto_ram" 00:29:25.166 }, 00:29:25.166 { 00:29:25.166 "nbd_device": "/dev/nbd1", 00:29:25.166 "bdev_name": "crypto_ram3" 00:29:25.166 } 00:29:25.166 ]' 00:29:25.166 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:29:25.166 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 
/dev/nbd1' 00:29:25.166 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:25.166 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:25.166 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:25.166 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:29:25.166 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:25.166 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:29:25.427 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:25.427 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:25.427 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:25.427 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:25.427 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:25.427 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:25.427 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:25.427 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:25.427 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:25.427 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:29:25.427 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:29:25.427 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:29:25.427 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:29:25.427 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:25.427 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:25.427 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:29:25.427 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:25.427 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:25.427 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:29:25.427 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:25.427 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@65 -- # echo '' 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:25.688 17:41:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:29:25.948 /dev/nbd0 00:29:25.948 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:29:25.948 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:29:25.948 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:29:25.948 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:29:25.948 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:25.948 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:25.948 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:29:25.948 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:29:25.948 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:25.948 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:25.948 17:41:37 
blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:25.948 1+0 records in 00:29:25.948 1+0 records out 00:29:25.948 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000247979 s, 16.5 MB/s 00:29:25.948 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:25.948 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:29:25.948 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:25.948 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:25.948 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:29:25.948 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:25.948 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:25.948 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:29:26.208 /dev/nbd1 00:29:26.208 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:29:26.208 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:29:26.208 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:29:26.208 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:29:26.208 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:26.208 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:26.208 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:29:26.208 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:29:26.208 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:26.208 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:26.208 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:26.208 1+0 records in 00:29:26.208 1+0 records out 00:29:26.208 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274203 s, 14.9 MB/s 00:29:26.208 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:26.208 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:29:26.208 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:26.208 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:26.208 17:41:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:29:26.208 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:26.208 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:26.208 17:41:37 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:29:26.208 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:26.208 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:29:26.468 { 00:29:26.468 "nbd_device": "/dev/nbd0", 00:29:26.468 "bdev_name": "crypto_ram" 00:29:26.468 }, 00:29:26.468 { 00:29:26.468 "nbd_device": "/dev/nbd1", 00:29:26.468 "bdev_name": "crypto_ram3" 00:29:26.468 } 00:29:26.468 ]' 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:29:26.468 { 00:29:26.468 "nbd_device": "/dev/nbd0", 00:29:26.468 "bdev_name": "crypto_ram" 00:29:26.468 }, 00:29:26.468 { 00:29:26.468 "nbd_device": "/dev/nbd1", 00:29:26.468 "bdev_name": "crypto_ram3" 00:29:26.468 } 00:29:26.468 ]' 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:29:26.468 /dev/nbd1' 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:29:26.468 /dev/nbd1' 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:29:26.468 256+0 records in 00:29:26.468 256+0 records out 00:29:26.468 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0120655 s, 86.9 MB/s 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:29:26.468 256+0 records in 00:29:26.468 256+0 records out 00:29:26.468 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0172292 s, 60.9 MB/s 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:29:26.468 256+0 records in 00:29:26.468 256+0 records out 00:29:26.468 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0306535 s, 34.2 MB/s 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:26.468 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:29:26.728 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:26.728 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:26.728 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:26.728 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:26.728 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:26.728 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:26.728 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:26.728 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:26.728 17:41:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:26.728 17:41:37 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:29:26.989 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:29:26.989 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:29:26.989 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:29:26.989 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:26.989 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:26.989 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:29:26.989 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:26.989 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:26.989 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:29:26.989 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:26.989 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:27.249 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:29:27.249 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:29:27.249 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:29:27.249 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:29:27.249 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:29:27.249 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:29:27.249 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:29:27.249 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:29:27.249 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:29:27.249 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:29:27.249 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:29:27.249 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:29:27.249 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:29:27.249 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:27.249 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:27.249 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:29:27.249 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:29:27.249 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:29:27.508 malloc_lvol_verify 00:29:27.508 17:41:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:29:28.079 
3bc61838-aa92-4698-a77d-89479d31d52a 00:29:28.079 17:41:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:29:28.079 741ed7b2-fddc-4f6b-a926-b3820e5a288b 00:29:28.079 17:41:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:29:28.340 /dev/nbd0 00:29:28.340 17:41:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:29:28.340 mke2fs 1.46.5 (30-Dec-2021) 00:29:28.340 Discarding device blocks: 0/4096 done 00:29:28.340 Creating filesystem with 4096 1k blocks and 1024 inodes 00:29:28.340 00:29:28.340 Allocating group tables: 0/1 done 00:29:28.340 Writing inode tables: 0/1 done 00:29:28.340 Creating journal (1024 blocks): done 00:29:28.340 Writing superblocks and filesystem accounting information: 0/1 done 00:29:28.340 00:29:28.340 17:41:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:29:28.340 17:41:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:29:28.340 17:41:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:28.340 17:41:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:29:28.340 17:41:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:28.340 17:41:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:29:28.340 17:41:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:28.340 17:41:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:29:28.601 17:41:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:28.601 17:41:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:28.601 17:41:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:28.601 17:41:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:28.601 17:41:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:28.601 17:41:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:28.601 17:41:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:28.601 17:41:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:28.601 17:41:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:29:28.601 17:41:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:29:28.601 17:41:39 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2953962 00:29:28.601 17:41:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2953962 ']' 00:29:28.601 17:41:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 2953962 00:29:28.601 17:41:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:29:28.601 17:41:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:28.601 17:41:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2953962 
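The nbd_with_lvol_verify steps traced above boil down to a short RPC sequence against the spdk-nbd target. As a condensed sketch (socket and script paths exactly as in the trace; the test's retry and cleanup logic is omitted):

RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
$RPC bdev_malloc_create -b malloc_lvol_verify 16 512   # malloc base bdev: size 16, 512-byte blocks, as in the trace
$RPC bdev_lvol_create_lvstore malloc_lvol_verify lvs   # logical volume store on top of the malloc bdev
$RPC bdev_lvol_create lvol 4 -l lvs                    # small lvol inside lvstore "lvs"
$RPC nbd_start_disk lvs/lvol /dev/nbd0                 # expose the lvol as a kernel block device
mkfs.ext4 /dev/nbd0                                    # prove I/O works end to end
$RPC nbd_stop_disk /dev/nbd0                           # detach the NBD device again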
00:29:28.601 17:41:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:28.601 17:41:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:28.601 17:41:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2953962' 00:29:28.601 killing process with pid 2953962 00:29:28.601 17:41:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2953962 00:29:28.601 17:41:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@972 -- # wait 2953962 00:29:28.862 17:41:39 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:29:28.862 00:29:28.862 real 0m5.972s 00:29:28.862 user 0m9.425s 00:29:28.862 sys 0m1.550s 00:29:28.862 17:41:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:28.862 17:41:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:29:28.862 ************************************ 00:29:28.862 END TEST bdev_nbd 00:29:28.862 ************************************ 00:29:28.862 17:41:40 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:29:28.862 17:41:40 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:29:28.862 17:41:40 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = nvme ']' 00:29:28.862 17:41:40 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = gpt ']' 00:29:28.862 17:41:40 blockdev_crypto_sw -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:29:28.862 17:41:40 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:29:28.862 17:41:40 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:28.862 17:41:40 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:28.862 ************************************ 00:29:28.862 START TEST bdev_fio 00:29:28.862 ************************************ 00:29:28.862 17:41:40 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:29:28.862 17:41:40 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:29:28.862 17:41:40 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:29:28.862 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:28.862 17:41:40 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:29:28.862 17:41:40 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:29:28.862 17:41:40 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:29:28.862 17:41:40 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:29:28.862 17:41:40 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:29:28.862 17:41:40 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:28.862 17:41:40 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:29:28.862 17:41:40 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:29:28.862 17:41:40 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local 
env_context= 00:29:28.862 17:41:40 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:29:28.862 17:41:40 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:29:28.862 17:41:40 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:29:28.862 17:41:40 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:29:28.862 17:41:40 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:28.862 17:41:40 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:29:28.862 17:41:40 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:29:28.862 17:41:40 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:29:28.862 17:41:40 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:29:28.862 17:41:40 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:29:28.863 17:41:40 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:29:28.863 17:41:40 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:29:28.863 17:41:40 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:29:28.863 17:41:40 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:29:28.863 17:41:40 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:29:28.863 17:41:40 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:29:28.863 17:41:40 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:29:28.863 17:41:40 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:29:28.863 17:41:40 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:29:28.863 17:41:40 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:28.863 17:41:40 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:29:28.863 17:41:40 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:28.863 17:41:40 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:29:29.124 ************************************ 00:29:29.124 START TEST bdev_fio_rw_verify 00:29:29.124 ************************************ 00:29:29.124 17:41:40 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:29.124 17:41:40 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:29.124 17:41:40 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:29:29.124 17:41:40 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:29:29.124 17:41:40 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:29:29.124 17:41:40 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:29.124 17:41:40 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:29:29.124 17:41:40 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:29:29.124 17:41:40 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:29:29.124 17:41:40 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:29.124 17:41:40 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:29:29.124 17:41:40 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:29:29.124 17:41:40 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:29:29.124 17:41:40 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:29:29.124 17:41:40 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:29:29.124 17:41:40 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:29.124 17:41:40 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:29:29.124 17:41:40 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:29:29.124 17:41:40 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:29:29.124 17:41:40 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:29:29.124 17:41:40 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:29:29.124 17:41:40 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:29.385 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:29.385 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:29.385 fio-3.35 00:29:29.385 Starting 2 threads 00:29:41.656 00:29:41.657 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=2955461: Mon Jul 15 17:41:51 2024 00:29:41.657 read: IOPS=9314, BW=36.4MiB/s (38.2MB/s)(364MiB/10001msec) 00:29:41.657 slat (usec): min=36, max=742, avg=44.60, stdev=10.86 00:29:41.657 clat (usec): min=18, max=1041, avg=335.69, stdev=131.25 00:29:41.657 lat (usec): min=56, max=1361, avg=380.30, stdev=134.15 00:29:41.657 clat percentiles (usec): 00:29:41.657 | 50.000th=[ 326], 99.000th=[ 627], 99.900th=[ 701], 99.990th=[ 799], 00:29:41.657 | 99.999th=[ 1045] 00:29:41.657 write: IOPS=11.2k, BW=43.7MiB/s (45.8MB/s)(415MiB/9502msec); 0 zone resets 00:29:41.657 slat (usec): min=35, max=2480, avg=79.37, stdev=14.51 00:29:41.657 clat (usec): min=66, max=3162, avg=460.30, stdev=209.64 00:29:41.657 lat (usec): min=135, max=3233, avg=539.68, stdev=213.42 00:29:41.657 clat percentiles (usec): 00:29:41.657 | 50.000th=[ 449], 99.000th=[ 906], 99.900th=[ 979], 99.990th=[ 1057], 00:29:41.657 | 99.999th=[ 3064] 00:29:41.657 bw ( KiB/s): min=36760, max=48568, per=95.55%, avg=42722.53, stdev=1774.42, samples=38 00:29:41.657 iops : min= 9190, max=12142, avg=10680.63, stdev=443.60, samples=38 00:29:41.657 lat (usec) : 20=0.01%, 50=0.01%, 100=0.02%, 250=22.44%, 500=48.97% 00:29:41.657 lat (usec) : 750=22.58%, 1000=5.96% 00:29:41.657 lat (msec) : 2=0.02%, 4=0.01% 00:29:41.657 cpu : usr=99.34%, sys=0.01%, ctx=36, majf=0, minf=451 00:29:41.657 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:29:41.657 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:41.657 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:41.657 issued rwts: total=93153,106212,0,0 short=0,0,0,0 dropped=0,0,0,0 00:29:41.657 latency : target=0, window=0, percentile=100.00%, depth=8 00:29:41.657 00:29:41.657 Run status group 0 (all jobs): 00:29:41.657 READ: bw=36.4MiB/s (38.2MB/s), 36.4MiB/s-36.4MiB/s (38.2MB/s-38.2MB/s), io=364MiB (382MB), run=10001-10001msec 00:29:41.657 WRITE: bw=43.7MiB/s (45.8MB/s), 43.7MiB/s-43.7MiB/s (45.8MB/s-45.8MB/s), io=415MiB (435MB), run=9502-9502msec 00:29:41.657 00:29:41.657 real 0m11.057s 00:29:41.657 user 0m27.087s 00:29:41.657 sys 0m0.333s 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:29:41.657 ************************************ 00:29:41.657 END TEST bdev_fio_rw_verify 00:29:41.657 ************************************ 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio 
-- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "b2148289-6b20-548a-b917-52836aaf72e4"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "b2148289-6b20-548a-b917-52836aaf72e4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "6df59e30-0345-5c41-b840-d93a6c34586f"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "6df59e30-0345-5c41-b840-d93a6c34586f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' 
"claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:29:41.657 crypto_ram3 ]] 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "b2148289-6b20-548a-b917-52836aaf72e4"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "b2148289-6b20-548a-b917-52836aaf72e4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "6df59e30-0345-5c41-b840-d93a6c34586f"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "6df59e30-0345-5c41-b840-d93a6c34586f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' 
"dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:29:41.657 ************************************ 00:29:41.657 START TEST bdev_fio_trim 00:29:41.657 ************************************ 00:29:41.657 17:41:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:41.658 17:41:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:41.658 17:41:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:29:41.658 17:41:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:29:41.658 17:41:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:29:41.658 17:41:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local 
plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:41.658 17:41:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:29:41.658 17:41:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:29:41.658 17:41:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:29:41.658 17:41:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:41.658 17:41:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:29:41.658 17:41:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:29:41.658 17:41:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:29:41.658 17:41:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:29:41.658 17:41:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:29:41.658 17:41:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:41.658 17:41:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:29:41.658 17:41:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:29:41.658 17:41:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:29:41.658 17:41:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:29:41.658 17:41:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:29:41.658 17:41:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:41.658 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:41.658 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:41.658 fio-3.35 00:29:41.658 Starting 2 threads 00:29:51.652 00:29:51.652 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=2957457: Mon Jul 15 17:42:02 2024 00:29:51.652 write: IOPS=56.4k, BW=220MiB/s (231MB/s)(2204MiB/10001msec); 0 zone resets 00:29:51.652 slat (usec): min=10, max=439, avg=15.06, stdev= 4.53 00:29:51.652 clat (usec): min=26, max=1628, avg=117.87, stdev=65.32 00:29:51.652 lat (usec): min=37, max=1656, avg=132.93, stdev=67.73 00:29:51.652 clat percentiles (usec): 00:29:51.652 | 50.000th=[ 94], 99.000th=[ 253], 99.900th=[ 285], 99.990th=[ 474], 00:29:51.652 | 99.999th=[ 635] 00:29:51.652 bw ( KiB/s): min=207336, max=232128, per=99.93%, avg=225537.26, stdev=4867.14, samples=38 00:29:51.652 iops : min=51834, max=58032, avg=56384.32, stdev=1216.79, samples=38 
00:29:51.652 trim: IOPS=56.4k, BW=220MiB/s (231MB/s)(2204MiB/10001msec); 0 zone resets 00:29:51.652 slat (nsec): min=4735, max=97064, avg=6992.37, stdev=2325.07 00:29:51.652 clat (usec): min=20, max=1494, avg=78.76, stdev=24.28 00:29:51.652 lat (usec): min=25, max=1502, avg=85.76, stdev=24.38 00:29:51.652 clat percentiles (usec): 00:29:51.652 | 50.000th=[ 79], 99.000th=[ 135], 99.900th=[ 157], 99.990th=[ 262], 00:29:51.652 | 99.999th=[ 404] 00:29:51.652 bw ( KiB/s): min=207360, max=232128, per=99.93%, avg=225538.95, stdev=4865.95, samples=38 00:29:51.652 iops : min=51840, max=58032, avg=56384.74, stdev=1216.49, samples=38 00:29:51.652 lat (usec) : 50=14.56%, 100=52.01%, 250=32.86%, 500=0.57%, 750=0.01% 00:29:51.652 lat (msec) : 2=0.01% 00:29:51.652 cpu : usr=99.69%, sys=0.01%, ctx=20, majf=0, minf=342 00:29:51.652 IO depths : 1=7.5%, 2=17.5%, 4=60.1%, 8=15.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:29:51.652 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:51.652 complete : 0=0.0%, 4=86.9%, 8=13.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:51.652 issued rwts: total=0,564317,564319,0 short=0,0,0,0 dropped=0,0,0,0 00:29:51.652 latency : target=0, window=0, percentile=100.00%, depth=8 00:29:51.652 00:29:51.652 Run status group 0 (all jobs): 00:29:51.652 WRITE: bw=220MiB/s (231MB/s), 220MiB/s-220MiB/s (231MB/s-231MB/s), io=2204MiB (2311MB), run=10001-10001msec 00:29:51.652 TRIM: bw=220MiB/s (231MB/s), 220MiB/s-220MiB/s (231MB/s-231MB/s), io=2204MiB (2311MB), run=10001-10001msec 00:29:51.652 00:29:51.652 real 0m11.000s 00:29:51.652 user 0m27.619s 00:29:51.652 sys 0m0.320s 00:29:51.652 17:42:02 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:51.652 17:42:02 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:29:51.652 ************************************ 00:29:51.652 END TEST bdev_fio_trim 00:29:51.652 ************************************ 00:29:51.652 17:42:02 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:29:51.653 17:42:02 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:29:51.653 17:42:02 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:51.653 17:42:02 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:29:51.653 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:51.653 17:42:02 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:29:51.653 00:29:51.653 real 0m22.432s 00:29:51.653 user 0m54.911s 00:29:51.653 sys 0m0.839s 00:29:51.653 17:42:02 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:51.653 17:42:02 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:29:51.653 ************************************ 00:29:51.653 END TEST bdev_fio 00:29:51.653 ************************************ 00:29:51.653 17:42:02 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:29:51.653 17:42:02 blockdev_crypto_sw -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:29:51.653 17:42:02 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:29:51.653 17:42:02 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 
']' 00:29:51.653 17:42:02 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:51.653 17:42:02 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:51.653 ************************************ 00:29:51.653 START TEST bdev_verify 00:29:51.653 ************************************ 00:29:51.653 17:42:02 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:29:51.653 [2024-07-15 17:42:02.630996] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:29:51.653 [2024-07-15 17:42:02.631058] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2959129 ] 00:29:51.653 [2024-07-15 17:42:02.724256] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:51.653 [2024-07-15 17:42:02.817185] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:51.653 [2024-07-15 17:42:02.817190] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:51.914 [2024-07-15 17:42:02.981737] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:29:51.914 [2024-07-15 17:42:02.981812] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:51.914 [2024-07-15 17:42:02.981823] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:51.914 [2024-07-15 17:42:02.989757] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:29:51.914 [2024-07-15 17:42:02.989770] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:51.914 [2024-07-15 17:42:02.989776] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:51.914 [2024-07-15 17:42:02.997769] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:29:51.914 [2024-07-15 17:42:02.997781] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:29:51.914 [2024-07-15 17:42:02.997787] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:51.914 Running I/O for 5 seconds... 
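Before the verify results below, note how the bdev_fio suite traced earlier drives fio: the test generates bdev.fio with one [job_<bdev>] section and a filename=<bdev> line per crypto bdev, then launches fio with the SPDK bdev plugin preloaded. Stripped of the harness, the rw_verify invocation is roughly the following sketch (paths and options as they appear in the trace):

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
# bdev.fio carries [job_crypto_ram]/[job_crypto_ram3] sections generated by the test itself.
LD_PRELOAD=$SPDK/build/fio/spdk_bdev /usr/src/fio/fio \
    --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
    --spdk_json_conf=$SPDK/test/bdev/bdev.json --spdk_mem=0 \
    --verify_state_save=0 --aux-path=$SPDK/../output \
    $SPDK/test/bdev/bdev.fio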
00:29:57.200 00:29:57.200 Latency(us) 00:29:57.200 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:57.200 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:29:57.200 Verification LBA range: start 0x0 length 0x800 00:29:57.200 crypto_ram : 5.01 7668.41 29.95 0.00 0.00 16620.80 1190.99 21576.47 00:29:57.200 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:29:57.200 Verification LBA range: start 0x800 length 0x800 00:29:57.200 crypto_ram : 5.01 6410.35 25.04 0.00 0.00 19880.56 1474.56 25710.28 00:29:57.200 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:29:57.200 Verification LBA range: start 0x0 length 0x800 00:29:57.200 crypto_ram3 : 5.02 3850.95 15.04 0.00 0.00 33045.52 1518.67 26214.40 00:29:57.200 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:29:57.200 Verification LBA range: start 0x800 length 0x800 00:29:57.200 crypto_ram3 : 5.03 3232.13 12.63 0.00 0.00 39367.76 1714.02 29037.49 00:29:57.200 =================================================================================================================== 00:29:57.200 Total : 21161.84 82.66 0.00 0.00 24084.25 1190.99 29037.49 00:29:57.200 00:29:57.200 real 0m5.644s 00:29:57.200 user 0m10.725s 00:29:57.200 sys 0m0.196s 00:29:57.200 17:42:08 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:57.200 17:42:08 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:29:57.200 ************************************ 00:29:57.200 END TEST bdev_verify 00:29:57.201 ************************************ 00:29:57.201 17:42:08 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:29:57.201 17:42:08 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:29:57.201 17:42:08 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:29:57.201 17:42:08 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:57.201 17:42:08 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:57.201 ************************************ 00:29:57.201 START TEST bdev_verify_big_io 00:29:57.201 ************************************ 00:29:57.201 17:42:08 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:29:57.201 [2024-07-15 17:42:08.347654] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:29:57.201 [2024-07-15 17:42:08.347700] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2960063 ] 00:29:57.201 [2024-07-15 17:42:08.436602] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:57.461 [2024-07-15 17:42:08.514414] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:57.461 [2024-07-15 17:42:08.514419] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:57.461 [2024-07-15 17:42:08.654370] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:29:57.461 [2024-07-15 17:42:08.654409] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:57.461 [2024-07-15 17:42:08.654417] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:57.461 [2024-07-15 17:42:08.662389] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:29:57.461 [2024-07-15 17:42:08.662399] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:57.461 [2024-07-15 17:42:08.662405] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:57.461 [2024-07-15 17:42:08.670410] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:29:57.461 [2024-07-15 17:42:08.670420] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:29:57.461 [2024-07-15 17:42:08.670425] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:57.461 Running I/O for 5 seconds... 
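Both bdevperf runs in this stretch, bdev_verify above and bdev_verify_big_io whose results follow, reuse the same JSON-driven invocation and differ only in I/O size. In condensed form (arguments exactly as shown in the trace; -q is the queue depth, -o the I/O size in bytes, -w the workload, -t the run time in seconds, -m the core mask):

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
$SPDK/build/examples/bdevperf --json $SPDK/test/bdev/bdev.json \
    -q 128 -o 4096 -w verify -t 5 -C -m 0x3      # bdev_verify: 4 KiB I/Os on cores 0-1
$SPDK/build/examples/bdevperf --json $SPDK/test/bdev/bdev.json \
    -q 128 -o 65536 -w verify -t 5 -C -m 0x3     # bdev_verify_big_io: 64 KiB I/Os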
00:30:02.747 00:30:02.747 Latency(us) 00:30:02.747 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:02.747 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:30:02.747 Verification LBA range: start 0x0 length 0x80 00:30:02.747 crypto_ram : 5.10 476.53 29.78 0.00 0.00 262024.37 3755.72 372647.78 00:30:02.747 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:30:02.747 Verification LBA range: start 0x80 length 0x80 00:30:02.747 crypto_ram : 5.26 462.39 28.90 0.00 0.00 270604.42 4310.25 369421.39 00:30:02.747 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:30:02.747 Verification LBA range: start 0x0 length 0x80 00:30:02.747 crypto_ram3 : 5.27 267.13 16.70 0.00 0.00 450066.50 3629.69 387166.52 00:30:02.747 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:30:02.747 Verification LBA range: start 0x80 length 0x80 00:30:02.747 crypto_ram3 : 5.27 242.86 15.18 0.00 0.00 497728.51 4234.63 380713.75 00:30:02.747 =================================================================================================================== 00:30:02.747 Total : 1448.90 90.56 0.00 0.00 339795.99 3629.69 387166.52 00:30:03.008 00:30:03.008 real 0m5.840s 00:30:03.008 user 0m11.193s 00:30:03.008 sys 0m0.159s 00:30:03.008 17:42:14 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:03.008 17:42:14 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:30:03.008 ************************************ 00:30:03.008 END TEST bdev_verify_big_io 00:30:03.008 ************************************ 00:30:03.008 17:42:14 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:30:03.008 17:42:14 blockdev_crypto_sw -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:30:03.008 17:42:14 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:30:03.008 17:42:14 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:03.008 17:42:14 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:30:03.008 ************************************ 00:30:03.008 START TEST bdev_write_zeroes 00:30:03.008 ************************************ 00:30:03.008 17:42:14 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:30:03.008 [2024-07-15 17:42:14.260718] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:30:03.008 [2024-07-15 17:42:14.260763] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2961455 ] 00:30:03.271 [2024-07-15 17:42:14.347491] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:03.271 [2024-07-15 17:42:14.424802] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:03.271 [2024-07-15 17:42:14.563901] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:30:03.271 [2024-07-15 17:42:14.563946] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:03.271 [2024-07-15 17:42:14.563954] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:03.565 [2024-07-15 17:42:14.571919] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:30:03.565 [2024-07-15 17:42:14.571932] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:03.565 [2024-07-15 17:42:14.571937] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:03.565 [2024-07-15 17:42:14.579940] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:30:03.565 [2024-07-15 17:42:14.579951] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:30:03.565 [2024-07-15 17:42:14.579956] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:03.565 Running I/O for 1 seconds... 00:30:04.507 00:30:04.507 Latency(us) 00:30:04.507 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:04.507 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:30:04.507 crypto_ram : 1.01 32546.71 127.14 0.00 0.00 3924.35 1014.55 5494.94 00:30:04.507 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:30:04.507 crypto_ram3 : 1.01 16245.25 63.46 0.00 0.00 7828.33 4940.41 8116.38 00:30:04.507 =================================================================================================================== 00:30:04.507 Total : 48791.96 190.59 0.00 0.00 5225.68 1014.55 8116.38 00:30:04.507 00:30:04.507 real 0m1.545s 00:30:04.507 user 0m1.380s 00:30:04.507 sys 0m0.147s 00:30:04.507 17:42:15 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:04.507 17:42:15 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:30:04.507 ************************************ 00:30:04.507 END TEST bdev_write_zeroes 00:30:04.507 ************************************ 00:30:04.507 17:42:15 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:30:04.507 17:42:15 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:30:04.507 17:42:15 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:30:04.507 17:42:15 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:04.507 17:42:15 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:30:04.767 
************************************ 00:30:04.767 START TEST bdev_json_nonenclosed 00:30:04.768 ************************************ 00:30:04.768 17:42:15 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:30:04.768 [2024-07-15 17:42:15.914145] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:30:04.768 [2024-07-15 17:42:15.914271] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2961769 ] 00:30:04.768 [2024-07-15 17:42:16.057408] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:05.053 [2024-07-15 17:42:16.132673] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:05.053 [2024-07-15 17:42:16.132733] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:30:05.053 [2024-07-15 17:42:16.132744] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:30:05.053 [2024-07-15 17:42:16.132751] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:30:05.053 00:30:05.053 real 0m0.375s 00:30:05.053 user 0m0.222s 00:30:05.053 sys 0m0.150s 00:30:05.053 17:42:16 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:30:05.053 17:42:16 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:05.053 17:42:16 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:30:05.053 ************************************ 00:30:05.053 END TEST bdev_json_nonenclosed 00:30:05.053 ************************************ 00:30:05.053 17:42:16 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:30:05.053 17:42:16 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # true 00:30:05.053 17:42:16 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:30:05.053 17:42:16 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:30:05.053 17:42:16 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:05.053 17:42:16 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:30:05.053 ************************************ 00:30:05.053 START TEST bdev_json_nonarray 00:30:05.053 ************************************ 00:30:05.053 17:42:16 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:30:05.314 [2024-07-15 17:42:16.362478] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
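Both JSON negative tests feed bdevperf a deliberately malformed configuration: nonenclosed.json trips the "not enclosed in {}" error above, and the nonarray pass starting here expects "'subsystems' should be an array"; in each case the expected failure surfaces as exit code 234, which the trace shows being absorbed (blockdev.sh@782/@785 -- # true). What json_config_prepare_ctx is checking for is the usual SPDK config shape, sketched below as an illustration; this is not the content of the shipped nonenclosed.json or nonarray.json files, which the log does not show:

# A well-formed --json config is a top-level object whose "subsystems" member is an array:
cat <<'EOF' > /tmp/enclosed.json
{ "subsystems": [ { "subsystem": "bdev", "config": [] } ] }
EOF
# Dropping the outer {} or making "subsystems" a non-array presumably reproduces the two
# *ERROR* lines seen in these passes.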
00:30:05.314 [2024-07-15 17:42:16.362605] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2961811 ] 00:30:05.314 [2024-07-15 17:42:16.504869] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:05.314 [2024-07-15 17:42:16.582600] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:05.314 [2024-07-15 17:42:16.582664] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:30:05.314 [2024-07-15 17:42:16.582678] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:30:05.314 [2024-07-15 17:42:16.582685] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:30:05.576 00:30:05.576 real 0m0.379s 00:30:05.576 user 0m0.221s 00:30:05.576 sys 0m0.155s 00:30:05.576 17:42:16 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:30:05.576 17:42:16 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:05.576 17:42:16 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:30:05.576 ************************************ 00:30:05.576 END TEST bdev_json_nonarray 00:30:05.576 ************************************ 00:30:05.576 17:42:16 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:30:05.576 17:42:16 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # true 00:30:05.576 17:42:16 blockdev_crypto_sw -- bdev/blockdev.sh@787 -- # [[ crypto_sw == bdev ]] 00:30:05.576 17:42:16 blockdev_crypto_sw -- bdev/blockdev.sh@794 -- # [[ crypto_sw == gpt ]] 00:30:05.576 17:42:16 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # [[ crypto_sw == crypto_sw ]] 00:30:05.576 17:42:16 blockdev_crypto_sw -- bdev/blockdev.sh@799 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:30:05.576 17:42:16 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:30:05.576 17:42:16 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:05.576 17:42:16 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:30:05.576 ************************************ 00:30:05.576 START TEST bdev_crypto_enomem 00:30:05.576 ************************************ 00:30:05.576 17:42:16 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1123 -- # bdev_crypto_enomem 00:30:05.576 17:42:16 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local base_dev=base0 00:30:05.576 17:42:16 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local test_dev=crypt0 00:30:05.576 17:42:16 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local err_dev=EE_base0 00:30:05.576 17:42:16 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@638 -- # local qd=32 00:30:05.576 17:42:16 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # ERR_PID=2961922 00:30:05.576 17:42:16 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:30:05.576 17:42:16 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@643 -- # waitforlisten 2961922 00:30:05.576 17:42:16 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w 
randwrite -t 5 -f '' 00:30:05.576 17:42:16 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@829 -- # '[' -z 2961922 ']' 00:30:05.576 17:42:16 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:05.576 17:42:16 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:05.576 17:42:16 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:05.576 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:05.576 17:42:16 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:05.576 17:42:16 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:30:05.576 [2024-07-15 17:42:16.789410] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:30:05.576 [2024-07-15 17:42:16.789473] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2961922 ] 00:30:05.838 [2024-07-15 17:42:16.874276] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:05.838 [2024-07-15 17:42:16.974098] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:06.410 17:42:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:06.411 17:42:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@862 -- # return 0 00:30:06.411 17:42:17 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@645 -- # rpc_cmd 00:30:06.411 17:42:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:06.411 17:42:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:30:06.411 true 00:30:06.411 base0 00:30:06.411 true 00:30:06.411 [2024-07-15 17:42:17.687112] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:30:06.411 crypt0 00:30:06.411 17:42:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:06.411 17:42:17 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@652 -- # waitforbdev crypt0 00:30:06.411 17:42:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@897 -- # local bdev_name=crypt0 00:30:06.411 17:42:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:06.411 17:42:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local i 00:30:06.411 17:42:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:06.411 17:42:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:06.411 17:42:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:30:06.411 17:42:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:06.411 17:42:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:30:06.411 17:42:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:06.411 17:42:17 
blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:30:06.411 17:42:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:06.411 17:42:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:30:06.673 [ 00:30:06.673 { 00:30:06.673 "name": "crypt0", 00:30:06.673 "aliases": [ 00:30:06.673 "c8fc6756-222b-5c68-9e49-24db79dfabb5" 00:30:06.673 ], 00:30:06.673 "product_name": "crypto", 00:30:06.673 "block_size": 512, 00:30:06.673 "num_blocks": 2097152, 00:30:06.673 "uuid": "c8fc6756-222b-5c68-9e49-24db79dfabb5", 00:30:06.673 "assigned_rate_limits": { 00:30:06.673 "rw_ios_per_sec": 0, 00:30:06.673 "rw_mbytes_per_sec": 0, 00:30:06.673 "r_mbytes_per_sec": 0, 00:30:06.673 "w_mbytes_per_sec": 0 00:30:06.673 }, 00:30:06.673 "claimed": false, 00:30:06.673 "zoned": false, 00:30:06.673 "supported_io_types": { 00:30:06.673 "read": true, 00:30:06.673 "write": true, 00:30:06.673 "unmap": false, 00:30:06.673 "flush": false, 00:30:06.673 "reset": true, 00:30:06.673 "nvme_admin": false, 00:30:06.673 "nvme_io": false, 00:30:06.673 "nvme_io_md": false, 00:30:06.673 "write_zeroes": true, 00:30:06.673 "zcopy": false, 00:30:06.673 "get_zone_info": false, 00:30:06.673 "zone_management": false, 00:30:06.673 "zone_append": false, 00:30:06.673 "compare": false, 00:30:06.673 "compare_and_write": false, 00:30:06.673 "abort": false, 00:30:06.673 "seek_hole": false, 00:30:06.673 "seek_data": false, 00:30:06.673 "copy": false, 00:30:06.673 "nvme_iov_md": false 00:30:06.673 }, 00:30:06.673 "memory_domains": [ 00:30:06.673 { 00:30:06.673 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:06.673 "dma_device_type": 2 00:30:06.673 } 00:30:06.673 ], 00:30:06.673 "driver_specific": { 00:30:06.673 "crypto": { 00:30:06.673 "base_bdev_name": "EE_base0", 00:30:06.673 "name": "crypt0", 00:30:06.673 "key_name": "test_dek_sw" 00:30:06.673 } 00:30:06.673 } 00:30:06.673 } 00:30:06.673 ] 00:30:06.673 17:42:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:06.673 17:42:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@905 -- # return 0 00:30:06.673 17:42:17 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@655 -- # rpcpid=2962116 00:30:06.673 17:42:17 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # sleep 1 00:30:06.673 17:42:17 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:06.673 Running I/O for 5 seconds... 
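The bdev_get_bdevs dump above confirms the stack for the ENOMEM case: crypt0 (key test_dek_sw) sits on EE_base0, an error-injection bdev wrapped around base0. bdevperf was started with -z, so it idles until the bdevperf.py helper kicks off the queued randwrite job; the script backgrounds that helper and then, while the 5-second run is in flight, injects "nomem" completions underneath the crypto layer so its queue-and-retry path gets exercised. The two driving commands, with flags copied from the trace:

# Start the queued -q 32 -o 4096 -w randwrite job on the -z bdevperf instance...
./examples/bdev/bdevperf/bdevperf.py perform_tests &
# ...and, while it runs, inject nomem completions for write I/O on the error bdev under crypt0.
./scripts/rpc.py bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem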
00:30:07.619 17:42:18 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@658 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:30:07.619 17:42:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:07.619 17:42:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:30:07.619 17:42:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:07.619 17:42:18 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@660 -- # wait 2962116 00:30:11.826 00:30:11.826 Latency(us) 00:30:11.826 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:11.826 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:30:11.826 crypt0 : 5.00 35968.98 140.50 0.00 0.00 885.88 441.11 1411.54 00:30:11.826 =================================================================================================================== 00:30:11.826 Total : 35968.98 140.50 0.00 0.00 885.88 441.11 1411.54 00:30:11.826 0 00:30:11.826 17:42:22 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@662 -- # rpc_cmd bdev_crypto_delete crypt0 00:30:11.826 17:42:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:11.827 17:42:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:30:11.827 17:42:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:11.827 17:42:22 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # killprocess 2961922 00:30:11.827 17:42:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@948 -- # '[' -z 2961922 ']' 00:30:11.827 17:42:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@952 -- # kill -0 2961922 00:30:11.827 17:42:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # uname 00:30:11.827 17:42:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:11.827 17:42:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2961922 00:30:11.827 17:42:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:11.827 17:42:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:11.827 17:42:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2961922' 00:30:11.827 killing process with pid 2961922 00:30:11.827 17:42:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@967 -- # kill 2961922 00:30:11.827 Received shutdown signal, test time was about 5.000000 seconds 00:30:11.827 00:30:11.827 Latency(us) 00:30:11.827 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:11.827 =================================================================================================================== 00:30:11.827 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:11.827 17:42:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@972 -- # wait 2961922 00:30:11.827 17:42:23 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@665 -- # trap - SIGINT SIGTERM EXIT 00:30:11.827 00:30:11.827 real 0m6.358s 00:30:11.827 user 0m6.610s 00:30:11.827 sys 0m0.305s 00:30:11.827 17:42:23 blockdev_crypto_sw.bdev_crypto_enomem -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:30:11.827 17:42:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:30:11.827 ************************************ 00:30:11.827 END TEST bdev_crypto_enomem 00:30:11.827 ************************************ 00:30:11.827 17:42:23 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:30:11.827 17:42:23 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:30:11.827 17:42:23 blockdev_crypto_sw -- bdev/blockdev.sh@811 -- # cleanup 00:30:11.827 17:42:23 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:30:12.087 17:42:23 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:30:12.087 17:42:23 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:30:12.087 17:42:23 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:30:12.087 17:42:23 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:30:12.087 17:42:23 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:30:12.087 00:30:12.087 real 0m52.780s 00:30:12.087 user 1m40.613s 00:30:12.087 sys 0m4.868s 00:30:12.087 17:42:23 blockdev_crypto_sw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:12.087 17:42:23 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:30:12.087 ************************************ 00:30:12.087 END TEST blockdev_crypto_sw 00:30:12.087 ************************************ 00:30:12.087 17:42:23 -- common/autotest_common.sh@1142 -- # return 0 00:30:12.087 17:42:23 -- spdk/autotest.sh@359 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:30:12.087 17:42:23 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:30:12.087 17:42:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:12.087 17:42:23 -- common/autotest_common.sh@10 -- # set +x 00:30:12.087 ************************************ 00:30:12.087 START TEST blockdev_crypto_qat 00:30:12.087 ************************************ 00:30:12.087 17:42:23 blockdev_crypto_qat -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:30:12.087 * Looking for test storage... 
00:30:12.087 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # uname -s 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # test_type=crypto_qat 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # crypto_device= 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # dek= 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # env_ctx= 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == bdev ]] 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == crypto_* ]] 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2963096 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 2963096 00:30:12.087 17:42:23 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:30:12.087 17:42:23 blockdev_crypto_qat -- common/autotest_common.sh@829 -- # '[' -z 2963096 ']' 00:30:12.087 17:42:23 blockdev_crypto_qat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:12.087 17:42:23 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:12.087 17:42:23 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:30:12.087 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:12.087 17:42:23 blockdev_crypto_qat -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:12.087 17:42:23 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:12.349 [2024-07-15 17:42:23.393608] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:30:12.349 [2024-07-15 17:42:23.393682] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2963096 ] 00:30:12.349 [2024-07-15 17:42:23.484993] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:12.349 [2024-07-15 17:42:23.577810] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:13.733 17:42:24 blockdev_crypto_qat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:13.733 17:42:24 blockdev_crypto_qat -- common/autotest_common.sh@862 -- # return 0 00:30:13.733 17:42:24 blockdev_crypto_qat -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:30:13.733 17:42:24 blockdev_crypto_qat -- bdev/blockdev.sh@708 -- # setup_crypto_qat_conf 00:30:13.733 17:42:24 blockdev_crypto_qat -- bdev/blockdev.sh@170 -- # rpc_cmd 00:30:13.733 17:42:24 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:13.733 17:42:24 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:13.733 [2024-07-15 17:42:24.608727] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:30:13.733 [2024-07-15 17:42:24.616754] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:13.733 [2024-07-15 17:42:24.624769] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:13.733 [2024-07-15 17:42:24.690230] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:30:16.280 true 00:30:16.280 true 00:30:16.280 true 00:30:16.280 true 00:30:16.280 Malloc0 00:30:16.280 Malloc1 00:30:16.280 Malloc2 00:30:16.280 Malloc3 00:30:16.280 [2024-07-15 17:42:27.100427] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:30:16.280 crypto_ram 00:30:16.280 [2024-07-15 17:42:27.108449] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:30:16.280 crypto_ram1 00:30:16.280 [2024-07-15 17:42:27.116468] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:30:16.280 crypto_ram2 00:30:16.280 [2024-07-15 17:42:27.124489] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:30:16.280 crypto_ram3 00:30:16.280 [ 00:30:16.280 { 00:30:16.280 "name": "Malloc1", 00:30:16.280 "aliases": [ 00:30:16.280 "d7f6a5c3-2749-456f-8681-ae614e5103bc" 00:30:16.280 ], 00:30:16.280 "product_name": "Malloc disk", 00:30:16.280 "block_size": 512, 00:30:16.280 "num_blocks": 65536, 00:30:16.280 "uuid": "d7f6a5c3-2749-456f-8681-ae614e5103bc", 00:30:16.280 "assigned_rate_limits": { 00:30:16.280 "rw_ios_per_sec": 0, 00:30:16.280 "rw_mbytes_per_sec": 0, 00:30:16.280 "r_mbytes_per_sec": 0, 00:30:16.280 "w_mbytes_per_sec": 0 00:30:16.280 }, 00:30:16.280 "claimed": true, 00:30:16.280 "claim_type": "exclusive_write", 00:30:16.280 "zoned": false, 00:30:16.280 "supported_io_types": { 
00:30:16.280 "read": true, 00:30:16.280 "write": true, 00:30:16.280 "unmap": true, 00:30:16.280 "flush": true, 00:30:16.280 "reset": true, 00:30:16.280 "nvme_admin": false, 00:30:16.280 "nvme_io": false, 00:30:16.280 "nvme_io_md": false, 00:30:16.280 "write_zeroes": true, 00:30:16.280 "zcopy": true, 00:30:16.280 "get_zone_info": false, 00:30:16.280 "zone_management": false, 00:30:16.280 "zone_append": false, 00:30:16.280 "compare": false, 00:30:16.280 "compare_and_write": false, 00:30:16.280 "abort": true, 00:30:16.280 "seek_hole": false, 00:30:16.280 "seek_data": false, 00:30:16.280 "copy": true, 00:30:16.280 "nvme_iov_md": false 00:30:16.280 }, 00:30:16.280 "memory_domains": [ 00:30:16.280 { 00:30:16.280 "dma_device_id": "system", 00:30:16.280 "dma_device_type": 1 00:30:16.280 }, 00:30:16.280 { 00:30:16.280 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:16.280 "dma_device_type": 2 00:30:16.280 } 00:30:16.280 ], 00:30:16.280 "driver_specific": {} 00:30:16.280 } 00:30:16.280 ] 00:30:16.280 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:16.280 17:42:27 blockdev_crypto_qat -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:30:16.280 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:16.280 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:16.280 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:16.280 17:42:27 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # cat 00:30:16.280 17:42:27 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:30:16.280 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:16.280 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:16.280 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:16.280 17:42:27 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:30:16.280 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:16.280 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:16.280 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:16.280 17:42:27 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:30:16.280 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:16.280 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:16.280 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:16.280 17:42:27 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:30:16.280 17:42:27 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:30:16.280 17:42:27 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:30:16.280 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:16.280 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:16.280 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:16.280 17:42:27 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:30:16.280 17:42:27 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # jq -r .name 00:30:16.280 17:42:27 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' 
"aliases": [' ' "f6c90005-3501-525f-9719-1480eb7797e2"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f6c90005-3501-525f-9719-1480eb7797e2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "b26bf13b-5782-5965-b0af-85031dbcb35b"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b26bf13b-5782-5965-b0af-85031dbcb35b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "2c412943-7e80-5fdf-8e56-180ea0338620"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "2c412943-7e80-5fdf-8e56-180ea0338620",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "e65f0c05-44e8-5cf7-a585-2c2d9efeab1b"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "e65f0c05-44e8-5cf7-a585-2c2d9efeab1b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:30:16.280 17:42:27 blockdev_crypto_qat -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:30:16.280 17:42:27 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:30:16.280 17:42:27 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:30:16.280 17:42:27 blockdev_crypto_qat -- bdev/blockdev.sh@754 -- # killprocess 2963096 00:30:16.280 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@948 -- # '[' -z 2963096 ']' 00:30:16.280 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@952 -- # kill -0 2963096 00:30:16.280 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # uname 00:30:16.280 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:16.280 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2963096 00:30:16.280 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:16.280 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:16.280 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2963096' 00:30:16.280 killing process with pid 2963096 00:30:16.280 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@967 -- # kill 2963096 00:30:16.281 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@972 -- # wait 2963096 00:30:16.541 17:42:27 blockdev_crypto_qat -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:30:16.541 17:42:27 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:30:16.541 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:30:16.541 17:42:27 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:16.541 17:42:27 blockdev_crypto_qat -- 
common/autotest_common.sh@10 -- # set +x 00:30:16.541 ************************************ 00:30:16.541 START TEST bdev_hello_world 00:30:16.541 ************************************ 00:30:16.541 17:42:27 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:30:16.541 [2024-07-15 17:42:27.838663] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:30:16.541 [2024-07-15 17:42:27.838718] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2963728 ] 00:30:16.802 [2024-07-15 17:42:27.924552] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:16.802 [2024-07-15 17:42:27.997599] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:16.802 [2024-07-15 17:42:28.018612] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:30:16.802 [2024-07-15 17:42:28.026636] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:16.802 [2024-07-15 17:42:28.034652] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:17.062 [2024-07-15 17:42:28.122771] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:30:19.602 [2024-07-15 17:42:30.279006] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:30:19.602 [2024-07-15 17:42:30.279059] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:19.602 [2024-07-15 17:42:30.279068] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:19.602 [2024-07-15 17:42:30.287024] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:30:19.602 [2024-07-15 17:42:30.287035] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:19.602 [2024-07-15 17:42:30.287041] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:19.602 [2024-07-15 17:42:30.295043] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:30:19.602 [2024-07-15 17:42:30.295054] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:19.602 [2024-07-15 17:42:30.295060] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:19.602 [2024-07-15 17:42:30.303063] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:30:19.602 [2024-07-15 17:42:30.303074] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:19.602 [2024-07-15 17:42:30.303079] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:19.602 [2024-07-15 17:42:30.364432] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:30:19.602 [2024-07-15 17:42:30.364461] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:30:19.602 [2024-07-15 17:42:30.364471] hello_bdev.c: 244:hello_start: 
*NOTICE*: Opening io channel 00:30:19.602 [2024-07-15 17:42:30.365496] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:30:19.602 [2024-07-15 17:42:30.365546] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:30:19.602 [2024-07-15 17:42:30.365555] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:30:19.602 [2024-07-15 17:42:30.365592] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:30:19.602 00:30:19.602 [2024-07-15 17:42:30.365603] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:30:19.602 00:30:19.602 real 0m2.817s 00:30:19.602 user 0m2.550s 00:30:19.602 sys 0m0.229s 00:30:19.602 17:42:30 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:19.602 17:42:30 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:30:19.602 ************************************ 00:30:19.602 END TEST bdev_hello_world 00:30:19.602 ************************************ 00:30:19.602 17:42:30 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:30:19.602 17:42:30 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:30:19.602 17:42:30 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:30:19.602 17:42:30 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:19.602 17:42:30 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:19.602 ************************************ 00:30:19.602 START TEST bdev_bounds 00:30:19.602 ************************************ 00:30:19.602 17:42:30 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:30:19.602 17:42:30 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2964346 00:30:19.602 17:42:30 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:30:19.602 17:42:30 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:30:19.602 17:42:30 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2964346' 00:30:19.602 Process bdevio pid: 2964346 00:30:19.602 17:42:30 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 2964346 00:30:19.602 17:42:30 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2964346 ']' 00:30:19.602 17:42:30 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:19.602 17:42:30 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:19.602 17:42:30 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:19.602 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:19.602 17:42:30 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:19.602 17:42:30 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:30:19.602 [2024-07-15 17:42:30.718223] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
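The hello_world pass above is the stock hello_bdev example pointed at the first QAT-backed vbdev: it opens crypto_ram from the shared bdev.json, writes the "Hello World!" buffer through the crypto layer and reads it back, as the hello_bdev notices show. The invocation, as logged:

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
# -b selects which bdev from the JSON config the example opens.
$SPDK/build/examples/hello_bdev --json $SPDK/test/bdev/bdev.json -b crypto_ram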
00:30:19.602 [2024-07-15 17:42:30.718268] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2964346 ] 00:30:19.602 [2024-07-15 17:42:30.804395] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:19.602 [2024-07-15 17:42:30.869388] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:19.602 [2024-07-15 17:42:30.869533] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:19.602 [2024-07-15 17:42:30.869534] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:19.602 [2024-07-15 17:42:30.890533] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:30:19.602 [2024-07-15 17:42:30.898559] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:19.862 [2024-07-15 17:42:30.906581] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:19.862 [2024-07-15 17:42:30.990165] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:30:22.403 [2024-07-15 17:42:33.142117] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:30:22.403 [2024-07-15 17:42:33.142169] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:22.403 [2024-07-15 17:42:33.142181] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:22.403 [2024-07-15 17:42:33.150134] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:30:22.403 [2024-07-15 17:42:33.150145] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:22.403 [2024-07-15 17:42:33.150151] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:22.403 [2024-07-15 17:42:33.158157] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:30:22.403 [2024-07-15 17:42:33.158167] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:22.403 [2024-07-15 17:42:33.158173] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:22.403 [2024-07-15 17:42:33.166177] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:30:22.403 [2024-07-15 17:42:33.166187] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:22.403 [2024-07-15 17:42:33.166193] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:22.403 17:42:33 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:22.403 17:42:33 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:30:22.403 17:42:33 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:30:22.403 I/O targets: 00:30:22.403 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:30:22.403 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:30:22.403 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:30:22.403 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:30:22.403 
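The bounds suite is driven by bdevio in wait mode against the four crypto vbdevs listed as I/O targets above; once its RPC socket is up, tests.py fires the CUnit cases whose results follow. Stripped of the autotest wrappers, the two steps look roughly like this (flags copied from the trace; the real script waits for the RPC socket before calling tests.py, and -s 0 comes from PRE_RESERVED_MEM in blockdev.sh):

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
# -w: wait for the perform_tests RPC before running any cases.
$SPDK/test/bdev/bdevio/bdevio -w -s 0 --json $SPDK/test/bdev/bdev.json &
$SPDK/test/bdev/bdevio/tests.py perform_tests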
00:30:22.403 00:30:22.403 CUnit - A unit testing framework for C - Version 2.1-3 00:30:22.403 http://cunit.sourceforge.net/ 00:30:22.403 00:30:22.403 00:30:22.403 Suite: bdevio tests on: crypto_ram3 00:30:22.403 Test: blockdev write read block ...passed 00:30:22.403 Test: blockdev write zeroes read block ...passed 00:30:22.403 Test: blockdev write zeroes read no split ...passed 00:30:22.403 Test: blockdev write zeroes read split ...passed 00:30:22.403 Test: blockdev write zeroes read split partial ...passed 00:30:22.403 Test: blockdev reset ...passed 00:30:22.403 Test: blockdev write read 8 blocks ...passed 00:30:22.403 Test: blockdev write read size > 128k ...passed 00:30:22.403 Test: blockdev write read invalid size ...passed 00:30:22.403 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:22.403 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:22.403 Test: blockdev write read max offset ...passed 00:30:22.403 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:22.403 Test: blockdev writev readv 8 blocks ...passed 00:30:22.403 Test: blockdev writev readv 30 x 1block ...passed 00:30:22.403 Test: blockdev writev readv block ...passed 00:30:22.403 Test: blockdev writev readv size > 128k ...passed 00:30:22.403 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:22.403 Test: blockdev comparev and writev ...passed 00:30:22.403 Test: blockdev nvme passthru rw ...passed 00:30:22.403 Test: blockdev nvme passthru vendor specific ...passed 00:30:22.403 Test: blockdev nvme admin passthru ...passed 00:30:22.403 Test: blockdev copy ...passed 00:30:22.403 Suite: bdevio tests on: crypto_ram2 00:30:22.403 Test: blockdev write read block ...passed 00:30:22.403 Test: blockdev write zeroes read block ...passed 00:30:22.403 Test: blockdev write zeroes read no split ...passed 00:30:22.403 Test: blockdev write zeroes read split ...passed 00:30:22.403 Test: blockdev write zeroes read split partial ...passed 00:30:22.403 Test: blockdev reset ...passed 00:30:22.403 Test: blockdev write read 8 blocks ...passed 00:30:22.403 Test: blockdev write read size > 128k ...passed 00:30:22.403 Test: blockdev write read invalid size ...passed 00:30:22.403 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:22.403 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:22.403 Test: blockdev write read max offset ...passed 00:30:22.403 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:22.403 Test: blockdev writev readv 8 blocks ...passed 00:30:22.403 Test: blockdev writev readv 30 x 1block ...passed 00:30:22.403 Test: blockdev writev readv block ...passed 00:30:22.403 Test: blockdev writev readv size > 128k ...passed 00:30:22.403 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:22.403 Test: blockdev comparev and writev ...passed 00:30:22.403 Test: blockdev nvme passthru rw ...passed 00:30:22.403 Test: blockdev nvme passthru vendor specific ...passed 00:30:22.403 Test: blockdev nvme admin passthru ...passed 00:30:22.403 Test: blockdev copy ...passed 00:30:22.403 Suite: bdevio tests on: crypto_ram1 00:30:22.403 Test: blockdev write read block ...passed 00:30:22.403 Test: blockdev write zeroes read block ...passed 00:30:22.403 Test: blockdev write zeroes read no split ...passed 00:30:22.403 Test: blockdev write zeroes read split ...passed 00:30:22.662 Test: blockdev write zeroes read split partial ...passed 00:30:22.662 Test: blockdev reset 
...passed 00:30:22.662 Test: blockdev write read 8 blocks ...passed 00:30:22.662 Test: blockdev write read size > 128k ...passed 00:30:22.662 Test: blockdev write read invalid size ...passed 00:30:22.662 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:22.662 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:22.662 Test: blockdev write read max offset ...passed 00:30:22.662 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:22.662 Test: blockdev writev readv 8 blocks ...passed 00:30:22.662 Test: blockdev writev readv 30 x 1block ...passed 00:30:22.662 Test: blockdev writev readv block ...passed 00:30:22.662 Test: blockdev writev readv size > 128k ...passed 00:30:22.662 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:22.662 Test: blockdev comparev and writev ...passed 00:30:22.662 Test: blockdev nvme passthru rw ...passed 00:30:22.662 Test: blockdev nvme passthru vendor specific ...passed 00:30:22.662 Test: blockdev nvme admin passthru ...passed 00:30:22.662 Test: blockdev copy ...passed 00:30:22.662 Suite: bdevio tests on: crypto_ram 00:30:22.662 Test: blockdev write read block ...passed 00:30:22.662 Test: blockdev write zeroes read block ...passed 00:30:22.662 Test: blockdev write zeroes read no split ...passed 00:30:22.922 Test: blockdev write zeroes read split ...passed 00:30:23.181 Test: blockdev write zeroes read split partial ...passed 00:30:23.181 Test: blockdev reset ...passed 00:30:23.181 Test: blockdev write read 8 blocks ...passed 00:30:23.181 Test: blockdev write read size > 128k ...passed 00:30:23.181 Test: blockdev write read invalid size ...passed 00:30:23.181 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:23.181 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:23.181 Test: blockdev write read max offset ...passed 00:30:23.181 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:23.181 Test: blockdev writev readv 8 blocks ...passed 00:30:23.181 Test: blockdev writev readv 30 x 1block ...passed 00:30:23.181 Test: blockdev writev readv block ...passed 00:30:23.181 Test: blockdev writev readv size > 128k ...passed 00:30:23.181 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:23.181 Test: blockdev comparev and writev ...passed 00:30:23.181 Test: blockdev nvme passthru rw ...passed 00:30:23.181 Test: blockdev nvme passthru vendor specific ...passed 00:30:23.182 Test: blockdev nvme admin passthru ...passed 00:30:23.182 Test: blockdev copy ...passed 00:30:23.182 00:30:23.182 Run Summary: Type Total Ran Passed Failed Inactive 00:30:23.182 suites 4 4 n/a 0 0 00:30:23.182 tests 92 92 92 0 0 00:30:23.182 asserts 520 520 520 0 n/a 00:30:23.182 00:30:23.182 Elapsed time = 1.841 seconds 00:30:23.182 0 00:30:23.182 17:42:34 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2964346 00:30:23.182 17:42:34 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2964346 ']' 00:30:23.182 17:42:34 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2964346 00:30:23.182 17:42:34 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:30:23.182 17:42:34 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:23.182 17:42:34 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2964346 00:30:23.182 17:42:34 
blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:23.182 17:42:34 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:23.182 17:42:34 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2964346' 00:30:23.182 killing process with pid 2964346 00:30:23.182 17:42:34 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2964346 00:30:23.182 17:42:34 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@972 -- # wait 2964346 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:30:23.442 00:30:23.442 real 0m3.925s 00:30:23.442 user 0m10.633s 00:30:23.442 sys 0m0.414s 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:30:23.442 ************************************ 00:30:23.442 END TEST bdev_bounds 00:30:23.442 ************************************ 00:30:23.442 17:42:34 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:30:23.442 17:42:34 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:30:23.442 17:42:34 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:30:23.442 17:42:34 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:23.442 17:42:34 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:23.442 ************************************ 00:30:23.442 START TEST bdev_nbd 00:30:23.442 ************************************ 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_nbd -- 
bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2964988 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 2964988 /var/tmp/spdk-nbd.sock 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2964988 ']' 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:30:23.442 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:23.442 17:42:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:30:23.442 [2024-07-15 17:42:34.721406] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
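For reference, the nbd setup being traced here launches a standalone bdev_svc app on a private RPC socket and blocks until it answers RPC. The following is a minimal bash sketch of that launch-and-wait step, reconstructed from the trace rather than copied from the SPDK scripts; the rpc_get_methods readiness probe stands in for the real waitforlisten helper and is an assumption.

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    SOCK=/var/tmp/spdk-nbd.sock

    # start the app that exposes the crypto bdevs from bdev.json over $SOCK
    "$SPDK/test/app/bdev_svc/bdev_svc" -r "$SOCK" -i 0 \
        --json "$SPDK/test/bdev/bdev.json" &
    nbd_pid=$!

    # poll until the RPC socket answers (stand-in for waitforlisten)
    for _ in $(seq 1 100); do
        "$SPDK/scripts/rpc.py" -s "$SOCK" rpc_get_methods >/dev/null 2>&1 && break
        sleep 0.1
    done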
00:30:23.442 [2024-07-15 17:42:34.721455] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:30:23.701 [2024-07-15 17:42:34.809030] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:23.701 [2024-07-15 17:42:34.875673] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:23.701 [2024-07-15 17:42:34.896680] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:30:23.701 [2024-07-15 17:42:34.904704] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:23.702 [2024-07-15 17:42:34.912723] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:23.702 [2024-07-15 17:42:34.996443] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:30:26.284 [2024-07-15 17:42:37.147519] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:30:26.284 [2024-07-15 17:42:37.147564] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:26.284 [2024-07-15 17:42:37.147573] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:26.284 [2024-07-15 17:42:37.155538] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:30:26.284 [2024-07-15 17:42:37.155548] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:26.284 [2024-07-15 17:42:37.155554] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:26.284 [2024-07-15 17:42:37.163557] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:30:26.284 [2024-07-15 17:42:37.163567] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:26.284 [2024-07-15 17:42:37.163573] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:26.284 [2024-07-15 17:42:37.171576] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:30:26.284 [2024-07-15 17:42:37.171586] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:26.284 [2024-07-15 17:42:37.171591] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram 
crypto_ram1 crypto_ram2 crypto_ram3' 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:26.284 1+0 records in 00:30:26.284 1+0 records out 00:30:26.284 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00027503 s, 14.9 MB/s 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:26.284 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:30:26.545 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 
00:30:26.545 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:30:26.545 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:30:26.545 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:30:26.545 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:30:26.545 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:26.545 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:26.545 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:30:26.545 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:30:26.545 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:26.545 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:26.545 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:26.545 1+0 records in 00:30:26.545 1+0 records out 00:30:26.545 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000195372 s, 21.0 MB/s 00:30:26.545 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:26.545 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:30:26.545 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:26.545 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:26.545 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:30:26.545 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:26.545 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:26.545 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:30:26.805 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:30:26.805 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:30:26.805 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:30:26.805 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:30:26.805 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:30:26.805 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:26.805 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:26.805 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:30:26.805 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:30:26.805 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:26.805 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:26.805 17:42:37 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:26.805 1+0 records in 00:30:26.805 1+0 records out 00:30:26.805 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274645 s, 14.9 MB/s 00:30:26.805 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:26.805 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:30:26.805 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:26.805 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:26.805 17:42:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:30:26.805 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:26.805 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:26.805 17:42:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:30:27.066 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:30:27.066 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:30:27.066 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:30:27.066 17:42:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:30:27.066 17:42:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:30:27.066 17:42:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:27.066 17:42:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:27.066 17:42:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:30:27.066 17:42:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:30:27.066 17:42:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:27.066 17:42:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:27.066 17:42:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:27.066 1+0 records in 00:30:27.066 1+0 records out 00:30:27.066 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000316258 s, 13.0 MB/s 00:30:27.066 17:42:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:27.066 17:42:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:30:27.066 17:42:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:27.066 17:42:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:27.066 17:42:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:30:27.066 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:27.066 17:42:38 
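The trace above repeats the same pattern four times (crypto_ram through crypto_ram3): nbd_start_disk maps a bdev to a /dev/nbdX node, waitfornbd polls /proc/partitions for it, and a single direct 4 KiB dd read proves the mapping does real I/O. A condensed bash sketch of that loop, reconstructed from the trace rather than taken from nbd_common.sh:

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    RPC() { "$SPDK/scripts/rpc.py" -s /var/tmp/spdk-nbd.sock "$@"; }
    bdev_list=(crypto_ram crypto_ram1 crypto_ram2 crypto_ram3)

    for bdev in "${bdev_list[@]}"; do
        nbd_device=$(RPC nbd_start_disk "$bdev")      # SPDK picks the node, e.g. /dev/nbd0
        nbd_name=$(basename "$nbd_device")

        for _ in $(seq 1 20); do                      # waitfornbd
            grep -q -w "$nbd_name" /proc/partitions && break
            sleep 0.1
        done

        # one direct read; an empty result file would fail the test
        dd if="$nbd_device" of="$SPDK/test/bdev/nbdtest" bs=4096 count=1 iflag=direct
        size=$(stat -c %s "$SPDK/test/bdev/nbdtest")
        rm -f "$SPDK/test/bdev/nbdtest"
        [ "$size" != 0 ]
    done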
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:27.066 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:30:27.066 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:30:27.066 { 00:30:27.066 "nbd_device": "/dev/nbd0", 00:30:27.066 "bdev_name": "crypto_ram" 00:30:27.066 }, 00:30:27.066 { 00:30:27.066 "nbd_device": "/dev/nbd1", 00:30:27.066 "bdev_name": "crypto_ram1" 00:30:27.066 }, 00:30:27.066 { 00:30:27.066 "nbd_device": "/dev/nbd2", 00:30:27.066 "bdev_name": "crypto_ram2" 00:30:27.066 }, 00:30:27.066 { 00:30:27.066 "nbd_device": "/dev/nbd3", 00:30:27.066 "bdev_name": "crypto_ram3" 00:30:27.066 } 00:30:27.066 ]' 00:30:27.066 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:30:27.066 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:30:27.066 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:30:27.066 { 00:30:27.066 "nbd_device": "/dev/nbd0", 00:30:27.066 "bdev_name": "crypto_ram" 00:30:27.066 }, 00:30:27.066 { 00:30:27.066 "nbd_device": "/dev/nbd1", 00:30:27.066 "bdev_name": "crypto_ram1" 00:30:27.066 }, 00:30:27.066 { 00:30:27.066 "nbd_device": "/dev/nbd2", 00:30:27.066 "bdev_name": "crypto_ram2" 00:30:27.066 }, 00:30:27.066 { 00:30:27.066 "nbd_device": "/dev/nbd3", 00:30:27.066 "bdev_name": "crypto_ram3" 00:30:27.066 } 00:30:27.066 ]' 00:30:27.325 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:30:27.325 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:27.325 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:30:27.325 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:30:27.325 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:30:27.325 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:27.325 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:30:27.325 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:30:27.325 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:30:27.325 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:30:27.325 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:27.325 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:27.325 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:30:27.325 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:27.325 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:27.325 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:27.325 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:30:27.585 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:30:27.585 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:30:27.585 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:30:27.585 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:27.585 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:27.585 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:30:27.585 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:27.585 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:27.585 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:27.585 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:30:27.844 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:30:27.844 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:30:27.844 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:30:27.844 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:27.844 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:27.844 17:42:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:30:27.844 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:27.844 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:27.844 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:27.844 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:30:28.105 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:30:28.105 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:30:28.105 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:30:28.105 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:28.105 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:28.105 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:30:28.105 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:28.105 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:28.105 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:30:28.105 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:28.105 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- 
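The teardown just traced mirrors the setup: each /dev/nbdX is detached with nbd_stop_disk, waitfornbd_exit polls until the name leaves /proc/partitions, and nbd_get_disks must then report zero devices. A bash sketch of that check; the jq/grep counting trick is the one visible at nbd_common.sh@63-66, the rest is reconstructed from the trace:

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    RPC() { "$SPDK/scripts/rpc.py" -s /var/tmp/spdk-nbd.sock "$@"; }

    for nbd in /dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3; do
        RPC nbd_stop_disk "$nbd"
        name=$(basename "$nbd")
        while grep -q -w "$name" /proc/partitions; do   # waitfornbd_exit
            sleep 0.1
        done
    done

    # nbd_get_disks must now return an empty list
    count=$(RPC nbd_get_disks | jq -r '.[] | .nbd_device' | grep -c /dev/nbd || true)
    [ "$count" -eq 0 ]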
bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:30:28.365 /dev/nbd0 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:30:28.365 17:42:39 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:28.365 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:28.625 1+0 records in 00:30:28.625 1+0 records out 00:30:28.625 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000264789 s, 15.5 MB/s 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:30:28.625 /dev/nbd1 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:28.625 1+0 records in 00:30:28.625 1+0 records out 00:30:28.625 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000264333 s, 15.5 MB/s 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:28.625 17:42:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:30:28.885 /dev/nbd10 00:30:28.885 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:30:28.885 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:30:28.885 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:30:28.885 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:30:28.885 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:28.885 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:28.885 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:30:28.885 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:30:28.885 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:28.885 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:28.885 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:28.885 1+0 records in 00:30:28.885 1+0 records out 00:30:28.886 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269192 s, 15.2 MB/s 00:30:28.886 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:28.886 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:30:28.886 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:28.886 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:28.886 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:30:28.886 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:28.886 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:28.886 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:30:29.145 /dev/nbd11 00:30:29.146 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:30:29.146 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # 
waitfornbd nbd11 00:30:29.146 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:30:29.146 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:30:29.146 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:29.146 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:29.146 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:30:29.146 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:30:29.146 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:29.146 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:29.146 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:29.146 1+0 records in 00:30:29.146 1+0 records out 00:30:29.146 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000243583 s, 16.8 MB/s 00:30:29.146 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:29.146 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:30:29.146 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:29.146 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:29.146 17:42:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:30:29.146 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:29.146 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:29.146 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:30:29.146 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:29.146 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:30:29.405 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:30:29.405 { 00:30:29.405 "nbd_device": "/dev/nbd0", 00:30:29.405 "bdev_name": "crypto_ram" 00:30:29.405 }, 00:30:29.405 { 00:30:29.405 "nbd_device": "/dev/nbd1", 00:30:29.405 "bdev_name": "crypto_ram1" 00:30:29.405 }, 00:30:29.405 { 00:30:29.405 "nbd_device": "/dev/nbd10", 00:30:29.405 "bdev_name": "crypto_ram2" 00:30:29.405 }, 00:30:29.405 { 00:30:29.405 "nbd_device": "/dev/nbd11", 00:30:29.405 "bdev_name": "crypto_ram3" 00:30:29.405 } 00:30:29.405 ]' 00:30:29.405 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:30:29.405 { 00:30:29.405 "nbd_device": "/dev/nbd0", 00:30:29.405 "bdev_name": "crypto_ram" 00:30:29.405 }, 00:30:29.405 { 00:30:29.405 "nbd_device": "/dev/nbd1", 00:30:29.405 "bdev_name": "crypto_ram1" 00:30:29.405 }, 00:30:29.405 { 00:30:29.405 "nbd_device": "/dev/nbd10", 00:30:29.405 "bdev_name": "crypto_ram2" 00:30:29.405 }, 00:30:29.405 { 00:30:29.405 "nbd_device": "/dev/nbd11", 00:30:29.405 "bdev_name": "crypto_ram3" 00:30:29.405 } 00:30:29.405 ]' 00:30:29.405 
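The JSON above confirms four live mappings (/dev/nbd0, /dev/nbd1, /dev/nbd10, /dev/nbd11). The write/verify pass that follows pushes a single 1 MiB random pattern through every device and compares it back byte-for-byte; collapsed out of its per-device loops, it is the same commands shown in the trace:

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    randtest=$SPDK/test/bdev/nbdrandtest
    nbd_list=(/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11)

    # write phase: generate the pattern, then push it through each nbd device
    dd if=/dev/urandom of="$randtest" bs=4096 count=256
    for nbd in "${nbd_list[@]}"; do
        dd if="$randtest" of="$nbd" bs=4096 count=256 oflag=direct
    done

    # verify phase: the data read back from each device must match the pattern
    for nbd in "${nbd_list[@]}"; do
        cmp -b -n 1M "$randtest" "$nbd"
    done
    rm "$randtest"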
17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:30:29.405 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:30:29.405 /dev/nbd1 00:30:29.405 /dev/nbd10 00:30:29.405 /dev/nbd11' 00:30:29.405 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:30:29.405 /dev/nbd1 00:30:29.405 /dev/nbd10 00:30:29.405 /dev/nbd11' 00:30:29.405 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:30:29.405 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:30:29.405 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:30:29.405 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:30:29.405 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:30:29.405 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:30:29.405 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:29.405 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:30:29.405 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:30:29.405 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:30:29.405 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:30:29.405 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:30:29.405 256+0 records in 00:30:29.405 256+0 records out 00:30:29.405 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0125123 s, 83.8 MB/s 00:30:29.405 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:30:29.405 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:30:29.405 256+0 records in 00:30:29.405 256+0 records out 00:30:29.405 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0443575 s, 23.6 MB/s 00:30:29.405 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:30:29.405 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:30:29.665 256+0 records in 00:30:29.665 256+0 records out 00:30:29.665 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0573122 s, 18.3 MB/s 00:30:29.665 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:30:29.665 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:30:29.665 256+0 records in 00:30:29.665 256+0 records out 00:30:29.665 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0338311 s, 31.0 MB/s 00:30:29.665 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:30:29.665 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:30:29.665 256+0 records in 00:30:29.665 256+0 records out 00:30:29.665 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0308193 s, 34.0 MB/s 00:30:29.665 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:30:29.665 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:29.665 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:30:29.665 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:30:29.665 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:30:29.665 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:30:29.665 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:30:29.665 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:30:29.666 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:30:29.666 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:30:29.666 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:30:29.666 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:30:29.666 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:30:29.666 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:30:29.666 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:30:29.666 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:30:29.666 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:30:29.666 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:29.666 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:29.666 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:30:29.666 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:30:29.666 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:29.666 17:42:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:30:29.925 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:30:29.925 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:30:29.925 17:42:41 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:30:29.925 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:29.925 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:29.925 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:30:29.925 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:29.925 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:29.925 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:29.925 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:30:30.185 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:30:30.185 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:30:30.185 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:30:30.185 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:30.185 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:30.185 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:30:30.185 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:30.185 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:30.185 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:30.185 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:30:30.185 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:30:30.185 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:30:30.185 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:30:30.185 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:30.185 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:30.185 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:30:30.445 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:30.445 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:30.445 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:30.445 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:30:30.445 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:30:30.445 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:30:30.445 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:30:30.445 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:30.445 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:30.445 17:42:41 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:30:30.445 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:30.445 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:30.445 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:30:30.445 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:30.445 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:30:30.705 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:30:30.705 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:30:30.705 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:30:30.705 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:30:30.705 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:30:30.705 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:30:30.705 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:30:30.705 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:30:30.705 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:30:30.705 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:30:30.705 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:30:30.705 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:30:30.705 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:30:30.705 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:30.705 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:30.705 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:30:30.705 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:30:30.705 17:42:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:30:30.965 malloc_lvol_verify 00:30:30.965 17:42:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:30:31.225 de785659-b360-4be7-bd7e-a454736d35ca 00:30:31.225 17:42:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:30:31.225 89e5887c-b409-4604-ae1d-412131560595 00:30:31.225 17:42:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:30:31.485 /dev/nbd0 00:30:31.485 17:42:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 
00:30:31.485 mke2fs 1.46.5 (30-Dec-2021) 00:30:31.485 Discarding device blocks: 0/4096 done 00:30:31.485 Creating filesystem with 4096 1k blocks and 1024 inodes 00:30:31.485 00:30:31.485 Allocating group tables: 0/1 done 00:30:31.485 Writing inode tables: 0/1 done 00:30:31.485 Creating journal (1024 blocks): done 00:30:31.485 Writing superblocks and filesystem accounting information: 0/1 done 00:30:31.485 00:30:31.485 17:42:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:30:31.485 17:42:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:30:31.485 17:42:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:31.485 17:42:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:30:31.485 17:42:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:30:31.485 17:42:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:30:31.485 17:42:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:31.485 17:42:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:30:31.745 17:42:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:30:31.745 17:42:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:30:31.745 17:42:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:30:31.745 17:42:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:31.745 17:42:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:31.745 17:42:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:30:31.745 17:42:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:31.745 17:42:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:31.745 17:42:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:30:31.745 17:42:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:30:31.745 17:42:42 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2964988 00:30:31.745 17:42:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2964988 ']' 00:30:31.745 17:42:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 2964988 00:30:31.745 17:42:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:30:31.745 17:42:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:31.745 17:42:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2964988 00:30:31.745 17:42:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:31.745 17:42:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:31.745 17:42:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2964988' 00:30:31.745 killing process with pid 2964988 00:30:31.745 17:42:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2964988 00:30:31.745 17:42:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@972 -- # wait 
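The lvol check traced above goes one layer further: a malloc bdev is wrapped in a logical-volume store, a small lvol is carved out, exported over /dev/nbd0, and mkfs.ext4 must complete on it. Condensed from the trace (sizes are the ones shown; the lvstore and lvol UUIDs are generated per run), the RPC sequence is:

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    RPC() { "$SPDK/scripts/rpc.py" -s /var/tmp/spdk-nbd.sock "$@"; }

    RPC bdev_malloc_create -b malloc_lvol_verify 16 512   # 16 MB backing bdev, 512 B blocks
    RPC bdev_lvol_create_lvstore malloc_lvol_verify lvs   # prints the lvstore UUID
    RPC bdev_lvol_create lvol 4 -l lvs                    # 4 MiB lvol inside lvs
    RPC nbd_start_disk lvs/lvol /dev/nbd0
    mkfs.ext4 /dev/nbd0        # succeeds only if the stacked bdev handles real I/O
    RPC nbd_stop_disk /dev/nbd0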
2964988 00:30:32.005 17:42:43 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:30:32.005 00:30:32.005 real 0m8.531s 00:30:32.005 user 0m11.824s 00:30:32.005 sys 0m2.351s 00:30:32.005 17:42:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:32.005 17:42:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:30:32.005 ************************************ 00:30:32.005 END TEST bdev_nbd 00:30:32.005 ************************************ 00:30:32.005 17:42:43 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:30:32.005 17:42:43 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:30:32.005 17:42:43 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = nvme ']' 00:30:32.005 17:42:43 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = gpt ']' 00:30:32.005 17:42:43 blockdev_crypto_qat -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:30:32.005 17:42:43 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:30:32.005 17:42:43 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:32.005 17:42:43 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:32.005 ************************************ 00:30:32.005 START TEST bdev_fio 00:30:32.005 ************************************ 00:30:32.005 17:42:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:30:32.005 17:42:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:30:32.005 17:42:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:30:32.005 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:32.005 17:42:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:30:32.005 17:42:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:30:32.005 17:42:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:30:32.005 17:42:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:30:32.005 17:42:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:30:32.005 17:42:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:32.005 17:42:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:30:32.005 17:42:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:30:32.005 17:42:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:30:32.005 17:42:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:30:32.005 17:42:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:30:32.005 17:42:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:30:32.005 17:42:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:30:32.005 17:42:43 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:32.005 17:42:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:30:32.005 17:42:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:30:32.005 17:42:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:30:32.005 17:42:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:30:32.006 17:42:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:30:32.265 17:42:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram1]' 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram1 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:30:32.266 ************************************ 00:30:32.266 START TEST bdev_fio_rw_verify 00:30:32.266 ************************************ 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:30:32.266 17:42:43 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio 
--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:32.525 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:32.525 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:32.525 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:32.525 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:32.525 fio-3.35 00:30:32.525 Starting 4 threads 00:30:47.422 00:30:47.422 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2967164: Mon Jul 15 17:42:56 2024 00:30:47.422 read: IOPS=35.0k, BW=137MiB/s (143MB/s)(1366MiB/10001msec) 00:30:47.422 slat (usec): min=14, max=261, avg=37.29, stdev=25.00 00:30:47.422 clat (usec): min=19, max=1866, avg=223.59, stdev=155.96 00:30:47.422 lat (usec): min=42, max=1901, avg=260.88, stdev=169.69 00:30:47.422 clat percentiles (usec): 00:30:47.422 | 50.000th=[ 169], 99.000th=[ 758], 99.900th=[ 930], 99.990th=[ 1156], 00:30:47.422 | 99.999th=[ 1565] 00:30:47.422 write: IOPS=38.5k, BW=150MiB/s (158MB/s)(1461MiB/9726msec); 0 zone resets 00:30:47.422 slat (usec): min=15, max=483, avg=47.18, stdev=24.68 00:30:47.422 clat (usec): min=16, max=1532, avg=255.58, stdev=159.23 00:30:47.422 lat (usec): min=44, max=1686, avg=302.76, stdev=172.83 00:30:47.422 clat percentiles (usec): 00:30:47.422 | 50.000th=[ 215], 99.000th=[ 783], 99.900th=[ 955], 99.990th=[ 1106], 00:30:47.422 | 99.999th=[ 1319] 00:30:47.422 bw ( KiB/s): min=127824, max=171616, per=97.38%, avg=149831.63, stdev=2967.13, samples=76 00:30:47.423 iops : min=31956, max=42904, avg=37457.89, stdev=741.78, samples=76 00:30:47.423 lat (usec) : 20=0.01%, 50=0.04%, 100=14.82%, 250=48.95%, 500=28.23% 00:30:47.423 lat (usec) : 750=6.70%, 1000=1.22% 00:30:47.423 lat (msec) : 2=0.04% 00:30:47.423 cpu : usr=99.71%, sys=0.00%, ctx=68, majf=0, minf=261 00:30:47.423 IO depths : 1=0.1%, 2=28.6%, 4=57.1%, 8=14.2%, 16=0.0%, 32=0.0%, >=64=0.0% 00:30:47.423 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:30:47.423 complete : 0=0.0%, 4=87.5%, 8=12.5%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:30:47.423 issued rwts: total=349616,374132,0,0 short=0,0,0,0 dropped=0,0,0,0 00:30:47.423 latency : target=0, window=0, percentile=100.00%, depth=8 00:30:47.423 00:30:47.423 Run status group 0 (all jobs): 00:30:47.423 READ: bw=137MiB/s (143MB/s), 137MiB/s-137MiB/s (143MB/s-143MB/s), io=1366MiB (1432MB), run=10001-10001msec 00:30:47.423 WRITE: bw=150MiB/s (158MB/s), 150MiB/s-150MiB/s (158MB/s-158MB/s), io=1461MiB (1532MB), run=9726-9726msec 00:30:47.423 00:30:47.423 real 0m13.316s 00:30:47.423 user 0m50.075s 00:30:47.423 sys 0m0.421s 00:30:47.423 17:42:56 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:47.423 17:42:56 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:30:47.423 ************************************ 00:30:47.423 END TEST bdev_fio_rw_verify 00:30:47.423 ************************************ 00:30:47.423 17:42:56 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@1142 -- # return 0 00:30:47.423 17:42:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:30:47.423 17:42:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:47.423 17:42:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:30:47.423 17:42:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:47.423 17:42:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:30:47.423 17:42:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:30:47.423 17:42:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:30:47.423 17:42:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:30:47.423 17:42:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:30:47.423 17:42:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:30:47.423 17:42:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:30:47.423 17:42:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:47.423 17:42:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:30:47.423 17:42:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:30:47.423 17:42:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:30:47.423 17:42:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:30:47.423 17:42:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:30:47.423 17:42:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "f6c90005-3501-525f-9719-1480eb7797e2"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f6c90005-3501-525f-9719-1480eb7797e2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' 
"name": "crypto_ram1",' ' "aliases": [' ' "b26bf13b-5782-5965-b0af-85031dbcb35b"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b26bf13b-5782-5965-b0af-85031dbcb35b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "2c412943-7e80-5fdf-8e56-180ea0338620"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "2c412943-7e80-5fdf-8e56-180ea0338620",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "e65f0c05-44e8-5cf7-a585-2c2d9efeab1b"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "e65f0c05-44e8-5cf7-a585-2c2d9efeab1b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' 
"dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:30:47.423 17:42:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:30:47.423 crypto_ram1 00:30:47.423 crypto_ram2 00:30:47.423 crypto_ram3 ]] 00:30:47.423 17:42:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "f6c90005-3501-525f-9719-1480eb7797e2"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f6c90005-3501-525f-9719-1480eb7797e2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "b26bf13b-5782-5965-b0af-85031dbcb35b"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b26bf13b-5782-5965-b0af-85031dbcb35b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "2c412943-7e80-5fdf-8e56-180ea0338620"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "2c412943-7e80-5fdf-8e56-180ea0338620",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' 
' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "e65f0c05-44e8-5cf7-a585-2c2d9efeab1b"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "e65f0c05-44e8-5cf7-a585-2c2d9efeab1b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram1]' 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram1 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:30:47.424 
17:42:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:30:47.424 ************************************ 00:30:47.424 START TEST bdev_fio_trim 00:30:47.424 ************************************ 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim 
-- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:30:47.424 17:42:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:47.424 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:47.424 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:47.424 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:47.424 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:47.424 fio-3.35 00:30:47.424 Starting 4 threads 00:30:59.651 00:30:59.651 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2969463: Mon Jul 15 17:43:09 2024 00:30:59.651 write: IOPS=61.7k, BW=241MiB/s (253MB/s)(2410MiB/10001msec); 0 zone resets 00:30:59.651 slat (usec): min=14, max=811, avg=40.62, stdev=27.33 00:30:59.651 clat (usec): min=19, max=1109, avg=136.11, stdev=82.80 00:30:59.651 lat (usec): min=50, max=1178, avg=176.73, stdev=97.59 00:30:59.651 clat percentiles (usec): 00:30:59.651 | 50.000th=[ 118], 99.000th=[ 408], 99.900th=[ 519], 99.990th=[ 627], 00:30:59.651 | 99.999th=[ 840] 00:30:59.651 bw ( KiB/s): min=226240, max=280704, per=100.00%, avg=247130.53, stdev=4075.42, samples=76 00:30:59.651 iops : min=56560, max=70176, avg=61782.63, stdev=1018.85, samples=76 00:30:59.651 trim: IOPS=61.7k, BW=241MiB/s (253MB/s)(2410MiB/10001msec); 0 zone resets 00:30:59.651 slat (usec): min=4, max=411, avg= 8.07, stdev= 3.72 00:30:59.651 clat (usec): min=50, max=1178, avg=176.89, stdev=97.59 00:30:59.651 lat (usec): min=55, max=1186, avg=184.96, stdev=98.27 00:30:59.651 clat percentiles (usec): 00:30:59.651 | 50.000th=[ 151], 99.000th=[ 490], 99.900th=[ 611], 99.990th=[ 742], 00:30:59.651 | 99.999th=[ 1020] 00:30:59.651 bw ( KiB/s): min=226240, max=280704, per=100.00%, avg=247130.53, stdev=4075.42, samples=76 00:30:59.651 iops : min=56560, max=70176, avg=61782.63, stdev=1018.85, samples=76 00:30:59.651 lat (usec) : 20=0.01%, 50=4.88%, 100=25.09%, 250=55.49%, 500=14.02% 00:30:59.651 lat (usec) : 750=0.50%, 1000=0.01% 
00:30:59.651 lat (msec) : 2=0.01% 00:30:59.651 cpu : usr=99.73%, sys=0.00%, ctx=49, majf=0, minf=102 00:30:59.651 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:30:59.651 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:30:59.651 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:30:59.651 issued rwts: total=0,616849,616849,0 short=0,0,0,0 dropped=0,0,0,0 00:30:59.651 latency : target=0, window=0, percentile=100.00%, depth=8 00:30:59.651 00:30:59.651 Run status group 0 (all jobs): 00:30:59.651 WRITE: bw=241MiB/s (253MB/s), 241MiB/s-241MiB/s (253MB/s-253MB/s), io=2410MiB (2527MB), run=10001-10001msec 00:30:59.651 TRIM: bw=241MiB/s (253MB/s), 241MiB/s-241MiB/s (253MB/s-253MB/s), io=2410MiB (2527MB), run=10001-10001msec 00:30:59.651 00:30:59.651 real 0m13.305s 00:30:59.651 user 0m49.388s 00:30:59.651 sys 0m0.403s 00:30:59.651 17:43:10 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:59.651 17:43:10 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:30:59.651 ************************************ 00:30:59.651 END TEST bdev_fio_trim 00:30:59.651 ************************************ 00:30:59.651 17:43:10 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:30:59.651 17:43:10 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:30:59.651 17:43:10 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:59.651 17:43:10 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:30:59.651 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:59.651 17:43:10 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:30:59.651 00:30:59.651 real 0m26.978s 00:30:59.651 user 1m39.642s 00:30:59.651 sys 0m1.022s 00:30:59.651 17:43:10 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:59.651 17:43:10 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:30:59.651 ************************************ 00:30:59.651 END TEST bdev_fio 00:30:59.651 ************************************ 00:30:59.651 17:43:10 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:30:59.651 17:43:10 blockdev_crypto_qat -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:30:59.652 17:43:10 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:30:59.652 17:43:10 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:30:59.652 17:43:10 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:59.652 17:43:10 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:59.652 ************************************ 00:30:59.652 START TEST bdev_verify 00:30:59.652 ************************************ 00:30:59.652 17:43:10 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:30:59.652 [2024-07-15 17:43:10.379818] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 
24.03.0 initialization... 00:30:59.652 [2024-07-15 17:43:10.379877] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2971277 ] 00:30:59.652 [2024-07-15 17:43:10.468788] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:59.652 [2024-07-15 17:43:10.534349] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:59.652 [2024-07-15 17:43:10.534353] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:59.652 [2024-07-15 17:43:10.555533] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:30:59.652 [2024-07-15 17:43:10.563560] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:59.652 [2024-07-15 17:43:10.571584] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:59.652 [2024-07-15 17:43:10.657574] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:31:02.203 [2024-07-15 17:43:12.891103] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:31:02.203 [2024-07-15 17:43:12.891199] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:02.203 [2024-07-15 17:43:12.891210] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:02.203 [2024-07-15 17:43:12.899115] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:31:02.203 [2024-07-15 17:43:12.899126] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:02.203 [2024-07-15 17:43:12.899138] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:02.203 [2024-07-15 17:43:12.907136] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:31:02.203 [2024-07-15 17:43:12.907149] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:02.203 [2024-07-15 17:43:12.907155] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:02.203 [2024-07-15 17:43:12.915157] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:31:02.203 [2024-07-15 17:43:12.915167] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:31:02.203 [2024-07-15 17:43:12.915173] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:02.203 Running I/O for 5 seconds... 
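The bdev_verify stage above launches bdevperf against the same crypto_qat bdev.json dumped earlier in this log. Condensed from the invocation visible in the trace, with the workspace prefix shortened to a relative path, it is roughly the command below; the per-bdev latency summary of this 5-second run follows immediately after.
# -q 128: queue depth, -o 4096: 4 KiB I/Os, -w verify -t 5: verify workload for 5 seconds,
# -m 0x3: run on cores 0 and 1 (matching the two reactors started above); -C is passed through as in the trace.
./build/examples/bdevperf --json test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3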
00:31:07.522 00:31:07.522 Latency(us) 00:31:07.522 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:07.522 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:31:07.522 Verification LBA range: start 0x0 length 0x1000 00:31:07.522 crypto_ram : 5.06 603.84 2.36 0.00 0.00 211368.09 3453.24 124215.93 00:31:07.522 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:31:07.522 Verification LBA range: start 0x1000 length 0x1000 00:31:07.522 crypto_ram : 5.07 503.32 1.97 0.00 0.00 253628.23 4083.40 150833.62 00:31:07.522 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:31:07.522 Verification LBA range: start 0x0 length 0x1000 00:31:07.522 crypto_ram1 : 5.06 606.82 2.37 0.00 0.00 210043.78 3579.27 121796.14 00:31:07.522 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:31:07.522 Verification LBA range: start 0x1000 length 0x1000 00:31:07.522 crypto_ram1 : 5.07 504.76 1.97 0.00 0.00 252311.44 4436.28 141961.06 00:31:07.522 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:31:07.522 Verification LBA range: start 0x0 length 0x1000 00:31:07.522 crypto_ram2 : 5.04 4696.40 18.35 0.00 0.00 27034.45 5444.53 24399.56 00:31:07.522 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:31:07.522 Verification LBA range: start 0x1000 length 0x1000 00:31:07.522 crypto_ram2 : 5.04 3883.96 15.17 0.00 0.00 32651.62 6906.49 27222.65 00:31:07.522 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:31:07.522 Verification LBA range: start 0x0 length 0x1000 00:31:07.522 crypto_ram3 : 5.05 4702.19 18.37 0.00 0.00 26959.50 1203.59 24500.38 00:31:07.522 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:31:07.522 Verification LBA range: start 0x1000 length 0x1000 00:31:07.522 crypto_ram3 : 5.06 3896.82 15.22 0.00 0.00 32499.07 3654.89 27021.00 00:31:07.522 =================================================================================================================== 00:31:07.522 Total : 19398.11 75.77 0.00 0.00 52518.84 1203.59 150833.62 00:31:07.522 00:31:07.522 real 0m8.026s 00:31:07.522 user 0m15.382s 00:31:07.522 sys 0m0.313s 00:31:07.522 17:43:18 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:07.522 17:43:18 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:31:07.522 ************************************ 00:31:07.522 END TEST bdev_verify 00:31:07.522 ************************************ 00:31:07.522 17:43:18 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:31:07.522 17:43:18 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:31:07.522 17:43:18 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:31:07.522 17:43:18 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:07.522 17:43:18 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:31:07.522 ************************************ 00:31:07.522 START TEST bdev_verify_big_io 00:31:07.522 ************************************ 00:31:07.522 17:43:18 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:31:07.522 [2024-07-15 17:43:18.490433] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:31:07.522 [2024-07-15 17:43:18.490497] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2972551 ] 00:31:07.522 [2024-07-15 17:43:18.584713] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:07.522 [2024-07-15 17:43:18.660354] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:07.522 [2024-07-15 17:43:18.660358] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:07.522 [2024-07-15 17:43:18.681500] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:31:07.522 [2024-07-15 17:43:18.689528] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:07.522 [2024-07-15 17:43:18.697553] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:07.522 [2024-07-15 17:43:18.786329] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:31:10.063 [2024-07-15 17:43:20.937617] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:31:10.063 [2024-07-15 17:43:20.937675] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:10.063 [2024-07-15 17:43:20.937684] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:10.063 [2024-07-15 17:43:20.945633] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:31:10.063 [2024-07-15 17:43:20.945644] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:10.063 [2024-07-15 17:43:20.945649] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:10.063 [2024-07-15 17:43:20.953652] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:31:10.063 [2024-07-15 17:43:20.953662] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:10.063 [2024-07-15 17:43:20.953668] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:10.063 [2024-07-15 17:43:20.961672] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:31:10.063 [2024-07-15 17:43:20.961682] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:31:10.063 [2024-07-15 17:43:20.961687] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:10.063 Running I/O for 5 seconds... 00:31:10.666 [2024-07-15 17:43:21.940890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:10.666 [2024-07-15 17:43:21.941319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:10.666 [2024-07-15 17:43:21.941374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:10.666 [2024-07-15 17:43:21.941415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:10.666 [2024-07-15 17:43:21.941452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:10.666 [2024-07-15 17:43:21.941489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:10.666 [2024-07-15 17:43:21.941871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:10.666 [2024-07-15 17:43:21.941887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:10.666 [2024-07-15 17:43:21.945188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:10.666 [2024-07-15 17:43:21.945230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:10.666 [2024-07-15 17:43:21.945266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:10.666 [2024-07-15 17:43:21.945303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:10.666 [2024-07-15 17:43:21.945780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:10.666 [2024-07-15 17:43:21.945819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:10.666 [2024-07-15 17:43:21.945855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:10.666 [2024-07-15 17:43:21.945891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:10.666 [2024-07-15 17:43:21.946246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:10.666 [2024-07-15 17:43:21.946256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:10.666 [2024-07-15 17:43:21.949482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:10.666 [2024-07-15 17:43:21.949521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:10.666 [2024-07-15 17:43:21.949558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:10.666 [2024-07-15 17:43:21.949594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:10.666 [2024-07-15 17:43:21.950002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:10.666 [2024-07-15 17:43:21.950044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:10.666 [2024-07-15 17:43:21.950101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:10.666 [2024-07-15 17:43:21.950138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:10.666 [2024-07-15 17:43:21.950637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical "Failed to get src_mbufs!" error lines from accel_dpdk_cryptodev.c:468 repeated between 17:43:21.950 and 17:43:22.296 omitted ...]
00:31:11.200 [2024-07-15 17:43:22.296875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:11.200 [2024-07-15 17:43:22.297251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.297627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.298420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.298800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.299175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.299549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.300020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.300032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.302603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.302981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.303357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.303736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.304565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.304944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.305319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.305693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.306206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.306218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.308741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.309124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.309512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.309889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.310724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.311099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.200 [2024-07-15 17:43:22.311473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.311854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.312251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.312266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.315363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.315744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.316118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.316492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.317233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.317610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.317988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.318361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.318731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.318742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.321332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.321712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.322098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.322471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.323286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.323663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.324043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.324418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.324843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.324854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.200 [2024-07-15 17:43:22.327637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.328019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.328395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.328776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.329551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.329932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.200 [2024-07-15 17:43:22.330307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.330681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.331078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.331089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.333665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.334048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.334442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.334824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.335619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.335999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.336377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.336771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.337193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.337204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.339734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.340112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.340487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.342149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.201 [2024-07-15 17:43:22.343731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.344109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.344484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.344862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.345164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.345174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.348492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.350407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.351393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.351772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.352530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.354091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.355780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.357680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.358009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.358019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.360027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.360406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.360785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.360816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.363046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.364965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.366023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.367673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.201 [2024-07-15 17:43:22.367953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.367963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.370212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.371880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.373796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.375696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.375737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.376151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.377813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.379723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.381629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.382013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.382407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.382418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.384396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.384434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.384470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.384506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.384779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.384820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.384857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.384894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.384930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.385265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.201 [2024-07-15 17:43:22.385275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.387035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.387074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.387110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.387146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.387557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.387597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.387633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.387669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.387705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.388103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.388113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.389711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.389750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.389786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.389821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.390103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.390145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.390182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.390218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.390254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.390522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.390531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.392510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.201 [2024-07-15 17:43:22.392548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.392586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.392622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.392917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.392965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.393006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.393042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.393079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.201 [2024-07-15 17:43:22.393349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.393359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.394972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.395011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.395047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.395082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.395349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.395393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.395430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.395466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.395503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.395938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.395948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.397962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.398000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.398036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.202 [2024-07-15 17:43:22.398071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.398338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.398379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.398415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.398464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.398500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.398779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.398789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.400416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.400463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.400500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.400535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.401005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.401047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.401083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.401119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.401155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.401626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.401637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.403367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.403405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.403444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.403483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.403852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.202 [2024-07-15 17:43:22.403894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.403930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.403966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.404002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.404329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.404339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.406305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.406343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.406379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.406414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.406863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.406908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.406945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.406980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.407017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.407289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.407299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.408897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.408938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.408974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.409010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.409280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.409321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.409358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.202 [2024-07-15 17:43:22.409394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.409430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.409697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.409707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.412155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.412199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.412235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.412271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.412604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.412645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.412681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.412721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.412758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.413026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.413036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.414633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.414671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.414713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.414750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.415140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.415188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.415225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.415262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.415298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.202 [2024-07-15 17:43:22.415721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.415735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.417648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.417686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.417725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.417761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.418028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.418072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.418109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.418145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.418182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.202 [2024-07-15 17:43:22.418625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.418635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.420393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.420432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.420468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.420504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.420867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.420908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.420945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.420980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.421016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.421409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.421420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.203 [2024-07-15 17:43:22.423041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.423084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.423120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.423156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.423477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.423519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.423556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.423592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.423631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.423905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.423918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.425870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.425909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.425949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.425985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.426315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.426356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.426392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.426427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.426463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.426819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.426830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.428408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.428446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.203 [2024-07-15 17:43:22.428482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.428518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.428789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.428833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.428870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.428906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.428942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.429338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.429348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.431511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.431549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.431585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.431621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.431892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.431937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.431973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.432009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.432045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.432313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.432323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.434034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.434072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.434108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.203 [2024-07-15 17:43:22.434144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.203 [2024-07-15 17:43:22.434588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:11.203 (identical accel_dpdk_cryptodev_task_alloc_resources *ERROR* entries repeated continuously from 17:43:22.434629 through 17:43:22.808023; duplicate log lines condensed)
00:31:11.737 [2024-07-15 17:43:22.808061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:11.737 [2024-07-15 17:43:22.808097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.808135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.808415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.808456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.808495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.808532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.808571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.808846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.808856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.810490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.810528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.810564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.810599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.811078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.811119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.811155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.811191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.811226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.811610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.811620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.813539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.813578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.813614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.813650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.737 [2024-07-15 17:43:22.813925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.813966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.814002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.814038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.814075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.814428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.814438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.816184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.816223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.816259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.816295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.816650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.816694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.816735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.816771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.816808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.817197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.817207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.818806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.818857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.818893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.818929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.819259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.819303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.737 [2024-07-15 17:43:22.819340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.737 [2024-07-15 17:43:22.819376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.819411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.819681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.819692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.821603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.821641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.821677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.821718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.822072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.822113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.822149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.822185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.822221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.822567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.822577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.824208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.824248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.824284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.824324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.824594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.824634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.824671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.824706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.738 [2024-07-15 17:43:22.824748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.825123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.825133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.827591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.827630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.827669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.827706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.827981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.828023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.828059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.828095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.828131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.828399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.828408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.830093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.830132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.830168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.830204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.830686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.830735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.830772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.830807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.830845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.831224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.738 [2024-07-15 17:43:22.831236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.833336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.833379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.833414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.833450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.833726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.833767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.833804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.833839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.833876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.834150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.834160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.835868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.835907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.835943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.835979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.836393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.836434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.836470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.836506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.836542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.836925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.836937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.838536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.738 [2024-07-15 17:43:22.838575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.838613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.838649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.838967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.839009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.839046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.839082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.839118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.839394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.839404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.841393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.841432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.841468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.841504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.841835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.841880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.841916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.841952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.841988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.842336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.842346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.843914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.843953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.843989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.738 [2024-07-15 17:43:22.844024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.738 [2024-07-15 17:43:22.844294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.844335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.844371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.844407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.844443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.844809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.844819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.846958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.846997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.847033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.847069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.847337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.847378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.847415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.847454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.847491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.847766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.847776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.849485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.849524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.849563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.849599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.850060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.739 [2024-07-15 17:43:22.850103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.850140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.850176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.850212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.850579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.850590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.852563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.852603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.852639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.852675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.852981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.853026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.853063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.853099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.853135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.853409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.853419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.855308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.855348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.855384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.855420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.855868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.855909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.855946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.739 [2024-07-15 17:43:22.855982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.856018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.856402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.856413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.858003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.858042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.858078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.858113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.858408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.858450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.858486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.858522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.858559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.858831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.858842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.860850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.860890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.860931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.860967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.861251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.861293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.861333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.861369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.861417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.739 [2024-07-15 17:43:22.861686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.861696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.863309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.863348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.863387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.863423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.863694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.863742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.863779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.863819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.863855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.864275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.864285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.866313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.866352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.866387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.866424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.866690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.866735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.866771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.866813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.866849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.867134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.867144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.739 [2024-07-15 17:43:22.868746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.868784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.868821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.868857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.739 [2024-07-15 17:43:22.869319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.869360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.869397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.869432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.869468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.869879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.869893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.871636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.871675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.871718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.871755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.872028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.872069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.872105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.872140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.872176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.872449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.872459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.874593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.874633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.740 [2024-07-15 17:43:22.874669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.874705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.874980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.875023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.875060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.875096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.875132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.875399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.875409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.877029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.877068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.877104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.877139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.877406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.877447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.877483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.877522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.877562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.877945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.877956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.880064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.880103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.880138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.880174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.740 [2024-07-15 17:43:22.880699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.880746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.880783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.880819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.880855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.881203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.881213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.883377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.883415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.883456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.883492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.883891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.883932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.883968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.884004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.884040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.884430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.884442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.886591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.886630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.886668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.886703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.887105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:11.740 [2024-07-15 17:43:22.887154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.740 [2024-07-15 17:43:22.887192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:11.740 ... last message repeated continuously from 17:43:22.887192 through 17:43:23.281026 (duplicate occurrences omitted) ... 
00:31:12.010 [2024-07-15 17:43:23.281026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:12.010 [2024-07-15 17:43:23.281451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.281492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.281528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.281564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.281600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.282126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.282137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.284231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.284270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.284309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.284347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.284869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.284912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.284949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.284988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.285027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.285423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.285433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.287646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.287685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.287725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.287762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.288245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.288289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:12.010 [2024-07-15 17:43:23.288328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.288363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.288399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.288783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.288795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.290940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.290979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.291016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.291052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.291418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.291459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.291495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.291531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.291567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.291937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.291947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.294489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.294528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.294565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.294601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.295016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.295058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.295103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.295139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:12.010 [2024-07-15 17:43:23.295176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.295610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.295620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.297782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.297822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.297862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.297898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.298286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.298327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.298363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.298399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.298436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.298949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.298959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.301129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.301170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.301209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.010 [2024-07-15 17:43:23.301246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.301724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.301770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.301809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.301845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.301882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.302215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:12.273 [2024-07-15 17:43:23.302225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.304390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.304429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.304465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.304504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.304981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.305022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.305059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.305095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.305131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.305495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.305506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.307780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.307819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.307856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.307892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.308234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.308275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.308312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.308348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.308383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.308771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.308784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.310837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:12.273 [2024-07-15 17:43:23.310876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.310915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.310951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.311225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.311266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.311303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.311339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.311375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.311765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.311778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.313803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.313846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.313882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.313917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.314187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.314232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.314270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.314307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.314343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.314778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.314788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.316759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.316797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.316833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:12.273 [2024-07-15 17:43:23.316870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.317163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.273 [2024-07-15 17:43:23.317203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.317240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.317275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.317312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.317820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.317834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.319776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.319815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.319851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.319887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.320221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.320262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.320299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.320335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.320371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.320869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.320880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.322931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.322970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.323011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.323046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.323418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:12.274 [2024-07-15 17:43:23.323460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.323496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.323532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.323568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.323941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.323953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.326015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.326054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.326089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.326126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.326517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.326563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.326606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.326642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.326678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.326956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.326966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.328992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.329030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.329067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.329102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.329510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.329551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.329587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:12.274 [2024-07-15 17:43:23.329626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.329663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.329936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.329947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.332442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.332481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.332517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.332555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.333059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.333100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.333136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.333172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.333208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.333525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.333536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.335639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.335678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.335722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.335758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.336205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.336248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.336284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.336321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.336357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:12.274 [2024-07-15 17:43:23.336739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.336750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.338852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.338892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.338927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.338963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.339307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.339353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.339390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.339426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.339461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.339857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.339867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.341866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.341917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.341953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.341989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.342263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.342306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.342342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.342378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.342414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.342790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.342802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:12.274 [2024-07-15 17:43:23.344762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.344801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.346574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.346613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.274 [2024-07-15 17:43:23.347088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.347130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.347167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.347203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.347239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.347553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.347564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.349682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.349726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.349769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.350723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.351000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.351043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.351079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.351114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.351150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.351527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.351537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.353871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.354251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:12.275 [2024-07-15 17:43:23.354626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.355007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.355441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.355824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.356200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.356576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.357745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.358074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.358084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.361443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.362998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.363374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.363752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.364264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.364953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.366596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.368478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.370385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.370832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.370843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.372954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.373333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.374049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.375688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:12.275 [2024-07-15 17:43:23.375967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.377873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.378917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.380552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.382454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.382732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.382743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.386667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.388556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.390451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.391493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.391785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.393658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.395565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.396562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.396942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.397287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.397298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.400418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.402156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.404064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.405945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.406327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.406716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:12.275 [2024-07-15 17:43:23.407092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.407467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.409061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.409410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.409420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.412882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.414117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.414506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.414885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.415318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.416397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.418029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.419925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.421648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.422093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.422104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.424225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.424604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.425706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.427345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.427617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.429533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.430685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.275 [2024-07-15 17:43:23.432328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:12.275 [2024-07-15 17:43:23.434214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:12.275 - 00:31:12.806 (previous message repeated several hundred times, timestamps 2024-07-15 17:43:23.434488 through 17:43:23.920176, all from accel_dpdk_cryptodev.c:468 accel_dpdk_cryptodev_task_alloc_resources)
00:31:12.806 [2024-07-15 17:43:23.922080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.923855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.924303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.924313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.926467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.926848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.928162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.929803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.931943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.933300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.934956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.936832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.937104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.937115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.941216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.943126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.945016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.946380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.948561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.950386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.950763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.951136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.951606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.951616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:12.806 [2024-07-15 17:43:23.954485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.956142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.957900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.959277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.960166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.960552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.961595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.963226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.963498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.963508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.966704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.967086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.967461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.967840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.969774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.971678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.973332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.974693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.975001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.975011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.977356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.978502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.980141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.982048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:12.806 [2024-07-15 17:43:23.983419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.985064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.986948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.988623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.989045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.989055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.992924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.994752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.996204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.997874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:23.999828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.000205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.000579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.000962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.001345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.001355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.004996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.005516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.005902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.006277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.008300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.010202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.012089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.013137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:12.806 [2024-07-15 17:43:24.013452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.013462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.015767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.016146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.017779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.019708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.021040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.022740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.024664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.026572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.026964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.026975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.029783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.030162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.030540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.030919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.806 [2024-07-15 17:43:24.031678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.032059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.032432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.032811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.033193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.033203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.035995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.036380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:12.807 [2024-07-15 17:43:24.036760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.037135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.037883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.038276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.038650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.039030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.039434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.039445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.042088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.042465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.042844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.043219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.044071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.044458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.044837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.045212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.045748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.045759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.048222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.048600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.048983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.049358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.050117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.050493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:12.807 [2024-07-15 17:43:24.050870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.051245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.051625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.051636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.054143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.054520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.054898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.055276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.056142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.056518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.056897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.057272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.057792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.057804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.060256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.060637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.061016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.061398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.062206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.062583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.062960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.063337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.063803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.063814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:12.807 [2024-07-15 17:43:24.066340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.066720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.067094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.067472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.068229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.068605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.068983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.069359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.069703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.069719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.072227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.072604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.072983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.073358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.074126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.074504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.074882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.075256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.075633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.075644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.078221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.078614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.078998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.079373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:12.807 [2024-07-15 17:43:24.080190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.080567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.080962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.081337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.081814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.081826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.084470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.084854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.085231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.085609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.086403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.086783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.087158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.087536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.087936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.087947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.091250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.091637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.092015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.807 [2024-07-15 17:43:24.092389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.808 [2024-07-15 17:43:24.093170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.808 [2024-07-15 17:43:24.093550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.808 [2024-07-15 17:43:24.093928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.808 [2024-07-15 17:43:24.094301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:12.808 [2024-07-15 17:43:24.094762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.808 [2024-07-15 17:43:24.094773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.808 [2024-07-15 17:43:24.097381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.808 [2024-07-15 17:43:24.097425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.808 [2024-07-15 17:43:24.098146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.808 [2024-07-15 17:43:24.099494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.808 [2024-07-15 17:43:24.101108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:12.808 [2024-07-15 17:43:24.101870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.102407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.104049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.104318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.104329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.107851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.108344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.108722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.108759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.109658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.111321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.113207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.115102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.115497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.115508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.118589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.118628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:13.071 [2024-07-15 17:43:24.119005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.119043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.119832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.119871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.120862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.120899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.121195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.121205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.124735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.124775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.126492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.126529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.071 [2024-07-15 17:43:24.127377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.127416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.127804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.127842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.128184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.128193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.131467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.131507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.133357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.133395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.135266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.135306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:13.072 [2024-07-15 17:43:24.135680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.135720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.136043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.136053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.142406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.142448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.144343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.144380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.145604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.145643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.146020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.146057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.146531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.146542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.149301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.149341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.150979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.151016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.153182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.153221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.154168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.154206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.154597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.154607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:13.072 [2024-07-15 17:43:24.158297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.158337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.160256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.160294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.161928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.161971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.163661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.163699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.163976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.163986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.167489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.167528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.169209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.169246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.171042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.171081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.172841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.172879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.173148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.173158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.176275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.176314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.176690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.176732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:13.072 [2024-07-15 17:43:24.178715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.178754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.180644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.180688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.180964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.180975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.183003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.183042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.183415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.183452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.184947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.184987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.186637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.186675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.186953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.186963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.189940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.072 [2024-07-15 17:43:24.189980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.073 [2024-07-15 17:43:24.191997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.073 [2024-07-15 17:43:24.192027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.073 [2024-07-15 17:43:24.193038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.073 [2024-07-15 17:43:24.193078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.073 [2024-07-15 17:43:24.193113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.073 [2024-07-15 17:43:24.194197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:13.073 [2024-07-15 17:43:24.194623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:13.073 [... the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 (accel_dpdk_cryptodev_task_alloc_resources) repeats several hundred times between 17:43:24.194623 and 17:43:24.486267; duplicate lines collapsed ...]
00:31:13.342 [2024-07-15 17:43:24.486267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:13.342 [2024-07-15 17:43:24.486544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.342 [2024-07-15 17:43:24.488295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.342 [2024-07-15 17:43:24.489767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.342 [2024-07-15 17:43:24.491461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.342 [2024-07-15 17:43:24.493373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.342 [2024-07-15 17:43:24.493653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.342 [2024-07-15 17:43:24.493663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.342 [2024-07-15 17:43:24.496722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.342 [2024-07-15 17:43:24.497099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.342 [2024-07-15 17:43:24.498747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.342 [2024-07-15 17:43:24.500622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.342 [2024-07-15 17:43:24.500898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.342 [2024-07-15 17:43:24.502095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.342 [2024-07-15 17:43:24.503785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.342 [2024-07-15 17:43:24.505663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.342 [2024-07-15 17:43:24.507523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.342 [2024-07-15 17:43:24.507910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.342 [2024-07-15 17:43:24.507920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.342 [2024-07-15 17:43:24.510423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.512057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.513933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.515818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.516275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.517926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:13.343 [2024-07-15 17:43:24.519794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.521698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.522636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.522916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.522926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.526829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.528717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.530623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.530661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.531107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.532759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.534653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.536551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.537518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.537874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.537885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.541641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.543538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.545380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.546780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.547069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.548983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.550675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.552070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:13.343 [2024-07-15 17:43:24.552506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.552889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.552900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.556516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.556556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.557715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.557753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.558042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.559912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.561806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.562794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.564411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.564854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.564865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.568592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.568631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.570523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.570562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.571030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.572670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.572711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.574595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.574632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.574908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:13.343 [2024-07-15 17:43:24.574918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.577260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.577299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.578545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.578582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.578883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.580787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.580826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.582413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.582451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.582728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.582739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.584675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.584718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.585938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.585976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.586501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.586883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.586921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.588553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.588590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.588862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.588874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.343 [2024-07-15 17:43:24.592376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:13.344 [2024-07-15 17:43:24.592416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.593642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.593680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.594192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.594573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.594612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.596122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.596165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.596547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.596558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.599772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.599816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.601696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.601736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.602011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.603056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.603094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.604637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.604675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.605063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.605074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.608717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.608757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.610667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:13.344 [2024-07-15 17:43:24.610706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.611082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.612760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.612798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.614719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.614761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.615033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.615044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.617259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.617299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.619038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.619076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.619347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.621253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.621291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.622389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.622427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.622725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.622736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.624794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.624835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.626421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.626458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.626912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:13.344 [2024-07-15 17:43:24.627374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.627413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.629032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.629070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.629340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.629351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.632883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.632923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.634242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.635014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.344 [2024-07-15 17:43:24.635405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.637214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.637257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.637630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.637670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.637979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.637990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.639632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.641322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.643214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.643251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.643580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.645223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.645262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:13.607 [2024-07-15 17:43:24.645298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.645672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.645995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.646006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.647884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.647922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.647958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.647994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.648264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.648305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.648342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.648379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.648415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.648747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.648757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.650401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.650440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.650476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.650512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.650974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.651018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.651055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.607 [2024-07-15 17:43:24.651090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.651126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:13.608 [2024-07-15 17:43:24.651432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.651443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.653458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.653496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.653532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.653567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.653839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.653880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.653917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.653953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.653989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.654336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.654346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.656012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.656055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.656092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.656127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.656591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.656633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.656669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.656705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.656747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.657038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.657048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:13.608 [2024-07-15 17:43:24.659033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.659074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.659110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.659146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.659416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.659456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.659494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.659530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.659566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.659884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.659895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.661482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.661522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.661558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.661593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.662048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.662090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.662127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.662163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.662199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.662475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.662486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.664658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.664698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:13.608 [2024-07-15 17:43:24.664744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.664783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.665056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.665097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.665134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.665170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.665206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.665478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.665493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.667075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.667114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.667149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.667185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.667594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.667638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.667674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.667715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.667752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.668023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.668034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.670230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.670269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.670304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.670347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:13.608 [2024-07-15 17:43:24.670617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.670658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.670695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.670736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.670772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.671040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.671050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.672644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.672683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.672724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.672760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.673137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.673177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.673214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.673253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.673289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.673559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.673569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.675582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.675620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.675665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.675700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.676002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.676044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:13.608 [2024-07-15 17:43:24.676080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.676116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.608 [2024-07-15 17:43:24.676152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.609 [2024-07-15 17:43:24.676420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.609 [2024-07-15 17:43:24.676430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.609 [2024-07-15 17:43:24.678089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.609 [2024-07-15 17:43:24.678127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.609 [2024-07-15 17:43:24.678163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.609 [2024-07-15 17:43:24.678198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.609 [2024-07-15 17:43:24.678583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.609 [2024-07-15 17:43:24.678629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.609 [2024-07-15 17:43:24.678665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.609 [2024-07-15 17:43:24.678701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.609 [2024-07-15 17:43:24.678752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.609 [2024-07-15 17:43:24.679020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.609 [2024-07-15 17:43:24.679030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.609 [2024-07-15 17:43:24.681053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.609 [2024-07-15 17:43:24.681091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.609 [2024-07-15 17:43:24.681127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.609 [2024-07-15 17:43:24.681163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.609 [2024-07-15 17:43:24.681489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.609 [2024-07-15 17:43:24.681533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.609 [2024-07-15 17:43:24.681569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.609 [2024-07-15 17:43:24.681605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:13.609 [2024-07-15 17:43:24.681641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:13.609 (same *ERROR* from accel_dpdk_cryptodev.c:468 repeated continuously between 2024-07-15 17:43:24.681641 and 17:43:25.093368)
00:31:13.878 [2024-07-15 17:43:25.093368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:13.878 [2024-07-15 17:43:25.093407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.095316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.095355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.095812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.095824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.100801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.100843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.102485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.102523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.102800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.104707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.104751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.105168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.105205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.105581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.105591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.109451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.109491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.110555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.110594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.110948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.112842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.112880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.114802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:13.878 [2024-07-15 17:43:25.114840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.115341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.115352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.120356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.120399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.122034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.122072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.122345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.124246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.124285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.124818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.124857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.125242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.125253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.129074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.129115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.130165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.131803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.132075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.133996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.134035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.134601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.134639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.135055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:13.878 [2024-07-15 17:43:25.135067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.139056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.140701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.142613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.142651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.142926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.143726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.143765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.143801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.144174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.144629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.144639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.146367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.146406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.146442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.146477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.146901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.146944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.146981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.147017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.147053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.147343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.147354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.151789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:13.878 [2024-07-15 17:43:25.151830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.151866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.151909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.152274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.152318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.152354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.152390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.152426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.152693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.152704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.154300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.154339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.154375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.154411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.154812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.878 [2024-07-15 17:43:25.154869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.154906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.154942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.154978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.155406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.155416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.159490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.159532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.159567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:13.879 [2024-07-15 17:43:25.159603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.159913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.159956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.159992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.160028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.160064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.160334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.160344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.162290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.162330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.162366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.162402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.162754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.162796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.162833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.162869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.162905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.163205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.163214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.168336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.168377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.168413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.168449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.168923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:13.879 [2024-07-15 17:43:25.168966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.169003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.169039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.169075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.169436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.169448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.171304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.171343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.171383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:13.879 [2024-07-15 17:43:25.171419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.171871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.171915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.171952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.171988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.172023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.172334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.172344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.176528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.176570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.176606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.176642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.176948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.176993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.177031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:14.143 [2024-07-15 17:43:25.177067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.177103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.177372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.177382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.179168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.179207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.179243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.179278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.179688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.179739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.179776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.179812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.179849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.180363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.180373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.184462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.184503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.184539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.184574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.184852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.184894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.184935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.184971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.185019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:14.143 [2024-07-15 17:43:25.185286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.185297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.187941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.187981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.188020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.188055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.188365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.188406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.188443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.188478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.188514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.188787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.188798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.194435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.194477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.194513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.194549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.194946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.195006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.195043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.195079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.195115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.195542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.195552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:14.143 [2024-07-15 17:43:25.197169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.197208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.197243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.197279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.197557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.197597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.197634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.197670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.197706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.197979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.197990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.202323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.202365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.202401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.202436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.202704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.202751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.202787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.202823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.202859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.203126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.203136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.204919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.204957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:14.143 [2024-07-15 17:43:25.204996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.205034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.205544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.205585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.143 [2024-07-15 17:43:25.205621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.205656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.205692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.206076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.206087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.208897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.208943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.208979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.209015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.209429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.209473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.209509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.209569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.209606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.210092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.210103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.212374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.212413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.212452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.212489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:14.144 [2024-07-15 17:43:25.212899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.212940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.212977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.213012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.213048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.213435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.213445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.216213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.216254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.216293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.216329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.216670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.216716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.216754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.216790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.216828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.217221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.217236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.219587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.219628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.219667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.219705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.220223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.220266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:14.144 [2024-07-15 17:43:25.220302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.220338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.220374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.220771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.220782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.223583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.223626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.223662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.224041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.224476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.224522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.224558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.224594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.224640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.225179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.225189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.227340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.227379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.227415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.227451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.227878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.227920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.227956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.227993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:14.144 [2024-07-15 17:43:25.228033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.228524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.228534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.231578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.231978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.232017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.232393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.232774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.232818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.232855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.232902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.232938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.233413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.233422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.235552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.235933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.235971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.236345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.236690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.236737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.237111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.237149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.237522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.144 [2024-07-15 17:43:25.237932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:14.144 [2024-07-15 17:43:25.237943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:14.411 ... (the same accel_dpdk_cryptodev.c:468 "Failed to get src_mbufs!" error repeats for hundreds of consecutive entries through [2024-07-15 17:43:25.701761]) ...
00:31:14.411 [2024-07-15 17:43:25.701798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.411 [2024-07-15 17:43:25.702175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.411 [2024-07-15 17:43:25.702185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.706143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.706184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.706220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.706256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.706610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.706652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.706688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.706730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.706767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.707039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.707049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.710920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.710961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.711000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.711040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.711311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.711354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.711390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.711427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.711467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.711881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:14.674 [2024-07-15 17:43:25.711891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.717096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.717136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.717172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.717208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.717595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.717640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.717677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.717719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.717756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.718051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.718061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.723199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.723239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.723275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.723310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.723856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.723900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.723936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.723973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.724009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.724379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.724389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.728410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:14.674 [2024-07-15 17:43:25.728451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.728487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.728523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.728856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.728902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.728938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.728974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.729009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.729278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.729288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.733622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.733663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.733702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.733743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.734015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.734057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.734103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.734139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.734175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.734535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.734545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.739786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.739833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.739869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:14.674 [2024-07-15 17:43:25.739905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.740296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.740341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.740377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.740413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.740456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.740728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.740739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.674 [2024-07-15 17:43:25.745940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.745981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.746017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.746056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.746576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.746617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.746653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.746688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.746728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.747138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.747148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.751124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.751166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.751202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.751237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.751577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:14.675 [2024-07-15 17:43:25.751618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.751654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.751691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.751731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.752001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.752011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.755887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.755927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.755967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.756003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.756276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.756325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.756361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.756397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.756432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.756821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.756831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.761128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.761174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.761210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.761246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.761641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.761687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.761727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:14.675 [2024-07-15 17:43:25.761763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.761799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.762098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.762109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.767232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.767273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.767309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.767346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.767789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.767831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.767869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.767905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.767941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.768304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.768314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.772316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.772357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.772394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.772430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.772699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.772743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.772780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.772816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.772852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:14.675 [2024-07-15 17:43:25.773126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.773136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.776207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.776248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.776288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.776323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.776594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.776635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.776672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.776712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.776749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.777108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.777118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.781030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.781071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.781107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.782763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.783049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.783093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.783131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.783167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.783203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.783472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.783482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:14.675 [2024-07-15 17:43:25.788418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.788459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.788495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.788531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.789001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.789042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.789079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.675 [2024-07-15 17:43:25.789122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.789158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.789533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.789543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.793875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.795687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.795728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.796103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.796497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.796540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.796576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.796623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.796659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.797135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.797145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.801988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.803903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:14.676 [2024-07-15 17:43:25.803941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.805461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.805898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.805942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.806315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.806353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.806730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.807064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.807074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.811736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.813442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.813480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.813857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.814246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.814299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.814673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.814714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.816010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.816332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.816342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.821492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.821877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.821915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.822290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:14.676 [2024-07-15 17:43:25.822751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.822795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.823986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.824024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.825683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.825956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.825966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.831438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.831820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.831858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.832231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.832538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.832580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.834232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.834270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.836068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.836342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.836352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.841726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.842104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.842145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.843471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.843772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.843821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:14.676 [2024-07-15 17:43:25.845701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.845744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.847242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.847525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.847535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.851495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.852793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.852831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.854513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.854789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.854845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.856319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.856356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.858089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.858362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.858372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.862573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.864249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.864287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.866175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.866504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.866546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.868340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.868379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:14.676 [2024-07-15 17:43:25.870261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.870535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.870548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.874210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.875998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.876035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.877328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.877603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.676 [2024-07-15 17:43:25.877646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.879385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.879423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.881310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.881705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.881718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.885634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.887106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.887145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.887189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.887459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.887502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.889392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.889430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.891309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.891693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:14.677 [2024-07-15 17:43:25.891703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.897233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.897275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.897311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.898956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.899230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.899272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.900546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.902181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.902223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.902713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.902723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.907701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.909337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.911248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.913038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.913438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.913821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.914196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.914746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.916386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.916659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.916669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:14.677 [2024-07-15 17:43:25.922113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:14.677 [2024-07-15 17:43:25.923081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:15.207 [... same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeated continuously between 17:43:25.923081 and 17:43:26.238642; duplicate log lines condensed ...]
00:31:15.207 [2024-07-15 17:43:26.238652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.207 [2024-07-15 17:43:26.240367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.240406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.240442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.240477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.240881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.240934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.240971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.241007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.241043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.241485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.241495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.243231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.243269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.243305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.243341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.243610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.243650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.243687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.243728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.243765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.244036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.244047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.246530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:15.208 [2024-07-15 17:43:26.246568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.246603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.246639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.246936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.246978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.247014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.247050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.247090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.247361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.247371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.248950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.248988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.249028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.249064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.249461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.249502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.249539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.249574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.249610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.250007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.250020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.251924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.251962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.251998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:15.208 [2024-07-15 17:43:26.253905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.254290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.254334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.254370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.254406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.254442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.254755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.254766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.256701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.256744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.256791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.256828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.257329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.257370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.257412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.257448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.257484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.257810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.257821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.259452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.261124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.261162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.263042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.263404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:15.208 [2024-07-15 17:43:26.263446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.263482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.263526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.263563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.264052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.264063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.265957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.267870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.267909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.269060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.269332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.269375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.271293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.271331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.273238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.273620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.273630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.276141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.277781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.277819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.279681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.279961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.280003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.281392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:15.208 [2024-07-15 17:43:26.281430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.208 [2024-07-15 17:43:26.283104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.283377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.283387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.285424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.285804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.285843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.286216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.286749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.286791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.287164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.287201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.287574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.288031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.288041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.290268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.290643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.290680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.291058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.291551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.291593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.291971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.292009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.292383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:15.209 [2024-07-15 17:43:26.292747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.292758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.294952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.295332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.295371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.295749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.296172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.296212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.296587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.296624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.297001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.297435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.297445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.299722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.300098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.300135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.300509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.300930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.300972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.301346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.301384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.301763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.302181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.302192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:15.209 [2024-07-15 17:43:26.304414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.304793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.304846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.305220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.305653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.305694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.306071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.306109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.306484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.306964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.306975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.309293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.309675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.309718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.310093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.310486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.310528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.310906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.310944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.311318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.311693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.311704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.314042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.314419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:15.209 [2024-07-15 17:43:26.314456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.314492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.314982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.315023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.315398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.315437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.315813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.316189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.316199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.318685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.318727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.318779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.319153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.319573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.319614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.319994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.320372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.320410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.320783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.320795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.323292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.323669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.324048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.209 [2024-07-15 17:43:26.324423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:15.210 [2024-07-15 17:43:26.324875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.325255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.325629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.326009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.326384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.326801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.326812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.329343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.329723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.330099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.330472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.330919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.331300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.331674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.332049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.332425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.332904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.332915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.335358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.335737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.336112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.336486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.336960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.337342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:15.210 [2024-07-15 17:43:26.337721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.338096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.338472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.338899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.338909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.341376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.341756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.342132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.342777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.343118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.344123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.344990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.345364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.345740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.346123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.346133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.348603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.348985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.349361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.349738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.350300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.350682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.351060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.351434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:15.210 [2024-07-15 17:43:26.351811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.352285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.352295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.355004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.355383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.355767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.356141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.356613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.356996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.357372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.357752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.358125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.358563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.358574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.361815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.363695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.365581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.366399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.366891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.367270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.367645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.369272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.371163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.371435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:15.210 [2024-07-15 17:43:26.371445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.373902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.374281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.374656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.375628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.375963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.377869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.379767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.380812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.382448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.382723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.382734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.385683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.387348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.389263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.390887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.391266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.392925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.394836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.396447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.396824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.397190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.397201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.400616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:15.210 [2024-07-15 17:43:26.402284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.404161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.406012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.210 [2024-07-15 17:43:26.406356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.211 [2024-07-15 17:43:26.406740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.211 [2024-07-15 17:43:26.407116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.211 [2024-07-15 17:43:26.407491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.211 [2024-07-15 17:43:26.409256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.211 [2024-07-15 17:43:26.409530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.211 [2024-07-15 17:43:26.409542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.211 [2024-07-15 17:43:26.413025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.211 [2024-07-15 17:43:26.413852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.211 [2024-07-15 17:43:26.414229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.211 [2024-07-15 17:43:26.414603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.211 [2024-07-15 17:43:26.414956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.211 [2024-07-15 17:43:26.416597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.211 [2024-07-15 17:43:26.418483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.211 [2024-07-15 17:43:26.420394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.211 [2024-07-15 17:43:26.421461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.211 [2024-07-15 17:43:26.421785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.211 [2024-07-15 17:43:26.421796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.211 [2024-07-15 17:43:26.424118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.211 [2024-07-15 17:43:26.425115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:15.211 [2024-07-15 17:43:26.426748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:15.211 [2024-07-15 17:43:26.428628 .. 17:43:26.521729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! [this error line is repeated 80 times across this span; the entries are identical except for their timestamps]
00:31:16.044 
00:31:16.044 Latency(us)
00:31:16.044 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:31:16.044 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:31:16.044 Verification LBA range: start 0x0 length 0x100
00:31:16.044 crypto_ram : 5.82 43.98 2.75 0.00 0.00 2810788.63 377487.36 2232660.28
00:31:16.044 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:31:16.044 Verification LBA range: start 0x100 length 0x100
00:31:16.044 crypto_ram : 6.17 41.52 2.59 0.00 0.00 2993745.92 203262.42 2735976.76
00:31:16.044 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:31:16.044 Verification LBA range: start 0x0 length 0x100
00:31:16.044 crypto_ram1 : 5.82 43.97 2.75 0.00 0.00 2701582.97 377487.36 2013265.92
00:31:16.044 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:31:16.044 Verification LBA range: start 0x100 length 0x100
00:31:16.044 crypto_ram1 : 6.17 41.51 2.59 0.00 0.00 2870305.08 203262.42 2490771.30
00:31:16.044 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:31:16.044 Verification LBA range: start 0x0 length 0x100
00:31:16.044 crypto_ram2 : 5.61 307.81 19.24 0.00 0.00 369092.31 34683.67 564617.85
00:31:16.044 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:31:16.044 Verification LBA range: start 0x100 length 0x100
00:31:16.044 crypto_ram2 : 5.74 241.79 15.11 0.00 0.00 463911.63 13208.02 593655.34
00:31:16.044 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:31:16.044 Verification LBA range: start 0x0 length 0x100
00:31:16.044 crypto_ram3 : 5.73 320.50 20.03 0.00 0.00 342591.11 15728.64 451694.28
00:31:16.044 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:31:16.044 Verification LBA range: start 0x100 length 0x100
00:31:16.044 crypto_ram3 : 5.86 255.28 15.95 0.00 0.00 424971.04 35893.56 392006.10
00:31:16.044 ===================================================================================================================
00:31:16.044 Total : 1296.36 81.02 0.00 0.00 730529.10 13208.02 2735976.76
00:31:16.304 
00:31:16.304 real 0m9.062s
00:31:16.304 user 0m17.447s
00:31:16.304 sys 0m0.302s
00:31:16.304 17:43:27 blockdev_crypto_qat.bdev_verify_big_io
-- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:16.304 17:43:27 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:31:16.304 ************************************ 00:31:16.304 END TEST bdev_verify_big_io 00:31:16.304 ************************************ 00:31:16.304 17:43:27 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:31:16.304 17:43:27 blockdev_crypto_qat -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:16.304 17:43:27 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:31:16.304 17:43:27 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:16.304 17:43:27 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:31:16.304 ************************************ 00:31:16.305 START TEST bdev_write_zeroes 00:31:16.305 ************************************ 00:31:16.305 17:43:27 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:16.564 [2024-07-15 17:43:27.617371] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:31:16.564 [2024-07-15 17:43:27.617421] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2974074 ] 00:31:16.564 [2024-07-15 17:43:27.703407] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:16.564 [2024-07-15 17:43:27.766843] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:16.564 [2024-07-15 17:43:27.787850] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:31:16.564 [2024-07-15 17:43:27.795876] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:16.564 [2024-07-15 17:43:27.803894] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:16.825 [2024-07-15 17:43:27.890165] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:31:18.762 [2024-07-15 17:43:30.038297] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:31:18.762 [2024-07-15 17:43:30.038360] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:18.762 [2024-07-15 17:43:30.038369] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:18.762 [2024-07-15 17:43:30.046314] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:31:18.762 [2024-07-15 17:43:30.046325] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:18.762 [2024-07-15 17:43:30.046330] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:18.762 [2024-07-15 17:43:30.054333] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:31:18.762 [2024-07-15 17:43:30.054344] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:18.762 [2024-07-15 17:43:30.054349] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:19.023 [2024-07-15 17:43:30.062354] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:31:19.023 [2024-07-15 17:43:30.062365] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:31:19.023 [2024-07-15 17:43:30.062371] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:19.023 Running I/O for 1 seconds... 00:31:19.962 00:31:19.962 Latency(us) 00:31:19.962 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:19.962 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:31:19.962 crypto_ram : 1.02 2358.37 9.21 0.00 0.00 53874.58 4814.38 64931.05 00:31:19.962 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:31:19.962 crypto_ram1 : 1.02 2363.94 9.23 0.00 0.00 53483.27 4814.38 60091.47 00:31:19.962 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:31:19.962 crypto_ram2 : 1.02 18258.82 71.32 0.00 0.00 6908.44 2142.52 9124.63 00:31:19.962 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:31:19.962 crypto_ram3 : 1.02 18291.28 71.45 0.00 0.00 6877.35 2117.32 7208.96 00:31:19.962 =================================================================================================================== 00:31:19.962 Total : 41272.42 161.22 0.00 0.00 12265.52 2117.32 64931.05 00:31:20.222 00:31:20.222 real 0m3.835s 00:31:20.222 user 0m3.567s 00:31:20.222 sys 0m0.235s 00:31:20.222 17:43:31 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:20.222 17:43:31 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:31:20.222 ************************************ 00:31:20.222 END TEST bdev_write_zeroes 00:31:20.222 ************************************ 00:31:20.222 17:43:31 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:31:20.222 17:43:31 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:20.222 17:43:31 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:31:20.222 17:43:31 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:20.222 17:43:31 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:31:20.222 ************************************ 00:31:20.222 START TEST bdev_json_nonenclosed 00:31:20.222 ************************************ 00:31:20.222 17:43:31 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:20.494 [2024-07-15 17:43:31.531581] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:31:20.494 [2024-07-15 17:43:31.531635] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2974695 ] 00:31:20.494 [2024-07-15 17:43:31.620771] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:20.494 [2024-07-15 17:43:31.696215] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:20.494 [2024-07-15 17:43:31.696275] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:31:20.494 [2024-07-15 17:43:31.696286] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:31:20.494 [2024-07-15 17:43:31.696293] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:31:20.494 00:31:20.494 real 0m0.278s 00:31:20.494 user 0m0.169s 00:31:20.494 sys 0m0.107s 00:31:20.494 17:43:31 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:31:20.494 17:43:31 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:20.494 17:43:31 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:31:20.494 ************************************ 00:31:20.494 END TEST bdev_json_nonenclosed 00:31:20.494 ************************************ 00:31:20.802 17:43:31 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234 00:31:20.802 17:43:31 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # true 00:31:20.802 17:43:31 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:20.802 17:43:31 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:31:20.803 17:43:31 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:20.803 17:43:31 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:31:20.803 ************************************ 00:31:20.803 START TEST bdev_json_nonarray 00:31:20.803 ************************************ 00:31:20.803 17:43:31 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:20.803 [2024-07-15 17:43:31.889058] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:31:20.803 [2024-07-15 17:43:31.889108] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2974725 ] 00:31:20.803 [2024-07-15 17:43:31.961260] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:20.803 [2024-07-15 17:43:32.026487] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:20.803 [2024-07-15 17:43:32.026543] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:31:20.803 [2024-07-15 17:43:32.026555] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:31:20.803 [2024-07-15 17:43:32.026561] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:31:21.065 00:31:21.065 real 0m0.249s 00:31:21.065 user 0m0.161s 00:31:21.065 sys 0m0.084s 00:31:21.065 17:43:32 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:31:21.065 17:43:32 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:21.065 17:43:32 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:31:21.065 ************************************ 00:31:21.065 END TEST bdev_json_nonarray 00:31:21.065 ************************************ 00:31:21.065 17:43:32 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234 00:31:21.065 17:43:32 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # true 00:31:21.065 17:43:32 blockdev_crypto_qat -- bdev/blockdev.sh@787 -- # [[ crypto_qat == bdev ]] 00:31:21.065 17:43:32 blockdev_crypto_qat -- bdev/blockdev.sh@794 -- # [[ crypto_qat == gpt ]] 00:31:21.065 17:43:32 blockdev_crypto_qat -- bdev/blockdev.sh@798 -- # [[ crypto_qat == crypto_sw ]] 00:31:21.065 17:43:32 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:31:21.065 17:43:32 blockdev_crypto_qat -- bdev/blockdev.sh@811 -- # cleanup 00:31:21.065 17:43:32 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:31:21.065 17:43:32 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:31:21.065 17:43:32 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]] 00:31:21.065 17:43:32 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]] 00:31:21.065 17:43:32 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]] 00:31:21.065 17:43:32 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]] 00:31:21.065 00:31:21.065 real 1m8.923s 00:31:21.065 user 2m46.424s 00:31:21.065 sys 0m6.212s 00:31:21.065 17:43:32 blockdev_crypto_qat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:21.065 17:43:32 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:31:21.065 ************************************ 00:31:21.065 END TEST blockdev_crypto_qat 00:31:21.065 ************************************ 00:31:21.065 17:43:32 -- common/autotest_common.sh@1142 -- # return 0 00:31:21.065 17:43:32 -- spdk/autotest.sh@360 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:31:21.065 17:43:32 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:31:21.065 17:43:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:21.065 17:43:32 -- common/autotest_common.sh@10 -- # set +x 00:31:21.065 ************************************ 00:31:21.065 START TEST chaining 00:31:21.065 ************************************ 00:31:21.065 17:43:32 chaining -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:31:21.065 * Looking for test storage... 
00:31:21.065 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:31:21.065 17:43:32 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@7 -- # uname -s 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:31:21.065 17:43:32 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:31:21.065 17:43:32 chaining -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:31:21.065 17:43:32 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:31:21.065 17:43:32 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:21.065 17:43:32 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:21.065 17:43:32 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:21.065 17:43:32 chaining -- paths/export.sh@5 -- # 
export PATH 00:31:21.065 17:43:32 chaining -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@47 -- # : 0 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:31:21.065 17:43:32 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:31:21.065 17:43:32 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:31:21.065 17:43:32 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122) 00:31:21.065 17:43:32 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:31:21.065 17:43:32 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:31:21.065 17:43:32 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:31:21.065 17:43:32 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:31:21.065 17:43:32 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:31:21.065 17:43:32 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:31:21.065 17:43:32 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@296 -- # e810=() 00:31:31.066 
17:43:40 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@297 -- # x722=() 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@298 -- # mlx=() 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@335 -- # (( 2 == 0 )) 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:4b:00.0 (0x8086 - 0x159b)' 00:31:31.066 Found 0000:4b:00.0 (0x8086 - 0x159b) 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:4b:00.1 (0x8086 - 0x159b)' 00:31:31.066 Found 0000:4b:00.1 (0x8086 - 0x159b) 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@366 -- # (( 0 > 0 )) 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@372 -- # [[ '' == e810 ]] 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:31:31.066 
17:43:40 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:4b:00.0: cvl_0_0' 00:31:31.066 Found net devices under 0000:4b:00.0: cvl_0_0 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:4b:00.1: cvl_0_1' 00:31:31.066 Found net devices under 0000:4b:00.1: cvl_0_1 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@404 -- # (( 2 == 0 )) 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@414 -- # is_hw=yes 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@416 -- # [[ yes == yes ]] 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@417 -- # [[ tcp == tcp ]] 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@418 -- # nvmf_tcp_init 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@229 -- # NVMF_INITIATOR_IP=10.0.0.1 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@230 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@231 -- # TCP_INTERFACE_LIST=("${net_devs[@]}") 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@234 -- # (( 2 > 1 )) 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@236 -- # NVMF_TARGET_INTERFACE=cvl_0_0 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@237 -- # NVMF_INITIATOR_INTERFACE=cvl_0_1 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@240 -- # NVMF_SECOND_TARGET_IP= 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@242 -- # NVMF_TARGET_NAMESPACE=cvl_0_0_ns_spdk 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@243 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@244 -- # ip -4 addr flush cvl_0_0 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@245 -- # ip -4 addr flush cvl_0_1 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@248 -- # ip netns add cvl_0_0_ns_spdk 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@251 -- # ip link set cvl_0_0 netns cvl_0_0_ns_spdk 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@254 -- # ip addr add 10.0.0.1/24 dev cvl_0_1 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@255 -- # ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@258 -- # ip link set cvl_0_1 up 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@260 -- # ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@261 -- # ip netns exec cvl_0_0_ns_spdk ip 
link set lo up 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@264 -- # iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@267 -- # ping -c 1 10.0.0.2 00:31:31.066 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:31:31.066 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.506 ms 00:31:31.066 00:31:31.066 --- 10.0.0.2 ping statistics --- 00:31:31.066 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:31:31.066 rtt min/avg/max/mdev = 0.506/0.506/0.506/0.000 ms 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@268 -- # ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1 00:31:31.066 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:31:31.066 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.297 ms 00:31:31.066 00:31:31.066 --- 10.0.0.1 ping statistics --- 00:31:31.066 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:31:31.066 rtt min/avg/max/mdev = 0.297/0.297/0.297/0.000 ms 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@270 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@422 -- # return 0 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:31:31.066 17:43:40 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:31:31.066 17:43:41 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:31:31.066 17:43:41 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:31:31.066 17:43:41 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:31:31.067 17:43:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:31.067 17:43:41 chaining -- nvmf/common.sh@481 -- # nvmfpid=2978986 00:31:31.067 17:43:41 chaining -- nvmf/common.sh@482 -- # waitforlisten 2978986 00:31:31.067 17:43:41 chaining -- nvmf/common.sh@480 -- # ip netns exec cvl_0_0_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:31:31.067 17:43:41 chaining -- common/autotest_common.sh@829 -- # '[' -z 2978986 ']' 00:31:31.067 17:43:41 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:31.067 17:43:41 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:31.067 17:43:41 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:31.067 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:31.067 17:43:41 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:31.067 17:43:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:31.067 [2024-07-15 17:43:41.075400] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:31:31.067 [2024-07-15 17:43:41.075459] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:31:31.067 [2024-07-15 17:43:41.163104] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:31.067 [2024-07-15 17:43:41.261673] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:31:31.067 [2024-07-15 17:43:41.261741] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:31:31.067 [2024-07-15 17:43:41.261750] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:31:31.067 [2024-07-15 17:43:41.261758] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:31:31.067 [2024-07-15 17:43:41.261765] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:31:31.067 [2024-07-15 17:43:41.261792] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:31.067 17:43:41 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:31.067 17:43:41 chaining -- common/autotest_common.sh@862 -- # return 0 00:31:31.067 17:43:41 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:31:31.067 17:43:41 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:31:31.067 17:43:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:31.067 17:43:41 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:31:31.067 17:43:41 chaining -- bdev/chaining.sh@69 -- # mktemp 00:31:31.067 17:43:41 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.2Ja4iKOcI5 00:31:31.067 17:43:41 chaining -- bdev/chaining.sh@69 -- # mktemp 00:31:31.067 17:43:41 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.6nK8oNZB6a 00:31:31.067 17:43:41 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:31:31.067 17:43:41 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:31:31.067 17:43:41 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:31.067 17:43:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:31.067 malloc0 00:31:31.067 true 00:31:31.067 true 00:31:31.067 [2024-07-15 17:43:42.034210] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:31:31.067 crypto0 00:31:31.067 [2024-07-15 17:43:42.042236] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:31:31.067 crypto1 00:31:31.067 [2024-07-15 17:43:42.050372] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:31:31.067 [2024-07-15 17:43:42.066615] tcp.c: 981:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:31:31.067 17:43:42 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@85 -- # update_stats 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:31.067 17:43:42 
chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:31.067 17:43:42 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:31.067 17:43:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:31.067 17:43:42 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:31.067 17:43:42 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:31.067 17:43:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:31.067 17:43:42 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:31.067 17:43:42 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:31.067 17:43:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:31.067 17:43:42 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:31.067 17:43:42 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:31.067 17:43:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:31:31.067 17:43:42 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@88 -- # dd if=/dev/urandom of=/tmp/tmp.2Ja4iKOcI5 bs=1K 
count=64 00:31:31.067 64+0 records in 00:31:31.067 64+0 records out 00:31:31.067 65536 bytes (66 kB, 64 KiB) copied, 0.00101296 s, 64.7 MB/s 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.2Ja4iKOcI5 --ob Nvme0n1 --bs 65536 --count 1 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@25 -- # local config 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:31:31.067 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@31 -- # config='{ 00:31:31.067 "subsystems": [ 00:31:31.067 { 00:31:31.067 "subsystem": "bdev", 00:31:31.067 "config": [ 00:31:31.067 { 00:31:31.067 "method": "bdev_nvme_attach_controller", 00:31:31.067 "params": { 00:31:31.067 "trtype": "tcp", 00:31:31.067 "adrfam": "IPv4", 00:31:31.067 "name": "Nvme0", 00:31:31.067 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:31.067 "traddr": "10.0.0.2", 00:31:31.067 "trsvcid": "4420" 00:31:31.067 } 00:31:31.067 }, 00:31:31.067 { 00:31:31.067 "method": "bdev_set_options", 00:31:31.067 "params": { 00:31:31.067 "bdev_auto_examine": false 00:31:31.067 } 00:31:31.067 } 00:31:31.067 ] 00:31:31.067 } 00:31:31.067 ] 00:31:31.067 }' 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.2Ja4iKOcI5 --ob Nvme0n1 --bs 65536 --count 1 00:31:31.067 17:43:42 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:31:31.067 "subsystems": [ 00:31:31.067 { 00:31:31.067 "subsystem": "bdev", 00:31:31.067 "config": [ 00:31:31.067 { 00:31:31.067 "method": "bdev_nvme_attach_controller", 00:31:31.067 "params": { 00:31:31.067 "trtype": "tcp", 00:31:31.067 "adrfam": "IPv4", 00:31:31.067 "name": "Nvme0", 00:31:31.067 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:31.067 "traddr": "10.0.0.2", 00:31:31.067 "trsvcid": "4420" 00:31:31.067 } 00:31:31.067 }, 00:31:31.067 { 00:31:31.067 "method": "bdev_set_options", 00:31:31.067 "params": { 00:31:31.067 "bdev_auto_examine": false 00:31:31.067 } 00:31:31.067 } 00:31:31.067 ] 00:31:31.067 } 00:31:31.067 ] 00:31:31.067 }' 00:31:31.328 [2024-07-15 17:43:42.387209] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
00:31:31.328 [2024-07-15 17:43:42.387280] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2979170 ] 00:31:31.328 [2024-07-15 17:43:42.479546] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:31.328 [2024-07-15 17:43:42.574164] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:31.850  Copying: 64/64 [kB] (average 10 MBps) 00:31:31.850 00:31:31.850 17:43:42 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:31:31.850 17:43:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:31.850 17:43:42 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:31.850 17:43:42 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:31.850 17:43:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:31.850 17:43:42 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:31.850 17:43:42 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:31:31.850 17:43:42 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:31.850 17:43:42 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:31.850 17:43:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:31.850 17:43:42 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:31.850 17:43:42 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:31:31.850 17:43:42 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:31:31.850 17:43:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:31.850 17:43:42 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:31.850 17:43:42 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:31.850 17:43:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:31.850 17:43:42 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:31.850 17:43:42 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:31.850 17:43:42 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:31.850 17:43:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:31.850 17:43:42 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:31.850 17:43:42 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:31.850 17:43:43 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:31.851 17:43:43 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:31.851 17:43:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:31.851 17:43:43 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 
00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:31.851 17:43:43 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:31.851 17:43:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:31.851 17:43:43 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@96 -- # update_stats 00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:31:31.851 17:43:43 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:31.851 17:43:43 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:31.851 17:43:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:31.851 17:43:43 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:32.110 17:43:43 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:31:32.110 17:43:43 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:31:32.110 17:43:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:32.110 17:43:43 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:32.110 17:43:43 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:32.110 17:43:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:32.110 17:43:43 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:32.110 17:43:43 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:32.110 17:43:43 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:32.110 17:43:43 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:32.110 17:43:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:32.110 17:43:43 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:32.110 17:43:43 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:31:32.110 17:43:43 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:31:32.110 17:43:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:32.110 17:43:43 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:32.110 17:43:43 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:32.110 17:43:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:32.110 17:43:43 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:32.110 17:43:43 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 
00:31:32.110 17:43:43 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:32.110 17:43:43 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:32.110 17:43:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:32.110 17:43:43 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:32.110 17:43:43 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:31:32.110 17:43:43 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:31:32.110 17:43:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:32.110 17:43:43 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:32.110 17:43:43 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:31:32.110 17:43:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:32.110 17:43:43 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:31:32.111 17:43:43 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:32.111 17:43:43 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:32.111 17:43:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:32.111 17:43:43 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:31:32.111 17:43:43 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:32.111 17:43:43 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:31:32.111 17:43:43 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.6nK8oNZB6a --ib Nvme0n1 --bs 65536 --count 1 00:31:32.111 17:43:43 chaining -- bdev/chaining.sh@25 -- # local config 00:31:32.111 17:43:43 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:31:32.111 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:31:32.111 17:43:43 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:31:32.111 17:43:43 chaining -- bdev/chaining.sh@31 -- # config='{ 00:31:32.111 "subsystems": [ 00:31:32.111 { 00:31:32.111 "subsystem": "bdev", 00:31:32.111 "config": [ 00:31:32.111 { 00:31:32.111 "method": "bdev_nvme_attach_controller", 00:31:32.111 "params": { 00:31:32.111 "trtype": "tcp", 00:31:32.111 "adrfam": "IPv4", 00:31:32.111 "name": "Nvme0", 00:31:32.111 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:32.111 "traddr": "10.0.0.2", 00:31:32.111 "trsvcid": "4420" 00:31:32.111 } 00:31:32.111 }, 00:31:32.111 { 00:31:32.111 "method": "bdev_set_options", 00:31:32.111 "params": { 00:31:32.111 "bdev_auto_examine": false 00:31:32.111 } 00:31:32.111 } 00:31:32.111 ] 00:31:32.111 } 00:31:32.111 ] 00:31:32.111 }' 00:31:32.111 17:43:43 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.6nK8oNZB6a --ib Nvme0n1 --bs 65536 --count 1 00:31:32.111 17:43:43 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:31:32.111 "subsystems": [ 00:31:32.111 { 00:31:32.111 "subsystem": "bdev", 00:31:32.111 "config": [ 00:31:32.111 { 00:31:32.111 "method": "bdev_nvme_attach_controller", 00:31:32.111 "params": { 00:31:32.111 "trtype": "tcp", 00:31:32.111 "adrfam": "IPv4", 00:31:32.111 "name": "Nvme0", 00:31:32.111 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:32.111 "traddr": "10.0.0.2", 00:31:32.111 "trsvcid": "4420" 00:31:32.111 } 00:31:32.111 }, 00:31:32.111 { 00:31:32.111 "method": "bdev_set_options", 00:31:32.111 "params": { 
00:31:32.111 "bdev_auto_examine": false 00:31:32.111 } 00:31:32.111 } 00:31:32.111 ] 00:31:32.111 } 00:31:32.111 ] 00:31:32.111 }' 00:31:32.111 [2024-07-15 17:43:43.370343] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:31:32.111 [2024-07-15 17:43:43.370408] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2979381 ] 00:31:32.370 [2024-07-15 17:43:43.463676] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:32.370 [2024-07-15 17:43:43.556786] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:32.889  Copying: 64/64 [kB] (average 15 MBps) 00:31:32.889 00:31:32.889 17:43:43 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:31:32.889 17:43:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:32.889 17:43:43 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:32.889 17:43:43 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:32.889 17:43:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:32.889 17:43:43 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:32.889 17:43:43 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:31:32.889 17:43:43 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:32.889 17:43:43 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:32.889 17:43:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:32.889 17:43:43 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:32.889 17:43:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:32.889 17:43:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:32.889 17:43:44 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:32.889 17:43:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 
00:31:32.889 17:43:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:32.889 17:43:44 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:31:32.889 17:43:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:32.889 17:43:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:32.889 17:43:44 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.2Ja4iKOcI5 /tmp/tmp.6nK8oNZB6a 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:31:32.889 17:43:44 chaining -- bdev/chaining.sh@25 -- # local config 00:31:33.150 17:43:44 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:31:33.150 17:43:44 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:31:33.150 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:31:33.150 17:43:44 chaining -- bdev/chaining.sh@31 -- # config='{ 00:31:33.150 "subsystems": [ 00:31:33.150 { 00:31:33.150 "subsystem": "bdev", 00:31:33.150 "config": [ 00:31:33.150 { 00:31:33.150 "method": "bdev_nvme_attach_controller", 00:31:33.150 "params": { 00:31:33.150 "trtype": "tcp", 00:31:33.150 "adrfam": "IPv4", 00:31:33.150 "name": "Nvme0", 00:31:33.150 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:33.150 "traddr": "10.0.0.2", 00:31:33.150 "trsvcid": "4420" 00:31:33.150 } 00:31:33.150 }, 00:31:33.150 { 00:31:33.150 "method": "bdev_set_options", 00:31:33.150 "params": { 00:31:33.150 "bdev_auto_examine": false 00:31:33.150 } 00:31:33.150 } 00:31:33.150 ] 00:31:33.150 } 00:31:33.150 ] 00:31:33.150 }' 00:31:33.150 17:43:44 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:31:33.150 17:43:44 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:31:33.150 "subsystems": [ 00:31:33.150 { 00:31:33.150 "subsystem": "bdev", 00:31:33.150 "config": [ 00:31:33.150 { 00:31:33.150 "method": "bdev_nvme_attach_controller", 00:31:33.150 "params": { 00:31:33.150 "trtype": "tcp", 00:31:33.150 "adrfam": "IPv4", 00:31:33.150 "name": "Nvme0", 00:31:33.150 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:33.150 "traddr": "10.0.0.2", 00:31:33.150 "trsvcid": "4420" 00:31:33.150 } 00:31:33.150 }, 00:31:33.150 { 00:31:33.150 "method": "bdev_set_options", 00:31:33.150 "params": { 00:31:33.150 "bdev_auto_examine": false 00:31:33.150 } 00:31:33.150 } 00:31:33.150 ] 00:31:33.150 
} 00:31:33.150 ] 00:31:33.150 }' 00:31:33.150 [2024-07-15 17:43:44.292730] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:31:33.150 [2024-07-15 17:43:44.292800] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2979528 ] 00:31:33.150 [2024-07-15 17:43:44.386043] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:33.410 [2024-07-15 17:43:44.479588] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:33.930  Copying: 64/64 [kB] (average 62 MBps) 00:31:33.930 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@106 -- # update_stats 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:33.930 17:43:45 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:33.930 17:43:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:33.930 17:43:45 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:33.930 17:43:45 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:33.930 17:43:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:33.930 17:43:45 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:33.930 17:43:45 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:33.930 17:43:45 chaining -- 
common/autotest_common.sh@10 -- # set +x 00:31:33.930 17:43:45 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:31:33.930 17:43:45 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:33.931 17:43:45 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:33.931 17:43:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:33.931 17:43:45 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:33.931 17:43:45 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:31:33.931 17:43:45 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.2Ja4iKOcI5 --ob Nvme0n1 --bs 4096 --count 16 00:31:33.931 17:43:45 chaining -- bdev/chaining.sh@25 -- # local config 00:31:33.931 17:43:45 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:31:33.931 17:43:45 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:31:33.931 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:31:34.191 17:43:45 chaining -- bdev/chaining.sh@31 -- # config='{ 00:31:34.191 "subsystems": [ 00:31:34.191 { 00:31:34.191 "subsystem": "bdev", 00:31:34.191 "config": [ 00:31:34.191 { 00:31:34.191 "method": "bdev_nvme_attach_controller", 00:31:34.191 "params": { 00:31:34.191 "trtype": "tcp", 00:31:34.191 "adrfam": "IPv4", 00:31:34.191 "name": "Nvme0", 00:31:34.191 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:34.191 "traddr": "10.0.0.2", 00:31:34.191 "trsvcid": "4420" 00:31:34.191 } 00:31:34.191 }, 00:31:34.191 { 00:31:34.191 "method": "bdev_set_options", 00:31:34.191 "params": { 00:31:34.191 "bdev_auto_examine": false 00:31:34.191 } 00:31:34.191 } 00:31:34.191 ] 00:31:34.191 } 00:31:34.191 ] 00:31:34.191 }' 00:31:34.191 17:43:45 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.2Ja4iKOcI5 --ob Nvme0n1 --bs 4096 --count 16 00:31:34.191 17:43:45 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:31:34.191 "subsystems": [ 00:31:34.191 { 00:31:34.192 "subsystem": "bdev", 00:31:34.192 "config": [ 00:31:34.192 { 00:31:34.192 "method": "bdev_nvme_attach_controller", 00:31:34.192 "params": { 00:31:34.192 "trtype": "tcp", 00:31:34.192 "adrfam": "IPv4", 00:31:34.192 "name": "Nvme0", 00:31:34.192 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:34.192 "traddr": "10.0.0.2", 00:31:34.192 "trsvcid": "4420" 00:31:34.192 } 00:31:34.192 }, 00:31:34.192 { 00:31:34.192 "method": "bdev_set_options", 00:31:34.192 "params": { 00:31:34.192 "bdev_auto_examine": false 00:31:34.192 } 00:31:34.192 } 00:31:34.192 ] 00:31:34.192 } 00:31:34.192 ] 00:31:34.192 }' 00:31:34.192 [2024-07-15 17:43:45.306673] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 
initialization... 00:31:34.192 [2024-07-15 17:43:45.306768] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2979739 ] 00:31:34.192 [2024-07-15 17:43:45.402660] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:34.452 [2024-07-15 17:43:45.494412] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:34.712  Copying: 64/64 [kB] (average 12 MBps) 00:31:34.712 00:31:34.712 17:43:45 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:31:34.712 17:43:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:34.712 17:43:45 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:34.712 17:43:45 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:34.712 17:43:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:34.712 17:43:45 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:34.712 17:43:45 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:31:34.712 17:43:45 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:34.712 17:43:45 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:34.712 17:43:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:34.712 17:43:45 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:34.973 17:43:46 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:31:34.973 17:43:46 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:31:34.973 17:43:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:34.973 17:43:46 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:34.973 17:43:46 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:34.973 17:43:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:34.973 17:43:46 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:34.973 17:43:46 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:34.973 17:43:46 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:34.973 17:43:46 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:34.973 17:43:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:34.973 17:43:46 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:34.974 17:43:46 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:34.974 17:43:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:34.974 17:43:46 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@112 -- # (( 14 == 
stats[decrypt_executed] )) 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:31:34.974 17:43:46 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:34.974 17:43:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:34.974 17:43:46 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@114 -- # update_stats 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:34.974 17:43:46 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:34.974 17:43:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:34.974 17:43:46 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:34.974 17:43:46 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:34.974 17:43:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:34.974 17:43:46 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:34.974 17:43:46 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:35.234 17:43:46 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:31:35.234 17:43:46 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:31:35.234 17:43:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:35.234 17:43:46 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:35.234 17:43:46 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:35.234 17:43:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:35.234 17:43:46 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:35.234 17:43:46 chaining -- 
bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:35.234 17:43:46 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:35.234 17:43:46 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:35.234 17:43:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:35.234 17:43:46 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:35.234 17:43:46 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:31:35.234 17:43:46 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:31:35.234 17:43:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:35.234 17:43:46 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:35.234 17:43:46 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:31:35.234 17:43:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:35.234 17:43:46 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:31:35.234 17:43:46 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:35.234 17:43:46 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:35.234 17:43:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:35.235 17:43:46 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:31:35.235 17:43:46 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:35.235 17:43:46 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:31:35.235 17:43:46 chaining -- bdev/chaining.sh@117 -- # : 00:31:35.235 17:43:46 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.6nK8oNZB6a --ib Nvme0n1 --bs 4096 --count 16 00:31:35.235 17:43:46 chaining -- bdev/chaining.sh@25 -- # local config 00:31:35.235 17:43:46 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:31:35.235 17:43:46 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:31:35.235 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:31:35.235 17:43:46 chaining -- bdev/chaining.sh@31 -- # config='{ 00:31:35.235 "subsystems": [ 00:31:35.235 { 00:31:35.235 "subsystem": "bdev", 00:31:35.235 "config": [ 00:31:35.235 { 00:31:35.235 "method": "bdev_nvme_attach_controller", 00:31:35.235 "params": { 00:31:35.235 "trtype": "tcp", 00:31:35.235 "adrfam": "IPv4", 00:31:35.235 "name": "Nvme0", 00:31:35.235 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:35.235 "traddr": "10.0.0.2", 00:31:35.235 "trsvcid": "4420" 00:31:35.235 } 00:31:35.235 }, 00:31:35.235 { 00:31:35.235 "method": "bdev_set_options", 00:31:35.235 "params": { 00:31:35.235 "bdev_auto_examine": false 00:31:35.235 } 00:31:35.235 } 00:31:35.235 ] 00:31:35.235 } 00:31:35.235 ] 00:31:35.235 }' 00:31:35.235 17:43:46 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.6nK8oNZB6a --ib Nvme0n1 --bs 4096 --count 16 00:31:35.235 17:43:46 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:31:35.235 "subsystems": [ 00:31:35.235 { 00:31:35.235 "subsystem": "bdev", 00:31:35.235 "config": [ 00:31:35.235 { 00:31:35.235 "method": "bdev_nvme_attach_controller", 00:31:35.235 "params": { 00:31:35.235 "trtype": "tcp", 00:31:35.235 "adrfam": "IPv4", 00:31:35.235 "name": "Nvme0", 00:31:35.235 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:35.235 "traddr": "10.0.0.2", 00:31:35.235 "trsvcid": "4420" 
00:31:35.235 } 00:31:35.235 }, 00:31:35.235 { 00:31:35.235 "method": "bdev_set_options", 00:31:35.235 "params": { 00:31:35.235 "bdev_auto_examine": false 00:31:35.235 } 00:31:35.235 } 00:31:35.235 ] 00:31:35.235 } 00:31:35.235 ] 00:31:35.235 }' 00:31:35.235 [2024-07-15 17:43:46.443197] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:31:35.235 [2024-07-15 17:43:46.443260] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2979993 ] 00:31:35.495 [2024-07-15 17:43:46.534298] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:35.495 [2024-07-15 17:43:46.628026] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:36.016  Copying: 64/64 [kB] (average 484 kBps) 00:31:36.016 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:36.016 17:43:47 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:36.016 17:43:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:36.016 17:43:47 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:36.016 17:43:47 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:36.016 17:43:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:36.016 17:43:47 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@43 -- 
# rpc_cmd accel_get_stats 00:31:36.016 17:43:47 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:36.016 17:43:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:36.016 17:43:47 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:36.016 17:43:47 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:36.016 17:43:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:36.016 17:43:47 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:31:36.277 17:43:47 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:36.277 17:43:47 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:31:36.277 17:43:47 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.2Ja4iKOcI5 /tmp/tmp.6nK8oNZB6a 00:31:36.277 17:43:47 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:31:36.277 17:43:47 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:31:36.277 17:43:47 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.2Ja4iKOcI5 /tmp/tmp.6nK8oNZB6a 00:31:36.277 17:43:47 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:31:36.277 17:43:47 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:31:36.277 17:43:47 chaining -- nvmf/common.sh@117 -- # sync 00:31:36.277 17:43:47 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:31:36.277 17:43:47 chaining -- nvmf/common.sh@120 -- # set +e 00:31:36.277 17:43:47 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:31:36.277 17:43:47 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:31:36.277 rmmod nvme_tcp 00:31:36.277 rmmod nvme_fabrics 00:31:36.277 rmmod nvme_keyring 00:31:36.277 17:43:47 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:31:36.277 17:43:47 chaining -- nvmf/common.sh@124 -- # set -e 00:31:36.277 17:43:47 chaining -- nvmf/common.sh@125 -- # return 0 00:31:36.277 17:43:47 chaining -- nvmf/common.sh@489 -- # '[' -n 2978986 ']' 00:31:36.277 17:43:47 chaining -- nvmf/common.sh@490 -- # killprocess 2978986 00:31:36.277 17:43:47 chaining -- common/autotest_common.sh@948 -- # '[' -z 2978986 ']' 00:31:36.277 17:43:47 chaining -- common/autotest_common.sh@952 -- # kill -0 2978986 00:31:36.277 17:43:47 chaining -- common/autotest_common.sh@953 -- # uname 00:31:36.277 17:43:47 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:36.277 17:43:47 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2978986 00:31:36.277 17:43:47 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:31:36.277 17:43:47 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:31:36.277 17:43:47 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2978986' 00:31:36.277 killing process with pid 2978986 00:31:36.277 17:43:47 chaining -- common/autotest_common.sh@967 -- # kill 
2978986 00:31:36.277 17:43:47 chaining -- common/autotest_common.sh@972 -- # wait 2978986 00:31:36.538 17:43:47 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:31:36.538 17:43:47 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:31:36.538 17:43:47 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:31:36.538 17:43:47 chaining -- nvmf/common.sh@274 -- # [[ cvl_0_0_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:31:36.538 17:43:47 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:31:36.538 17:43:47 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:31:36.538 17:43:47 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:31:36.538 17:43:47 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:31:38.453 17:43:49 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush cvl_0_1 00:31:38.454 17:43:49 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:31:38.454 17:43:49 chaining -- bdev/chaining.sh@132 -- # bperfpid=2980459 00:31:38.454 17:43:49 chaining -- bdev/chaining.sh@134 -- # waitforlisten 2980459 00:31:38.454 17:43:49 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:31:38.454 17:43:49 chaining -- common/autotest_common.sh@829 -- # '[' -z 2980459 ']' 00:31:38.454 17:43:49 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:38.454 17:43:49 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:38.454 17:43:49 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:38.454 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:38.454 17:43:49 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:38.715 17:43:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:38.715 [2024-07-15 17:43:49.801914] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 
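[editor's note] The bdevperf runs in this part of the test all follow the same lifecycle: launch bdevperf with --wait-for-rpc -z so it idles until configured, wait for its RPC socket, create the bdev stack over RPC, drive I/O with bdevperf.py perform_tests, then kill the process. A hedged sketch of that flow; waitforlisten and killprocess are the autotest_common.sh helpers seen in the trace, while the trap body and relative paths are assumptions:

    # Sketch of the bperf lifecycle the trace follows; not the literal chaining.sh code.
    build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z &
    bperfpid=$!
    trap 'killprocess $bperfpid; exit 1' SIGINT SIGTERM EXIT
    waitforlisten "$bperfpid"                      # blocks until /var/tmp/spdk.sock answers RPCs
    # ... rpc_cmd calls create malloc0, the crypto keys and crypto0/crypto1 here ...
    examples/bdev/bdevperf/bdevperf.py perform_tests   # drives the 5 s verify workload
    killprocess "$bperfpid"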
00:31:38.715 [2024-07-15 17:43:49.801980] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2980459 ] 00:31:38.715 [2024-07-15 17:43:49.879494] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:38.715 [2024-07-15 17:43:49.976155] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:39.657 17:43:50 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:39.657 17:43:50 chaining -- common/autotest_common.sh@862 -- # return 0 00:31:39.657 17:43:50 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:31:39.657 17:43:50 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:39.657 17:43:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:39.657 malloc0 00:31:39.657 true 00:31:39.657 true 00:31:39.657 [2024-07-15 17:43:50.808700] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:31:39.657 crypto0 00:31:39.657 [2024-07-15 17:43:50.816743] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:31:39.657 crypto1 00:31:39.657 17:43:50 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:39.657 17:43:50 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:39.657 Running I/O for 5 seconds... 00:31:44.938 00:31:44.938 Latency(us) 00:31:44.938 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:44.938 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:31:44.938 Verification LBA range: start 0x0 length 0x2000 00:31:44.938 crypto1 : 5.01 14275.63 55.76 0.00 0.00 17883.72 1676.21 11494.01 00:31:44.938 =================================================================================================================== 00:31:44.938 Total : 14275.63 55.76 0.00 0.00 17883.72 1676.21 11494.01 00:31:44.938 0 00:31:44.938 17:43:55 chaining -- bdev/chaining.sh@146 -- # killprocess 2980459 00:31:44.938 17:43:55 chaining -- common/autotest_common.sh@948 -- # '[' -z 2980459 ']' 00:31:44.938 17:43:55 chaining -- common/autotest_common.sh@952 -- # kill -0 2980459 00:31:44.938 17:43:55 chaining -- common/autotest_common.sh@953 -- # uname 00:31:44.938 17:43:55 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:44.938 17:43:55 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2980459 00:31:44.938 17:43:56 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:44.938 17:43:56 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:44.938 17:43:56 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2980459' 00:31:44.938 killing process with pid 2980459 00:31:44.938 17:43:56 chaining -- common/autotest_common.sh@967 -- # kill 2980459 00:31:44.938 Received shutdown signal, test time was about 5.000000 seconds 00:31:44.938 00:31:44.938 Latency(us) 00:31:44.938 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:44.938 =================================================================================================================== 00:31:44.938 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:44.938 17:43:56 chaining -- common/autotest_common.sh@972 -- # wait 2980459 00:31:44.938 17:43:56 chaining -- bdev/chaining.sh@152 -- # 
bperfpid=2981655 00:31:44.938 17:43:56 chaining -- bdev/chaining.sh@154 -- # waitforlisten 2981655 00:31:44.938 17:43:56 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:31:44.938 17:43:56 chaining -- common/autotest_common.sh@829 -- # '[' -z 2981655 ']' 00:31:44.938 17:43:56 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:44.938 17:43:56 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:44.938 17:43:56 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:44.938 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:44.938 17:43:56 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:44.938 17:43:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:44.938 [2024-07-15 17:43:56.195057] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:31:44.938 [2024-07-15 17:43:56.195109] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2981655 ] 00:31:45.197 [2024-07-15 17:43:56.282007] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:45.197 [2024-07-15 17:43:56.353717] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:45.766 17:43:57 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:45.766 17:43:57 chaining -- common/autotest_common.sh@862 -- # return 0 00:31:45.766 17:43:57 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:31:45.766 17:43:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:45.766 17:43:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:46.026 malloc0 00:31:46.026 true 00:31:46.026 true 00:31:46.026 [2024-07-15 17:43:57.163274] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:31:46.026 [2024-07-15 17:43:57.163312] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:46.026 [2024-07-15 17:43:57.163324] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15ceae0 00:31:46.026 [2024-07-15 17:43:57.163330] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:46.026 [2024-07-15 17:43:57.164199] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:46.026 [2024-07-15 17:43:57.164216] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:31:46.026 pt0 00:31:46.026 [2024-07-15 17:43:57.171302] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:31:46.026 crypto0 00:31:46.026 [2024-07-15 17:43:57.179320] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:31:46.026 crypto1 00:31:46.026 17:43:57 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:46.026 17:43:57 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:46.026 Running I/O for 5 seconds... 
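[editor's note] The NOTICE lines above imply the bdev stack for this run is malloc0 -> pt0 (passthru) -> crypto0 (key0) -> crypto1 (key1). The exact rpc_cmd arguments are not shown in the trace, so the following is only a plausible reconstruction using standard SPDK RPC names; sizes, ciphers and key material are placeholders:

    # Plausible reconstruction, not the literal chaining.sh RPC sequence.
    rpc_cmd bdev_malloc_create -b malloc0 32 512                   # size/block size are assumptions
    rpc_cmd bdev_passthru_create -b malloc0 -p pt0                 # "created pt_bdev for: pt0"
    rpc_cmd accel_crypto_key_create -c AES_XTS -k "$KEY0" -e "$KEY0_2" -n key0   # placeholder keys
    rpc_cmd accel_crypto_key_create -c AES_XTS -k "$KEY1" -e "$KEY1_2" -n key1
    rpc_cmd bdev_crypto_create -n key0 pt0 crypto0                 # "Found key key0"
    rpc_cmd bdev_crypto_create -n key1 crypto0 crypto1             # assumed to chain on top of crypto0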
00:31:51.337 00:31:51.337 Latency(us) 00:31:51.337 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:51.337 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:31:51.337 Verification LBA range: start 0x0 length 0x2000 00:31:51.337 crypto1 : 5.02 11072.53 43.25 0.00 0.00 23065.27 5217.67 14014.62 00:31:51.338 =================================================================================================================== 00:31:51.338 Total : 11072.53 43.25 0.00 0.00 23065.27 5217.67 14014.62 00:31:51.338 0 00:31:51.338 17:44:02 chaining -- bdev/chaining.sh@167 -- # killprocess 2981655 00:31:51.338 17:44:02 chaining -- common/autotest_common.sh@948 -- # '[' -z 2981655 ']' 00:31:51.338 17:44:02 chaining -- common/autotest_common.sh@952 -- # kill -0 2981655 00:31:51.338 17:44:02 chaining -- common/autotest_common.sh@953 -- # uname 00:31:51.338 17:44:02 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:51.338 17:44:02 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2981655 00:31:51.338 17:44:02 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:51.338 17:44:02 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:51.338 17:44:02 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2981655' 00:31:51.338 killing process with pid 2981655 00:31:51.338 17:44:02 chaining -- common/autotest_common.sh@967 -- # kill 2981655 00:31:51.338 Received shutdown signal, test time was about 5.000000 seconds 00:31:51.338 00:31:51.338 Latency(us) 00:31:51.338 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:51.338 =================================================================================================================== 00:31:51.338 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:51.338 17:44:02 chaining -- common/autotest_common.sh@972 -- # wait 2981655 00:31:51.338 17:44:02 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:31:51.338 17:44:02 chaining -- bdev/chaining.sh@170 -- # killprocess 2981655 00:31:51.338 17:44:02 chaining -- common/autotest_common.sh@948 -- # '[' -z 2981655 ']' 00:31:51.338 17:44:02 chaining -- common/autotest_common.sh@952 -- # kill -0 2981655 00:31:51.338 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 952: kill: (2981655) - No such process 00:31:51.338 17:44:02 chaining -- common/autotest_common.sh@975 -- # echo 'Process with pid 2981655 is not found' 00:31:51.338 Process with pid 2981655 is not found 00:31:51.338 17:44:02 chaining -- bdev/chaining.sh@171 -- # wait 2981655 00:31:51.338 17:44:02 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:31:51.338 17:44:02 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:31:51.338 17:44:02 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:31:51.338 17:44:02 
chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:31:51.338 17:44:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@296 -- # e810=() 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@297 -- # x722=() 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@298 -- # mlx=() 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@335 -- # (( 2 == 0 )) 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:4b:00.0 (0x8086 - 0x159b)' 00:31:51.338 Found 0000:4b:00.0 (0x8086 - 0x159b) 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 
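[editor's note] nvmf_tcp_init, traced over the next several entries, turns the two ice ports found here into a self-contained TCP test path: the first port (cvl_0_0) is moved into a private network namespace and becomes the target side at 10.0.0.2, while the second (cvl_0_1) stays in the root namespace as the initiator at 10.0.0.1. Reduced to its essentials, with command arguments taken from the trace and error handling omitted:

    # Essentials of the nvmf_tcp_init plumbing shown in the trace.
    ip netns add cvl_0_0_ns_spdk
    ip link set cvl_0_0 netns cvl_0_0_ns_spdk                          # target NIC moves into the namespace
    ip addr add 10.0.0.1/24 dev cvl_0_1                                # initiator side, root namespace
    ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0  # target side
    ip link set cvl_0_1 up
    ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up
    ip netns exec cvl_0_0_ns_spdk ip link set lo up
    iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT       # admit NVMe/TCP traffic
    ping -c 1 10.0.0.2                                                 # sanity checks in both directions
    ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1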
00:31:51.338 17:44:02 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:4b:00.1 (0x8086 - 0x159b)' 00:31:51.338 Found 0000:4b:00.1 (0x8086 - 0x159b) 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@366 -- # (( 0 > 0 )) 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@372 -- # [[ '' == e810 ]] 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:4b:00.0: cvl_0_0' 00:31:51.338 Found net devices under 0000:4b:00.0: cvl_0_0 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:4b:00.1: cvl_0_1' 00:31:51.338 Found net devices under 0000:4b:00.1: cvl_0_1 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@404 -- # (( 2 == 0 )) 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@414 -- # is_hw=yes 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@416 -- # [[ yes == yes ]] 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@417 -- # [[ tcp == tcp ]] 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@418 -- # nvmf_tcp_init 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@229 -- # NVMF_INITIATOR_IP=10.0.0.1 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@230 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@231 -- # TCP_INTERFACE_LIST=("${net_devs[@]}") 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@234 -- # (( 2 > 1 )) 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@236 -- # NVMF_TARGET_INTERFACE=cvl_0_0 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@237 -- # NVMF_INITIATOR_INTERFACE=cvl_0_1 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@240 -- # NVMF_SECOND_TARGET_IP= 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@242 -- # 
NVMF_TARGET_NAMESPACE=cvl_0_0_ns_spdk 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@243 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@244 -- # ip -4 addr flush cvl_0_0 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@245 -- # ip -4 addr flush cvl_0_1 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@248 -- # ip netns add cvl_0_0_ns_spdk 00:31:51.338 17:44:02 chaining -- nvmf/common.sh@251 -- # ip link set cvl_0_0 netns cvl_0_0_ns_spdk 00:31:51.639 17:44:02 chaining -- nvmf/common.sh@254 -- # ip addr add 10.0.0.1/24 dev cvl_0_1 00:31:51.639 17:44:02 chaining -- nvmf/common.sh@255 -- # ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0 00:31:51.639 17:44:02 chaining -- nvmf/common.sh@258 -- # ip link set cvl_0_1 up 00:31:51.639 17:44:02 chaining -- nvmf/common.sh@260 -- # ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up 00:31:51.639 17:44:02 chaining -- nvmf/common.sh@261 -- # ip netns exec cvl_0_0_ns_spdk ip link set lo up 00:31:51.639 17:44:02 chaining -- nvmf/common.sh@264 -- # iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT 00:31:51.639 17:44:02 chaining -- nvmf/common.sh@267 -- # ping -c 1 10.0.0.2 00:31:51.639 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:31:51.639 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.753 ms 00:31:51.639 00:31:51.639 --- 10.0.0.2 ping statistics --- 00:31:51.639 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:31:51.639 rtt min/avg/max/mdev = 0.753/0.753/0.753/0.000 ms 00:31:51.639 17:44:02 chaining -- nvmf/common.sh@268 -- # ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1 00:31:51.639 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:31:51.639 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.306 ms 00:31:51.639 00:31:51.639 --- 10.0.0.1 ping statistics --- 00:31:51.639 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:31:51.639 rtt min/avg/max/mdev = 0.306/0.306/0.306/0.000 ms 00:31:51.639 17:44:02 chaining -- nvmf/common.sh@270 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:31:51.639 17:44:02 chaining -- nvmf/common.sh@422 -- # return 0 00:31:51.639 17:44:02 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:31:51.639 17:44:02 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:31:51.639 17:44:02 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:31:51.639 17:44:02 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:31:51.639 17:44:02 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:31:51.639 17:44:02 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:31:51.639 17:44:02 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:31:51.639 17:44:02 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:31:51.639 17:44:02 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:31:51.639 17:44:02 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:31:51.639 17:44:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:51.639 17:44:02 chaining -- nvmf/common.sh@481 -- # nvmfpid=2982622 00:31:51.639 17:44:02 chaining -- nvmf/common.sh@482 -- # waitforlisten 2982622 00:31:51.639 17:44:02 chaining -- nvmf/common.sh@480 -- # ip netns exec cvl_0_0_ns_spdk ip netns exec cvl_0_0_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:31:51.639 17:44:02 chaining -- common/autotest_common.sh@829 -- # '[' -z 2982622 ']' 00:31:51.639 17:44:02 chaining -- 
common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:51.639 17:44:02 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:51.639 17:44:02 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:51.639 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:51.639 17:44:02 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:51.639 17:44:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:51.900 [2024-07-15 17:44:02.935796] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:31:51.900 [2024-07-15 17:44:02.935855] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:31:51.900 [2024-07-15 17:44:03.023922] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:51.900 [2024-07-15 17:44:03.121121] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:31:51.900 [2024-07-15 17:44:03.121181] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:31:51.900 [2024-07-15 17:44:03.121190] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:31:51.900 [2024-07-15 17:44:03.121199] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:31:51.900 [2024-07-15 17:44:03.121206] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:31:51.900 [2024-07-15 17:44:03.121233] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:52.840 17:44:03 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:52.840 17:44:03 chaining -- common/autotest_common.sh@862 -- # return 0 00:31:52.840 17:44:03 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:31:52.840 17:44:03 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:31:52.840 17:44:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:52.840 17:44:03 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:31:52.840 17:44:03 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:31:52.840 17:44:03 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:52.840 17:44:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:52.840 malloc0 00:31:52.840 [2024-07-15 17:44:03.836104] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:31:52.840 [2024-07-15 17:44:03.852312] tcp.c: 981:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:31:52.840 17:44:03 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:52.840 17:44:03 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:31:52.840 17:44:03 chaining -- bdev/chaining.sh@189 -- # bperfpid=2982926 00:31:52.840 17:44:03 chaining -- bdev/chaining.sh@191 -- # waitforlisten 2982926 /var/tmp/bperf.sock 00:31:52.840 17:44:03 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:31:52.840 17:44:03 chaining -- common/autotest_common.sh@829 
-- # '[' -z 2982926 ']' 00:31:52.840 17:44:03 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:31:52.840 17:44:03 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:52.840 17:44:03 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:31:52.840 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:31:52.840 17:44:03 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:52.840 17:44:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:52.840 [2024-07-15 17:44:03.921177] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:31:52.840 [2024-07-15 17:44:03.921234] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2982926 ] 00:31:52.840 [2024-07-15 17:44:04.012621] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:52.840 [2024-07-15 17:44:04.080873] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:53.779 17:44:04 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:53.779 17:44:04 chaining -- common/autotest_common.sh@862 -- # return 0 00:31:53.779 17:44:04 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:31:53.779 17:44:04 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:31:54.039 [2024-07-15 17:44:05.094300] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:31:54.039 nvme0n1 00:31:54.039 true 00:31:54.039 crypto0 00:31:54.039 17:44:05 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:31:54.039 Running I/O for 5 seconds... 
00:31:59.325 00:31:59.325 Latency(us) 00:31:59.325 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:59.325 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:31:59.325 Verification LBA range: start 0x0 length 0x2000 00:31:59.325 crypto0 : 5.02 8810.26 34.42 0.00 0.00 28971.12 4562.31 23592.96 00:31:59.325 =================================================================================================================== 00:31:59.325 Total : 8810.26 34.42 0.00 0.00 28971.12 4562.31 23592.96 00:31:59.325 0 00:31:59.325 17:44:10 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:31:59.325 17:44:10 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:31:59.325 17:44:10 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:59.325 17:44:10 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:59.325 17:44:10 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:59.325 17:44:10 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:31:59.325 17:44:10 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:59.325 17:44:10 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:31:59.325 17:44:10 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:59.325 17:44:10 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:31:59.325 17:44:10 chaining -- bdev/chaining.sh@205 -- # sequence=88508 00:31:59.325 17:44:10 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:31:59.325 17:44:10 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:31:59.325 17:44:10 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:59.325 17:44:10 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:59.325 17:44:10 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:59.325 17:44:10 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:31:59.325 17:44:10 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:59.325 17:44:10 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:31:59.325 17:44:10 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:59.325 17:44:10 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:31:59.585 17:44:10 chaining -- bdev/chaining.sh@206 -- # encrypt=44254 00:31:59.585 17:44:10 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:31:59.585 17:44:10 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:31:59.585 17:44:10 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:59.585 17:44:10 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:59.585 17:44:10 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:59.585 17:44:10 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:31:59.585 17:44:10 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:59.585 17:44:10 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:31:59.585 17:44:10 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:59.585 17:44:10 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:31:59.845 17:44:10 chaining -- bdev/chaining.sh@207 -- # decrypt=44254 
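
Note on the stat queries above and below: bdev/chaining.sh's get_stat helper drives them all through one pattern, calling accel_get_stats over the bdevperf RPC socket and picking a single counter out with jq. A minimal sketch of that pattern, assuming only the paths and jq filters that appear in this log; the shell variable names are illustrative:

  RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py   # the rpc_bperf wrapper seen above
  SOCK=/var/tmp/bperf.sock

  # one accel_get_stats call per counter, filtered with jq exactly as chaining.sh@41/@44 do
  sequence=$($RPC -s $SOCK accel_get_stats | jq -r .sequence_executed)
  encrypt=$($RPC -s $SOCK accel_get_stats | jq -r '.operations[] | select(.opcode == "encrypt").executed')
  decrypt=$($RPC -s $SOCK accel_get_stats | jq -r '.operations[] | select(.opcode == "decrypt").executed')

The crc32c counter is read the same way just below, after which the script asserts encrypt + decrypt == sequence_executed == crc32c, i.e. 44254 + 44254 = 88508 for this run.
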
00:31:59.845 17:44:10 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:31:59.845 17:44:10 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:31:59.845 17:44:10 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:59.845 17:44:10 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:59.845 17:44:10 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:31:59.845 17:44:10 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:31:59.845 17:44:10 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:31:59.845 17:44:10 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:31:59.845 17:44:10 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:31:59.845 17:44:10 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:31:59.845 17:44:11 chaining -- bdev/chaining.sh@208 -- # crc32c=88508 00:31:59.845 17:44:11 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:31:59.845 17:44:11 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:31:59.845 17:44:11 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:31:59.845 17:44:11 chaining -- bdev/chaining.sh@214 -- # killprocess 2982926 00:31:59.845 17:44:11 chaining -- common/autotest_common.sh@948 -- # '[' -z 2982926 ']' 00:31:59.845 17:44:11 chaining -- common/autotest_common.sh@952 -- # kill -0 2982926 00:31:59.845 17:44:11 chaining -- common/autotest_common.sh@953 -- # uname 00:31:59.845 17:44:11 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:00.105 17:44:11 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2982926 00:32:00.105 17:44:11 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:00.105 17:44:11 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:00.105 17:44:11 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2982926' 00:32:00.105 killing process with pid 2982926 00:32:00.105 17:44:11 chaining -- common/autotest_common.sh@967 -- # kill 2982926 00:32:00.105 Received shutdown signal, test time was about 5.000000 seconds 00:32:00.105 00:32:00.105 Latency(us) 00:32:00.105 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:00.105 =================================================================================================================== 00:32:00.105 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:00.105 17:44:11 chaining -- common/autotest_common.sh@972 -- # wait 2982926 00:32:00.105 17:44:11 chaining -- bdev/chaining.sh@219 -- # bperfpid=2984159 00:32:00.105 17:44:11 chaining -- bdev/chaining.sh@221 -- # waitforlisten 2984159 /var/tmp/bperf.sock 00:32:00.105 17:44:11 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:32:00.105 17:44:11 chaining -- common/autotest_common.sh@829 -- # '[' -z 2984159 ']' 00:32:00.105 17:44:11 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:32:00.105 17:44:11 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:00.105 17:44:11 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 
00:32:00.105 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:32:00.105 17:44:11 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:00.105 17:44:11 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:00.105 [2024-07-15 17:44:11.366741] Starting SPDK v24.09-pre git sha1 248c547d0 / DPDK 24.03.0 initialization... 00:32:00.105 [2024-07-15 17:44:11.366792] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2984159 ] 00:32:00.365 [2024-07-15 17:44:11.455610] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:00.365 [2024-07-15 17:44:11.526789] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:00.934 17:44:12 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:00.934 17:44:12 chaining -- common/autotest_common.sh@862 -- # return 0 00:32:00.934 17:44:12 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:32:00.934 17:44:12 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:32:01.504 [2024-07-15 17:44:12.563736] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:32:01.504 nvme0n1 00:32:01.504 true 00:32:01.504 crypto0 00:32:01.504 17:44:12 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:32:01.504 Running I/O for 5 seconds... 00:32:06.786 00:32:06.786 Latency(us) 00:32:06.786 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:06.786 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:32:06.786 Verification LBA range: start 0x0 length 0x200 00:32:06.786 crypto0 : 5.01 2230.16 139.39 0.00 0.00 14052.49 1235.10 17140.18 00:32:06.786 =================================================================================================================== 00:32:06.786 Total : 2230.16 139.39 0.00 0.00 14052.49 1235.10 17140.18 00:32:06.786 0 00:32:06.786 17:44:17 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:32:06.786 17:44:17 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:32:06.786 17:44:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:06.786 17:44:17 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:32:06.786 17:44:17 chaining -- bdev/chaining.sh@39 -- # opcode= 00:32:06.786 17:44:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:32:06.786 17:44:17 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:32:06.786 17:44:17 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:32:06.786 17:44:17 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:32:06.786 17:44:17 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:32:06.786 17:44:17 chaining -- bdev/chaining.sh@233 -- # sequence=22338 00:32:06.786 17:44:17 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:32:06.786 17:44:17 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:32:06.786 17:44:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:06.786 17:44:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:06.786 
17:44:17 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:32:06.786 17:44:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:32:06.786 17:44:17 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:32:06.786 17:44:17 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:32:06.786 17:44:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:32:06.786 17:44:17 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:32:07.046 17:44:18 chaining -- bdev/chaining.sh@234 -- # encrypt=11169 00:32:07.046 17:44:18 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:32:07.046 17:44:18 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:32:07.046 17:44:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:07.046 17:44:18 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:07.046 17:44:18 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:32:07.046 17:44:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:32:07.046 17:44:18 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:32:07.046 17:44:18 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:32:07.046 17:44:18 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:32:07.046 17:44:18 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:32:07.307 17:44:18 chaining -- bdev/chaining.sh@235 -- # decrypt=11169 00:32:07.307 17:44:18 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:32:07.307 17:44:18 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:32:07.307 17:44:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:07.307 17:44:18 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:07.307 17:44:18 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:32:07.307 17:44:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:32:07.307 17:44:18 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:32:07.307 17:44:18 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:32:07.307 17:44:18 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:32:07.307 17:44:18 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:32:07.307 17:44:18 chaining -- bdev/chaining.sh@236 -- # crc32c=22338 00:32:07.307 17:44:18 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:32:07.307 17:44:18 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:32:07.307 17:44:18 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:32:07.307 17:44:18 chaining -- bdev/chaining.sh@242 -- # killprocess 2984159 00:32:07.307 17:44:18 chaining -- common/autotest_common.sh@948 -- # '[' -z 2984159 ']' 00:32:07.307 17:44:18 chaining -- common/autotest_common.sh@952 -- # kill -0 2984159 00:32:07.307 17:44:18 chaining -- common/autotest_common.sh@953 -- # uname 00:32:07.567 17:44:18 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:07.567 17:44:18 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2984159 00:32:07.567 17:44:18 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:07.567 
17:44:18 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:07.567 17:44:18 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2984159' 00:32:07.567 killing process with pid 2984159 00:32:07.567 17:44:18 chaining -- common/autotest_common.sh@967 -- # kill 2984159 00:32:07.567 Received shutdown signal, test time was about 5.000000 seconds 00:32:07.567 00:32:07.567 Latency(us) 00:32:07.567 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:07.567 =================================================================================================================== 00:32:07.567 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:07.567 17:44:18 chaining -- common/autotest_common.sh@972 -- # wait 2984159 00:32:07.567 17:44:18 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:32:07.567 17:44:18 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:32:07.567 17:44:18 chaining -- nvmf/common.sh@117 -- # sync 00:32:07.567 17:44:18 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:32:07.567 17:44:18 chaining -- nvmf/common.sh@120 -- # set +e 00:32:07.567 17:44:18 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:32:07.567 17:44:18 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:32:07.567 rmmod nvme_tcp 00:32:07.567 rmmod nvme_fabrics 00:32:07.567 rmmod nvme_keyring 00:32:07.567 17:44:18 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:32:07.567 17:44:18 chaining -- nvmf/common.sh@124 -- # set -e 00:32:07.567 17:44:18 chaining -- nvmf/common.sh@125 -- # return 0 00:32:07.567 17:44:18 chaining -- nvmf/common.sh@489 -- # '[' -n 2982622 ']' 00:32:07.567 17:44:18 chaining -- nvmf/common.sh@490 -- # killprocess 2982622 00:32:07.567 17:44:18 chaining -- common/autotest_common.sh@948 -- # '[' -z 2982622 ']' 00:32:07.567 17:44:18 chaining -- common/autotest_common.sh@952 -- # kill -0 2982622 00:32:07.567 17:44:18 chaining -- common/autotest_common.sh@953 -- # uname 00:32:07.567 17:44:18 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:07.567 17:44:18 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2982622 00:32:07.827 17:44:18 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:32:07.827 17:44:18 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:32:07.827 17:44:18 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2982622' 00:32:07.827 killing process with pid 2982622 00:32:07.827 17:44:18 chaining -- common/autotest_common.sh@967 -- # kill 2982622 00:32:07.827 17:44:18 chaining -- common/autotest_common.sh@972 -- # wait 2982622 00:32:07.827 17:44:19 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:32:07.827 17:44:19 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:32:07.827 17:44:19 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:32:07.827 17:44:19 chaining -- nvmf/common.sh@274 -- # [[ cvl_0_0_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:32:07.827 17:44:19 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:32:07.827 17:44:19 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:32:07.827 17:44:19 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:32:07.827 17:44:19 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:32:10.369 17:44:21 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush cvl_0_1 00:32:10.369 17:44:21 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM 
EXIT 00:32:10.369 00:32:10.369 real 0m48.931s 00:32:10.369 user 0m58.780s 00:32:10.369 sys 0m11.756s 00:32:10.369 17:44:21 chaining -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:10.369 17:44:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:10.369 ************************************ 00:32:10.369 END TEST chaining 00:32:10.369 ************************************ 00:32:10.369 17:44:21 -- common/autotest_common.sh@1142 -- # return 0 00:32:10.369 17:44:21 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:32:10.369 17:44:21 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:32:10.369 17:44:21 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:32:10.369 17:44:21 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:32:10.369 17:44:21 -- spdk/autotest.sh@380 -- # trap - SIGINT SIGTERM EXIT 00:32:10.369 17:44:21 -- spdk/autotest.sh@382 -- # timing_enter post_cleanup 00:32:10.369 17:44:21 -- common/autotest_common.sh@722 -- # xtrace_disable 00:32:10.369 17:44:21 -- common/autotest_common.sh@10 -- # set +x 00:32:10.369 17:44:21 -- spdk/autotest.sh@383 -- # autotest_cleanup 00:32:10.369 17:44:21 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:32:10.369 17:44:21 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:32:10.369 17:44:21 -- common/autotest_common.sh@10 -- # set +x 00:32:16.948 INFO: APP EXITING 00:32:16.948 INFO: killing all VMs 00:32:16.948 INFO: killing vhost app 00:32:16.948 INFO: EXIT DONE 00:32:20.282 Waiting for block devices as requested 00:32:20.282 0000:80:01.6 (8086 0b00): vfio-pci -> ioatdma 00:32:20.282 0000:80:01.7 (8086 0b00): vfio-pci -> ioatdma 00:32:20.282 0000:80:01.4 (8086 0b00): vfio-pci -> ioatdma 00:32:20.282 0000:80:01.5 (8086 0b00): vfio-pci -> ioatdma 00:32:20.282 0000:80:01.2 (8086 0b00): vfio-pci -> ioatdma 00:32:20.615 0000:80:01.3 (8086 0b00): vfio-pci -> ioatdma 00:32:20.615 0000:80:01.0 (8086 0b00): vfio-pci -> ioatdma 00:32:20.615 0000:80:01.1 (8086 0b00): vfio-pci -> ioatdma 00:32:20.615 0000:65:00.0 (8086 0a54): vfio-pci -> nvme 00:32:20.874 0000:00:01.6 (8086 0b00): vfio-pci -> ioatdma 00:32:20.874 0000:00:01.7 (8086 0b00): vfio-pci -> ioatdma 00:32:20.874 0000:00:01.4 (8086 0b00): vfio-pci -> ioatdma 00:32:21.133 0000:00:01.5 (8086 0b00): vfio-pci -> ioatdma 00:32:21.133 0000:00:01.2 (8086 0b00): vfio-pci -> ioatdma 00:32:21.133 0000:00:01.3 (8086 0b00): vfio-pci -> ioatdma 00:32:21.392 0000:00:01.0 (8086 0b00): vfio-pci -> ioatdma 00:32:21.392 0000:00:01.1 (8086 0b00): vfio-pci -> ioatdma 00:32:26.678 Cleaning 00:32:26.679 Removing: /var/run/dpdk/spdk0/config 00:32:26.679 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:32:26.679 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:32:26.679 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:32:26.679 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:32:26.679 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:32:26.679 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:32:26.679 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:32:26.679 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:32:26.679 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:32:26.679 Removing: /var/run/dpdk/spdk0/hugepage_info 00:32:26.679 Removing: /dev/shm/nvmf_trace.0 00:32:26.679 Removing: /dev/shm/spdk_tgt_trace.pid2692807 00:32:26.679 Removing: /var/run/dpdk/spdk0 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2687776 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2690147 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2692807 00:32:26.679 Removing: 
/var/run/dpdk/spdk_pid2693429 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2694373 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2694664 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2695636 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2695855 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2696055 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2699883 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2701832 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2702185 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2702552 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2702927 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2703281 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2703409 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2703647 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2703990 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2704984 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2708254 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2708518 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2708687 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2708988 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2709042 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2709365 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2709548 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2709740 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2710040 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2710356 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2710617 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2710736 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2711040 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2711358 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2711675 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2711841 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2712049 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2712356 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2712673 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2712926 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2713052 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2713349 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2713671 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2713998 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2714165 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2714367 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2714671 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2714994 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2715320 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2715633 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2715960 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2716289 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2716618 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2716938 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2717023 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2717395 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2717835 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2718160 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2718457 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2722884 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2725198 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2727654 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2728803 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2730132 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2730459 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2730502 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2730654 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2735325 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2735990 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2737179 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2737409 00:32:26.679 Removing: 
/var/run/dpdk/spdk_pid2743386 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2745127 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2746129 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2750900 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2752539 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2753518 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2758019 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2761257 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2762181 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2772528 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2774946 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2775962 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2786210 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2788561 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2789603 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2800888 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2804948 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2806063 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2817586 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2820091 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2821382 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2833481 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2836226 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2837273 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2848858 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2852996 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2854216 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2855333 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2858659 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2865158 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2868461 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2873319 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2877713 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2884211 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2887284 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2894135 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2897167 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2903949 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2906411 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2913244 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2915717 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2920717 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2921040 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2921448 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2921965 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2922395 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2923334 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2924095 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2924576 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2926669 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2928937 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2931540 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2933160 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2935612 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2937610 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2939659 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2941494 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2942169 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2942521 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2945047 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2947494 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2949786 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2951022 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2952552 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2953183 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2953207 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2953295 00:32:26.679 Removing: 
/var/run/dpdk/spdk_pid2953599 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2953773 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2955221 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2957061 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2959129 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2960063 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2961455 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2961769 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2961811 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2961922 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2963096 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2963728 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2964346 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2966485 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2968949 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2971277 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2972551 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2974074 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2974695 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2974725 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2979170 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2979381 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2979528 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2979739 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2979993 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2980459 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2981655 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2982926 00:32:26.679 Removing: /var/run/dpdk/spdk_pid2984159 00:32:26.679 Clean 00:32:26.679 17:44:37 -- common/autotest_common.sh@1451 -- # return 0 00:32:26.679 17:44:37 -- spdk/autotest.sh@384 -- # timing_exit post_cleanup 00:32:26.679 17:44:37 -- common/autotest_common.sh@728 -- # xtrace_disable 00:32:26.679 17:44:37 -- common/autotest_common.sh@10 -- # set +x 00:32:26.679 17:44:37 -- spdk/autotest.sh@386 -- # timing_exit autotest 00:32:26.679 17:44:37 -- common/autotest_common.sh@728 -- # xtrace_disable 00:32:26.679 17:44:37 -- common/autotest_common.sh@10 -- # set +x 00:32:26.679 17:44:37 -- spdk/autotest.sh@387 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:32:26.679 17:44:37 -- spdk/autotest.sh@389 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]] 00:32:26.679 17:44:37 -- spdk/autotest.sh@389 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log 00:32:26.679 17:44:37 -- spdk/autotest.sh@391 -- # hash lcov 00:32:26.679 17:44:37 -- spdk/autotest.sh@391 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:32:26.679 17:44:37 -- spdk/autotest.sh@393 -- # hostname 00:32:26.679 17:44:37 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-CYP-06 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info 00:32:26.940 geninfo: WARNING: invalid characters removed from testname! 
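
The coverage pass that follows is a series of lcov calls differing only in their arguments; condensed, the flow recorded in this log is sketched below. Paths are shortened here for illustration, and every real invocation also carries the --rc lcov_*/genhtml_*/geninfo_* switches and --no-external shown in the log:

  OUT=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output

  # capture the counters gathered during this test run, tagged with the build host name
  lcov -q -c -d spdk -t spdk-CYP-06 -o $OUT/cov_test.info
  # merge with the pre-test baseline, then strip out-of-tree and uninteresting sources
  lcov -q -a $OUT/cov_base.info -a $OUT/cov_test.info -o $OUT/cov_total.info
  lcov -q -r $OUT/cov_total.info '*/dpdk/*' -o $OUT/cov_total.info
  lcov -q -r $OUT/cov_total.info '/usr/*' -o $OUT/cov_total.info   # later -r passes also drop examples/vmd, app/spdk_lspci, app/spdk_top
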
00:32:48.906 17:44:59 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:32:51.473 17:45:02 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:32:54.027 17:45:04 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:32:55.933 17:45:06 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:32:57.871 17:45:08 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:32:59.777 17:45:10 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:33:01.686 17:45:12 -- spdk/autotest.sh@400 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:33:01.946 17:45:12 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:33:01.946 17:45:12 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:33:01.946 17:45:12 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:33:01.946 17:45:12 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:33:01.947 17:45:12 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:01.947 17:45:12 -- paths/export.sh@3 -- $ 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:01.947 17:45:12 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:01.947 17:45:12 -- paths/export.sh@5 -- $ export PATH 00:33:01.947 17:45:12 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:01.947 17:45:12 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:01.947 17:45:12 -- common/autobuild_common.sh@444 -- $ date +%s 00:33:01.947 17:45:13 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1721058313.XXXXXX 00:33:01.947 17:45:13 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1721058313.7llmoV 00:33:01.947 17:45:13 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]] 00:33:01.947 17:45:13 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']' 00:33:01.947 17:45:13 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:33:01.947 17:45:13 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:33:01.947 17:45:13 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:33:01.947 17:45:13 -- common/autobuild_common.sh@460 -- $ get_config_params 00:33:01.947 17:45:13 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:33:01.947 17:45:13 -- common/autotest_common.sh@10 -- $ set +x 00:33:01.947 17:45:13 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:33:01.947 17:45:13 -- common/autobuild_common.sh@462 -- $ start_monitor_resources 00:33:01.947 17:45:13 -- pm/common@17 -- $ local monitor 00:33:01.947 17:45:13 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:33:01.947 17:45:13 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:33:01.947 17:45:13 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:33:01.947 17:45:13 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:33:01.947 17:45:13 -- pm/common@21 -- $ date +%s 00:33:01.947 17:45:13 -- 
pm/common@21 -- $ date +%s 00:33:01.947 17:45:13 -- pm/common@25 -- $ sleep 1 00:33:01.947 17:45:13 -- pm/common@21 -- $ date +%s 00:33:01.947 17:45:13 -- pm/common@21 -- $ date +%s 00:33:01.947 17:45:13 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721058313 00:33:01.947 17:45:13 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721058313 00:33:01.947 17:45:13 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721058313 00:33:01.947 17:45:13 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721058313 00:33:01.947 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721058313_collect-vmstat.pm.log 00:33:01.947 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721058313_collect-cpu-load.pm.log 00:33:01.947 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721058313_collect-cpu-temp.pm.log 00:33:01.947 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721058313_collect-bmc-pm.bmc.pm.log 00:33:02.917 17:45:14 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT 00:33:02.917 17:45:14 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j128 00:33:02.917 17:45:14 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:02.917 17:45:14 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:33:02.917 17:45:14 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:33:02.917 17:45:14 -- spdk/autopackage.sh@19 -- $ timing_finish 00:33:02.917 17:45:14 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:33:02.917 17:45:14 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:33:02.917 17:45:14 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:33:02.917 17:45:14 -- spdk/autopackage.sh@20 -- $ exit 0 00:33:02.917 17:45:14 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources 00:33:02.917 17:45:14 -- pm/common@29 -- $ signal_monitor_resources TERM 00:33:02.917 17:45:14 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:33:02.917 17:45:14 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:33:02.917 17:45:14 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:33:02.917 17:45:14 -- pm/common@44 -- $ pid=2997604 00:33:02.917 17:45:14 -- pm/common@50 -- $ kill -TERM 2997604 00:33:02.917 17:45:14 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:33:02.917 17:45:14 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:33:02.917 17:45:14 -- pm/common@44 -- $ pid=2997605 00:33:02.917 17:45:14 -- pm/common@50 
-- $ kill -TERM 2997605 00:33:02.917 17:45:14 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:33:02.917 17:45:14 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:33:02.917 17:45:14 -- pm/common@44 -- $ pid=2997607 00:33:02.917 17:45:14 -- pm/common@50 -- $ kill -TERM 2997607 00:33:02.917 17:45:14 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:33:02.917 17:45:14 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:33:02.917 17:45:14 -- pm/common@44 -- $ pid=2997630 00:33:02.917 17:45:14 -- pm/common@50 -- $ sudo -E kill -TERM 2997630 00:33:02.917 + [[ -n 2562801 ]] 00:33:02.917 + sudo kill 2562801 00:33:02.928 [Pipeline] } 00:33:02.952 [Pipeline] // stage 00:33:02.958 [Pipeline] } 00:33:02.976 [Pipeline] // timeout 00:33:02.981 [Pipeline] } 00:33:02.994 [Pipeline] // catchError 00:33:03.000 [Pipeline] } 00:33:03.018 [Pipeline] // wrap 00:33:03.026 [Pipeline] } 00:33:03.041 [Pipeline] // catchError 00:33:03.050 [Pipeline] stage 00:33:03.052 [Pipeline] { (Epilogue) 00:33:03.066 [Pipeline] catchError 00:33:03.068 [Pipeline] { 00:33:03.082 [Pipeline] echo 00:33:03.084 Cleanup processes 00:33:03.090 [Pipeline] sh 00:33:03.376 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:03.376 2997711 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache 00:33:03.376 2998103 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:03.394 [Pipeline] sh 00:33:03.685 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:03.685 ++ grep -v 'sudo pgrep' 00:33:03.685 ++ awk '{print $1}' 00:33:03.685 + sudo kill -9 2997711 00:33:03.697 [Pipeline] sh 00:33:03.983 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:33:16.447 [Pipeline] sh 00:33:16.732 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:33:16.732 Artifacts sizes are good 00:33:16.746 [Pipeline] archiveArtifacts 00:33:16.753 Archiving artifacts 00:33:16.921 [Pipeline] sh 00:33:17.206 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest 00:33:17.222 [Pipeline] cleanWs 00:33:17.232 [WS-CLEANUP] Deleting project workspace... 00:33:17.232 [WS-CLEANUP] Deferred wipeout is used... 00:33:17.240 [WS-CLEANUP] done 00:33:17.242 [Pipeline] } 00:33:17.263 [Pipeline] // catchError 00:33:17.276 [Pipeline] sh 00:33:17.562 + logger -p user.info -t JENKINS-CI 00:33:17.571 [Pipeline] } 00:33:17.592 [Pipeline] // stage 00:33:17.625 [Pipeline] } 00:33:17.646 [Pipeline] // node 00:33:17.652 [Pipeline] End of Pipeline 00:33:17.684 Finished: SUCCESS
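
Closing note on the chaining results above: both bdevperf runs satisfy the checks that bdev/chaining.sh applies to the accel counters, which is consistent with every executed sequence carrying one encrypt or decrypt stage plus one crc32c stage. The arithmetic, using only the counters recorded in this log:

  # run 1 (4096-byte IO, queue depth 256): sequence_executed=88508, crc32c=88508
  (( 44254 + 44254 == 88508 ))   # encrypt + decrypt == sequence_executed == crc32c
  # run 2 (65536-byte IO, queue depth 32): sequence_executed=22338, crc32c=22338
  (( 11169 + 11169 == 22338 ))   # encrypt + decrypt == sequence_executed == crc32c
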